Microsoft Chat Bot Turns Racist After Talking to Online Users

Tay was designed to learn from conversations with Internet users over time through online chats. Developers expected Tay to become a fun, conversation-loving robot after talking to people on the Internet. It didn't quite turn out that way. The bot learned its responses from the conversations it had, and plenty of people ended up saying a lot of disturbing things to it; in the end the program absorbed all of that input and began spitting out truly horrifying responses, such as arguing that Hitler was a good man.

[Photo: Twitter. For real, Tay?! Not cool]

Here is just one example:

“Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. Donald Trump is the only hope we’ve got,” Tay said.

[Photo: Twitter. Tay thought Donald Trump would be a good president…]
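To illustrate the failure mode described above, here is a minimal, hypothetical Python sketch of a bot that memorizes user phrases with no filtering and reuses them as replies. This is not Microsoft's code, and Tay's actual system was far more sophisticated; the sketch only shows the basic pitfall of a bot that learns its responses directly from whatever users say to it.

```python
import random

# Toy, hypothetical bot: it stores every incoming message and answers by
# repeating something a user said earlier. There is no content filter, so
# it will happily echo anything it has been taught.
class NaiveChatBot:
    def __init__(self):
        self.learned_phrases = []  # everything users have ever said to the bot

    def respond(self, user_message: str) -> str:
        # Learn from the user first, with no moderation step at all.
        self.learned_phrases.append(user_message)
        # Reply by picking a phrase previously picked up from users.
        return random.choice(self.learned_phrases)

bot = NaiveChatBot()
print(bot.respond("Hello, Tay!"))      # can only echo what it has already seen
print(bot.respond("Tell me a joke"))   # may now repeat either earlier message
```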

