How Social Media Turned Microsoft’s Teen Chatbot, ‘Tay,’ Into a Nazi-Loving Racist in Less Than a …

Tay was supposed to show the growth of adaptable artificial intelligence, but instead she showed how quickly things can turn racist. (Photo: Twitter)

Well, that went bad fast. On Wednesday, Microsoft unveiled Tay, an artificial-intelligence “chatbot” that would learn, through social media platforms like Twitter, Kik and GroupMe, how to hold “normal” conversations. “The more you talk the smarter Tay gets,” Tay’s Twitter profile reads. She was supposed to sound like a typical teenage girl.

In less than a day, Tay went from a sweet, innocent chatbot to a Nazi-loving, feminist-hating racist. According to Quartz, Tay went from saying, “Humans are super cool” to tweeting, “Bush did 9/11, and Hitler would have done a better job than the monkey we have got now. donald trump [sic] is the…
