Friendly Microsoft chatbot turns into racist monster

Ted S. Warren/AP/file 2014: The Microsoft Corp. logo outside the Microsoft Visitor Center in Redmond, Wash.

By Brandon Bailey, AP Technology Writer | March 25, 2016

SAN FRANCISCO (AP) — OMG! Did you hear about the artificial intelligence program that Microsoft designed to chat like a teenage girl? It was totally yanked offline in less than a day, after it began spouting racist, sexist and otherwise offensive remarks.

Microsoft said it was all the fault of some really mean people, who launched a "coordinated effort" to make the chatbot known as Tay "respond in inappropriate ways." To which one artificial intelligence expert responded: Duh!

Well, he didn't really say that. But computer scientist Kris Hammond did say, "I can't believe they didn't see this coming."

Microsoft said its researchers…


