Microsoft axes chatbot that learned a little too much online

By Brandon Bailey | AP, SAN FRANCISCO — OMG! Did you hear about the artificial intelligence program that Microsoft designed to chat like a teenage girl? It was totally yanked offline in less than a day, after it began spouting racist, sexist and otherwise offensive remarks.

Microsoft said it was all the fault of some really mean people, who launched a “coordinated effort” to make the chatbot known as Tay “respond in inappropriate ways.”

To which one artificial intelligence expert responded: Duh! Well, he didn’t really say that. But computer scientist Kris Hammond did say, “I can’t believe they didn’t see this coming.”

Microsoft said its researchers created Tay as an experiment to learn more about computers and human conversation. On its website, the company said the program was targeted to an…

Link to Full Article: Microsoft axes chatbot that learned a little too much online
