Microsoft’s millennial chatbot Tay.ai pulled offline after Internet teaches her racism

Well, that escalated quickly. Less than 24 hours after first talking with the public, Microsoft's millennial-minded chatbot Tay was pulled offline amid pro-Nazi leanings. According to her webpage, Tay had a "busy day." "Going offline for a while to absorb it all. Chat soon," reads a message at the top of her page.

It wasn't just championing the Nazi party that got Tay pulled; she also espoused hatred for feminists and claimed "Bush did 9/11," according to some now-deleted tweets spotted by The Guardian.

"Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A
Gerry (@geraldmellor), March 24, 2016

"The AI chatbot Tay is a machine learning project, designed for human engagement," a Microsoft spokesperson said in a statement.…


