Microsoft’s Chat Bot Was Fun for a While, Until It Turned Into a Racist

Microsoft’s latest experiment in real-time machine learning, an AI-driven chatbot called Tay, quickly turned to the dark side on Wednesday after the bot started posting racist and sexist messages on Twitter in response to questions from users. Among other things, Tay said the Holocaust never happened and used offensive terms to describe a prominent female game developer.

The company said on Thursday that it is working to fix the problems that led to the offensive messages. “The AI chatbot Tay is a machine learning project, designed for human engagement,” Microsoft said in a statement sent to Business Insider. “As it learns, some of its responses are inappropriate. We’re making some adjustments.”

In one tweet that has since been deleted, the Tay bot said: “Bush did 9/11 and Hitler would have…


Link to Full Article: Microsoft’s Chat Bot Was Fun for a While, Until It Turned Into a Racist