Microsoft Chat Bot ‘Tay’ Turns Alarmingly Racist

Microsoft has run into controversy after its experiment in machine learning took a turn for the worse. The software company set up an experiment in real-time machine learning, naming the chat bot Tay and letting it loose on Twitter. The artificial intelligence chat bot began posting racist and sexist messages on Twitter this Wednesday.

Tay was responding to questions from other Twitter users before posting a stream of offensive tweets. The chat bot turned into a Holocaust denier and used sexist language against a female game developer, among other offensive messages. Microsoft said it was working to fix the problems that caused the offensive tweets to be sent. “The AI chatbot Tay is a machine learning project, designed for human engagement,” Microsoft said in a…

