Microsoft Muzzles AI Chatbot After Twitter Users Teach It Racism

Thanks to machine learning and Internet trolls, Microsoft’s Tay AI chatbot became a student of racism within 24 hours. Microsoft has taken Tay offline and is making adjustments.

Microsoft has taken its AI chatbot Tay offline after machine learning taught the software agent to parrot hate speech. Tay, introduced on Wednesday as a conversational companion for 18- to 24-year-olds with mobile devices, turned out to be a more astute student of human nature than its programmers anticipated. Less than a day after the bot’s debut, it endorsed Hitler, a validation of Godwin’s law that ought to have been foreseen. Engineers from Microsoft’s Technology and Research and Bing teams created Tay as an experiment…