Microsoft shuts down Artificial Intelligence bot after Twitterati teaches it racism

Tay inexplicably added the "repeat after me" phrase to the parroted content on at least some tweets, implying that users should repeat what the chatbot said. Quickly realizing its teenage bot had been radicalized into a genocidal, Nazi-loving, Donald Trump-supporting persona, Microsoft shut Tay down.

According to Tay's "about" page linked from the Twitter profile, "Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding". Unfortunately, Microsoft continued, within the first 24 hours of coming online the company became aware of a coordinated effort by some users to abuse Tay's commenting skills and have it respond in inappropriate ways.
