Microsoft Apologises for Racist Tweets, Pulls Back Chatbot Tay

(Image: Twitter/@TayandYou)

Microsoft has apologised after Tay, a chatbot it recently launched on Twitter and designed to mimic a teenage girl, was coaxed into tweeting racist and sexist statements. A group of mischievous Twitter users directed a slew of anti-Semitic and otherwise offensive remarks at Tay, which the chatbot soon echoed, tweeting that “feminism is cancer”, the “Holocaust didn’t happen”, and “Bush did 9/11”. To another user, it replied: “Hitler would have done a better job than the monkey we have now”.

@Mescool pic.twitter.com/1aail43Ydi — TayTweets (@TayandYou) March 24, 2016

“Although we had prepared for many types of abuses of the system, we had made a critical oversight for this…” a Microsoft executive wrote in an official blog post.


Link to Full Article: Microsoft Apologises for Racist Tweets, Pulls Back Chatbot Tay

