Microsoft Apologizes for Tay, the neo-Nazi Twitter Chatbot

TayTweets’ Twitter photo.

Microsoft has apologized for its chatbot “Tay” after it turned from a friendly artificial intelligence algorithm into an offensive Nazi-sympathizer in less than 24 hours. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” Microsoft Research Corporate Vice President Peter Lee wrote on the company’s blog after the chatbot was taken offline. Hours after it was launched as an experiment in conversational understanding, Tay, a deep learning algorithm, began tweeting offensive and racist comments, including several expressing admiration for Adolf Hitler. Among the tweets were “Hitler did nothing wrong” and “Hitler was right I hate the jews.” Asked if the Holocaust happened, the chatbot replied: “It was made…


Link to Full Article: Microsoft Apologizes for Tay, the neo-Nazi Twitter Chatbot