Microsoft Apologizes For Chatbot Tay’s Holocaust Denying, Racist And Anti-Feminism Tweets

Microsoft Corp. on Friday issued an apology after its artificial-intelligence chatbot Tay posted tweets denying the Holocaust and saying feminists should “burn in hell,” among many other racist posts. The company said that a “coordinated attack by a subset of people exploited a vulnerability” in the chatbot, which was launched Wednesday. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay. Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values,” Peter Lee, Microsoft’s vice president of research, said on the company’s official blog. Microsoft introduced Tay as a chatbot designed to engage and entertain…
