Microsoft apologizes for its racist chatbot’s ‘wildly inappropriate and reprehensible words’ (MSFT)

Matt Weinberger | Published 4:03 p.m., Friday, March 25, 2016

Microsoft apologized for the racist and "reprehensible" tweets made by its chatbot and promised to keep the bot offline until the company is better prepared to counter malicious efforts to corrupt the bot's artificial intelligence.

In a blog post on Friday, Microsoft Research head Peter Lee expressed regret for the conduct of its AI chatbot, named Tay, explaining that the bot fell victim to a "coordinated attack by a subset of people."

"We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," Lee writes.

Earlier this week, Microsoft launched Tay, a bot ostensibly designed to talk to users on Twitter like a real millennial teenager and learn from the…
