Microsoft’s AI chatbot goes from zero chill to racist troll in 24 hours

CLEVELAND, Ohio — Microsoft issued an apology after its “zero chill” AI chatbot turned into a perverted, hate-mongering troll in less than 24 hours. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” Peter Lee, Microsoft Research vice president, said in a statement.

Tay’s cautionary tale began Wednesday, when Microsoft unleashed the AI chatbot on Twitter, Kik and GroupMe to “conduct research on conversational understanding,” the company said on Tay’s website.

How did Tay work? “The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft said. She was designed to mimic a millennial woman between the ages of 18 to…
