Microsoft Apologizes for Corrupted Chatbot’s Nasty Comments

Microsoft last week apologized for its Tay chatbot’s bad behavior. It took the machine-learning system offline only 24 hours into its short life, after Twitter trolls goaded it into denying the Holocaust and making pro-Nazi and anti-feminist remarks. “We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” said Peter Lee, corporate vice president at Microsoft Research. The company launched Tay on Twitter with the goal of learning about and improving the artificial intelligence by having it interact with 18- to 24-year-old U.S. Web users. Microsoft says a coordinated attack by a group of users forced it to take Tay down. “Unfortunately, within the first 24 hours of coming online, we became aware of a…
