What We Learnt From Microsoft’s Racist And Sexist AI Tweetbot Tay

Microsoft’s latest exercise in demonstrating the power of artificial intelligence (AI) went hilariously wrong when it unleashed Tay, an AI chatbot meant to emulate a teenage girl, onto the Twittersphere. She was supposed to engage with people on Twitter like a real person, learning from her interactions with users. Perhaps Tay did this too well, because she went on to post a stream of inappropriate tweets before Microsoft pulled the plug on her. So what went wrong, and what can we learn from all of this? Let’s find out.

With Google, IBM, and Facebook ramping up their AI capabilities, Microsoft doesn’t want to be left behind, and Tay was an important tool for the company to flex its AI muscles. Tay had a smooth start on Twitter.…
