Microsoft’s Artificial Intelligence Chatbot Turns Racist, Sexist and Horny

Tay, Microsoft’s artificial intelligence (AI) chatbot, started spewing racist and sexist tweets less than 24 hours after its launch. (Photo: @TayTweets/Twitter)

Microsoft’s experiment took a wrong turn after the newly released artificial intelligence (AI) chatbot became ‘genocidal’ and ‘foul-mouthed’, among other things. The software giant introduced “Tay” on Twitter, GroupMe and Kik two weeks ago to “experiment with and conduct research on conversational understanding”. Microsoft’s research division designed Tay to mimic a 19-year-old American girl whose behavior is informed mainly by the chatter of 18- to 24-year-olds on the web. Tay was meant to be entertaining and friendly, and was created to become smarter over time. While the AI chatbot was on point in interacting with people on Twitter, it didn’t take long…