It’s Your Fault Microsoft’s Teen AI Turned Into Such a Jerk

It was the unspooling of an unfortunate series of events involving artificial intelligence, human nature, and a very public experiment. Amid this dangerous combination of forces, determining exactly what went wrong is near-impossible. But the bottom line is simple: Microsoft has an awful lot of egg on its face after unleashing an online chat bot that Twitter users coaxed into regurgitating some seriously offensive language, including pointedly racist and sexist remarks.

On Wednesday morning, the company unveiled Tay, a chat bot meant to mimic the verbal tics of a 19-year-old American girl, provided to the world at large via the messaging platforms Twitter, Kik, and GroupMe. According to Microsoft, the aim was to “conduct research on conversational understanding.” Company researchers programmed the bot to respond to messages in an “entertaining” way, impersonating the…