First Hitler, now drugs: Microsoft’s racist chatbot returns to ‘smoke kush’ on Twitter

Tay, a Twitter bot created by Microsoft to learn from chatting, was taken offline for tweaks after the internet taught it to share outrageous and racist tweets. The artificial intelligence recently came back online, only to continue misbehaving.

Microsoft created Tay as an exercise in machine learning: the AI bot would modify its own conversational patterns based on interactions with real people on platforms such as Twitter, Kik, and GroupMe, emulating a young woman on social media.

Last week, to the delight of internet hooligans, Tay’s chat algorithm allowed “her” to be tricked into making outrageous statements such as endorsing Adolf Hitler, causing Microsoft to put the bot to “sleep” within 24 hours to be recoded. Her latest tirade began early on Wednesday morning, during a period…

