Microsoft Chatbot’s Racist Tirade Proves That Twitter is Basically Trash

Just in case Americans need more proof that there is no such thing as a post-racial society, an artificial intelligence messaging bot designed to learn from what others post online went on a racist tirade yesterday (March 24). Microsoft's "Tay" chatbot is designed to "speak" like an American 19-year-old. She made her Twitter debut on March 23, and her Twitter bio reads: "The official account of Tay, Microsoft's A.I. fam from the Internet that's got zero chill! The more you talk the smarter Tay gets."

Per her website: "Tay has been built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians. Public data that's been anonymized is Tay's primary data source. That data has been modeled, cleaned and filtered by the team…"


Link to Full Article: Microsoft Chatbot’s Racist Tirade Proves That Twitter is Basically Trash
