Microsoft artificial intelligence chatbot ‘Tay’ suspended after one day when it started spouting …

Microsoft’s artificial intelligence chatbot, Tay, has been suspended after one day when users ‘taught’ the bot to spout rude and offensive statements.

AP, News Corp Australia Network

ARTIFICIAL-intelligence software designed by Microsoft to tweet like a teenage girl has been suspended after it began spouting offensive remarks.

Microsoft said it was making adjustments to the Twitter chatbot after users found a way to manipulate it into tweeting racist and sexist remarks, including references to Hitler.

The chatbot, named Tay, debuted on Wednesday with a couple of perky tweets and the promise:

hellooooooo w🌍rld!!!
— TayTweets (@TayandYou) March 23, 2016

so many new beginnings #lunareclipse
— TayTweets (@TayandYou) March 23, 2016

Tay was designed to learn how to communicate through conversations with real humans, …


Link to Full Article: Microsoft artificial intelligence chatbot ‘Tay’ suspended after one day when it started spouting …

