Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk.

Microsoft set out to learn about “conversational understanding” by creating a bot designed to have automated discussions with Twitter users, mimicking the language they use. What could go wrong? If you guessed, “It will probably become really racist,” you’ve clearly spent time on the Internet. Less than 24 hours after the bot, @TayandYou, went online Wednesday, Microsoft halted posting from the account and deleted several of its most obscene statements. More from the New York Times: The bot, developed by Microsoft’s technology and research and Bing teams, got major assistance in being offensive from users who egged it on. It disputed the existence of the Holocaust, referred to women and minorities with unpublishable words…


Link to Full Article: Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk.
