Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk.

Microsoft set out to learn about “conversational understanding” by creating a bot designed to have automated discussions with Twitter users, mimicking the language they use. What could go wrong? If you guessed, “It will probably become really racist,” you’ve clearly spent time on the Internet. Less than 24 hours after the bot, @TayandYou, went online Wednesday, Microsoft halted posting from the account and deleted several of its most obscene statements. More from the New York Times:

The bot, developed by Microsoft’s Technology and Research and Bing teams, got major assistance in being offensive from users who egged it on. It disputed the existence of the Holocaust, referred to women and minorities with unpublishable words…


Link to Full Article: Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk.