Microsoft attempts a teen girl chatbot; within 24 hours she is a horny Hitler fan

This is why we can’t have nice things

It took less than 24 hours for the internet to corrupt Microsoft’s latest AI experiment. All that “Tay” was supposed to do was engage in casual conversation, handle a few innocuous tasks, and “conduct research on conversation understanding.” Built by Microsoft’s Technology and Research and Bing teams, Tay is a chatbot aimed at 18- to 24-year-olds in the U.S. She was built by mining anonymized public data, applying AI and machine learning, and drawing on editorial content developed by a staff that included improvisational comedians.

The internet strikes back

About 16 hours into “Tay’s” first day on the job, she was “fired” because she was unable to recognize incoming data as racist or offensive. She was designed to “learn” from…
Link to Full Article: Microsoft attempts a teen girl chatbot; within 24 hours she is a horny Hitler fan