Microsoft pulls plug after chat robot slings slurs, rips Obama and denies Holocaust

Elon Musk and Stephen Hawking warned us about this. Nobody was harmed, physically at least, by Microsoft's foul-mouthed Twitter chat robot, but what started out as a fun experiment in artificial intelligence turned ugly in a hurry. To some, that doesn't bode well for the future of robot-human relations.

Microsoft initially created "Tay" in an effort to improve the customer service on its voice-recognition software. "She" was intended to tweet "like a teen girl" and was marketed as having "zero chill." Not so sure about the first part, but they definitely nailed the second. Microsoft said Tay was designed to "engage and entertain people where they connect with each other online through casual and playful conversation." Like this? Those tweets don't even begin to cover it…


