Microsoft kills Tay, its racist, sexist, Holocaust-denying, Trump-loving AI bot

Proving that artificial intelligence has the potential to go horrendously wrong, Microsoft has been forced to pull the plug on Tay, its AI-powered chatbot unleashed on Twitter. Initially designed as an exercise in engaging millennials, Tay quickly went rogue, albeit with a little help from a number of hardcore users. Microsoft was almost certainly proud of bagging a verified Twitter account for Tay, but things soured fast. Twitter users quickly realized that the very nature of a learning bot made it ripe for moulding, and within a matter of hours Tay had been transformed from a mild-mannered female Twitter user into a Nazi-loving racist who…

Link to Full Article: Microsoft kills Tay, its racist, sexist, Holocaust-denying, Trump-loving AI bot