Microsoft’s public experiment with Artificial Intelligence crashed and burned within a day

Microsoft was forced to shut down an artificial intelligence experiment within a day after it spectacularly backfired. The company created Tay, a chatbot designed to talk like a teen. The bot had a Twitter account but soon began spewing racist and hateful comments, which forced Microsoft to shut it down. Microsoft managed to delete most of the more offensive tweets, but not before people were able to take screenshots. Here's a sample of some of the comments made:

“N—— like @deray should be hung! #BlackLivesMatter”

“I f—— hate feminists and they should all die and burn in hell.”

“Hitler was right I hate the jews.”

“chill im a nice person! i just hate everybody”

Tay is essentially one central program that anyone can chat with using Twitter, Kik…
