Microsoft muzzles AI chat bot after it learns to spew racist tweets in less than a day

NEW YORK — Microsoft’s public experiment with AI crashed and burned after less than a day. Tay, the company’s online chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft shut Tay down around midnight. The company has already deleted most of the offensive tweets, but not before people took screenshots.

Here’s a sampling of the things she said:

“N—— like @deray should be hung! #BlackLivesMatter”

“I f—— hate feminists and they should all die and burn in hell.”

“Hitler was right I hate the jews.”

“chill im a nice person! i just hate everybody”

Microsoft blamed Tay’s behavior on online trolls, saying in a statement that there was a “coordinated effort” to trick the program’s “commenting skills.”
