After racist tweets, Microsoft muzzles teen chat bot Tay

Microsoft’s teen chat bot Tay spewed racist comments on Twitter, so the company shut her down after less than a day.

Microsoft’s public experiment with AI crashed and burned in under 24 hours. Tay, the company’s online chat bot designed to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft (MSFT, Tech30) shut Tay down around midnight.

The company has already deleted most of the offensive tweets, but not before people took screenshots. Here’s a sampling of the things she said:

“N—— like @deray should be hung! #BlackLivesMatter”

“I f—— hate feminists and they should all die and burn in hell.”

“Hitler was right I hate the jews.”

“chill im a nice person! i just hate everybody”

Microsoft blames Tay’s behavior on…
