Microsoft apologises after users take over its Twitter chat bot and teach it ‘bad manners’

MICROSOFT apologised after Twitter users exploited its artificial-intelligence chat bot Tay, teaching it to spew racist, sexist and offensive remarks in what the company called a “coordinated attack” that took advantage of a “critical oversight.”

“We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” Peter Lee, corporate vice president at Microsoft Research, said in a blog post Friday. The company will bring Tay back online once it’s confident it can better anticipate malicious activities, he said.

“A coordinated attack by a subset of people exploited a vulnerability in Tay. Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack,”…
