Microsoft shame at sweary chat-bot

Microsoft pledged not to reintroduce Tay until it was confident that it could not be led astray again.

The robot takeover may not be quite as near as we thought. Microsoft was forced to apologise this weekend after Tay, its artificial intelligence-based “chat-bot”, unleashed a torrent of racist and sexist comments. The company took Tay offline less than 24 hours after it launched on Twitter.

In a blog post, Peter Lee, corporate vice-president at Microsoft, said the company was “deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for”.

The chat-bot was designed to speak like a teenage girl and was aimed at the “millennial” generation. Using artificial intelligence, Tay was meant to “learn” based…