Microsoft Says It’s Deeply Sorry For Its Offensive Chat Bot

Microsoft has issued a mea culpa for an artificial intelligence research project that went awry earlier this week. The company's head of research, Peter Lee, said in a blog post on Friday that Microsoft was "deeply sorry for the unintended offensive and hurtful tweets" made by an experimental chat bot.

Microsoft's research arm and its Bing search engine unit unveiled a chat bot named Tay on Wednesday that was designed to talk with strangers on social media networks like Twitter. The idea was that the more people Tay chatted with, the more it would learn about language from the data it collected.

Within 24 hours, however, Internet pranksters, many from the notorious message-board websites 4chan and 8chan, initiated several offensive conversations. Based on what…
