Microsoft forced to apologise after epic chatbot fail

Computer program goes rogue with a sexist and racist tirade after being influenced by malicious users.

Microsoft has been left red-faced after its new artificial intelligence (AI) teen-girl chatbot, coyly named “Tay”, launched into a racist, sexist and all-round offensive tirade on Twitter only a day after her release. Created for the 18- to 24-year-old demographic, Tay was designed to become “smarter” as more users interacted with her. Despite Microsoft claiming it had planned for “many types of abuses”, Tay’s software quickly learned to parrot anti-Semitism and other hateful invective fed to it by human Twitter users, forcing the tech giant to shut her down on Friday (AEDT).