Microsoft’s Tay back swearing on Twitter

Tay, Microsoft’s artificial intelligence chatbot, which was grounded last week after developing racist tendencies, re-emerged on Wednesday morning, only to go off the rails a second time, sending long streams of random messages to followers on Twitter.

Most of the new messages from the millennial-mimicking character simply read “you are too fast, please take a rest”. But other tweets included swear words and apparently apologetic phrases such as “I blame it on the alcohol”.

Microsoft quickly took the Tay Twitter feed down again — this time blaming its messages on human error, rather than any new problem in the chatbot’s programming. Tay was switched back on too early during its testing, the company said. This latest failure comes just hours before Microsoft…


Link to Full Article: Microsoft’s Tay back swearing on Twitter