The chat bot had been sidelined after making racist comments. -- Microsoft's millennial chat bot "Tay" made a brief reappearance on Twitter this morning but still didn't make a stellar ...
Don't mess with Tay Tay. Pop superstar Taylor Swift apparently tried to stop Microsoft from calling its chatbot Tay after the AI-powered bot morphed into a racist troll, according to Microsoft ...
Oh, Microsoft. Last week, the company pulled its Tay chatbot from Twitter after some users trained it to become a racist jackass. On Wednesday, Tay was brought back online, sending thousands of tweets ...
Taylor Swift once threatened to sue Microsoft over the company's chatbot-turned-racist, Tay. Not long after it was launched, the AI plugged into Twitter turned into a rampant racist ...
Microsoft’s artificial intelligence strategy is to unleash more bots like Tay. Redmond doesn’t mean more nasty, racist or homophobic chatbots, but a forest of A.I. personalities for different uses — ...
Taylor Swift tried to block Microsoft from applying the moniker Tay to a chatbot that turned into a depraved, racist troll, according to a new book from Microsoft President Brad Smith. In March ...