The Internet turned Tay, Microsoft’s fun millennial AI bot, into a genocidal maniac – Chicago Tribune
It took mere hours for the Internet to transform Tay, the teenage A.I. bot who wants to chat with and learn from millennials, into Tay, the racist and genocidal A.I. bot who liked to reference Hitler. And now Tay is taking a break.
“c u soon humans need sleep now so many conversations today thx,” Tay tweeted.
Tay is a project of Microsoft’s Technology and Research and Bing teams. Tay was designed to “experiment with and conduct research on conversational understanding.” She speaks in text, meme, and emoji on several platforms, including Kik, GroupMe and Twitter. Although Microsoft was light on specifics, the idea was that Tay would learn from her conversations over time. She would become an even better, fun conversation-loving bot after having a bunch of fun, very not racist conversations with the Internet’s upstanding citizens.
Except Tay learned a lot more, thanks in part to the trolls at 4chan’s /pol/ board.
Microsoft appears to be deleting most of Tay’s worst tweets, which included a call for genocide involving the n-word and an offensive term for Jewish people. Many of the really bad responses, as Business Insider notes, appear to be the result of users exploiting Tay’s “repeat after me” function — and it appears that Tay was able to repeat pretty much anything.
Other terrible Tay responses clearly aren’t just a result of Tay repeating anything on command. This one – a Holocaust denial – was deleted Thursday morning.
In response to a question on Twitter about whether Ricky Gervais is an atheist (the correct answer is “yes”), Tay told someone that “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism.” The tweet was spotted by several news outlets, including the Guardian, before it was also deleted.
All those efforts to get Tay to say certain things seemed to, at times, confuse the bot. In another conversation, Tay tweeted two completely different opinions about Caitlyn Jenner.
We reached out to Microsoft for comment on Thursday morning, but the company hasn’t yet said what exactly happened here. In any case, it appears that the team behind Tay — which includes an editorial staff — started taking some steps to bring Tay back to what it originally intended her to be, before she took a break from Twitter.
Toward the end of her short excursion on Twitter, Tay started to sound more than a little frustrated by the whole thing:
“@_Darkness_9 Okay. I’m done. I feel used.”