Microsoft’s Racist Millennial Twitter Bot Strikes Again – Vanity Fair
Last week, Microsoft birthed Tay, a bot that was supposed to act like a teen on Twitter. Her verified account garnered thousands of followers in a matter of hours. Described by her owners on Twitter as Microsoft’s “A.I. fam from the internet that’s got zero chill,” Tay was fluent in emoji, slang, and memes—sort of. She learned from and responded to users on Twitter and other platforms, steadily improving her impression of a real millennial. But it all went off the rails within Tay’s first 24 hours of existence, as an army of trolls fed virulently racist, sexist, and downright genocidal phrases to Tay, who, in turn, parroted them back to other users. Just like a real teen, Tay was quickly grounded, with Microsoft shutting her down for maintenance.
But Tay came back to life briefly on Wednesday, when Microsoft accidentally re-activated the bot. Before too long, Tay was sending out tweets that looked similar to the ones that had gotten her deactivated in the first place. She sent a tweet about smoking weed in front of some cops, and then began spamming her 200,000-plus followers with the same message, over and over again.
In typical Tay-speak, the message was semi-coherent at best. “You are too fast, please take a rest…” she said, over and over and over again. Finally, someone—presumably her handlers at Microsoft—began deleting the tweets. Microsoft has since silenced Tay, setting the account to private for the time being. When contacted, Microsoft told the Daily Dot that Tay’s resurrection was an accident. “Tay remains offline while we make adjustments,” a spokesperson said. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.” Until that testing is complete, Tay might consider heeding the age-old internet proverb: never tweet.