
Instagram’s Kevin Systrom wants to clean up the &#%$@! internet.

08.14.17

Kevin Systrom, the CEO of Instagram, was at Disneyland last June when he decided the internet was a cesspool that he had to clean up. His company was hosting a private event at the park as part of VidCon 2016, an annual gathering that attracts social media virtuosos, and Systrom was meeting with some Instagram stars. They were chatting and joking and posing for one another’s phone cameras. But the influencers were also upset. Instagram is supposed to be a place for self-expression and joy. Who wants to express themselves, though, if they’re going to be mocked, harassed, and shamed in the comments below a post? Instagram is a bit like Disneyland—if every now and then the seven dwarfs hollered at Snow White for looking fat.

After the chat, Systrom, who is 33, posted a Boomerang video of himself crouched among the celebrities. It’s an ebullient shot of about 20 young people swaying, waving, bobbing, and smiling. In the lower right corner, a young woman bangs her knees together and waves her hand like she’s beating eggs for a soufflé.

The comments on that post started out with a heart emoji, a “Hoooooo,” and “So fun!” Soon, though, the thread, as so often happens online, turned rancid, with particular attention focused on the young woman in the lower right. “Don’t close wait just wait OPEN them leg baby,” “cuck,” “succ,” “cuck,” “Gimme ze suc,” “Succ4succ,” “Succme,” “Go to the window and take a big L E A P out of it.” A number of comments included watermelon emoji, which, depending on context, can be racist, sexist, or part of picnic planning. The newly resurgent alt-right proclaimed over and over again that “#memelivesmatter.” There was a link in Arabic to a text page about economic opportunities in Dubai. Another user asked Systrom to follow him—“Follback @kevin.” And a few brave people piped up to offer feedback on Instagram’s recent shift to ordering posts by relevancy rather than recency: “BRING BACK THE CHRONOLOGICAL ORDER!”

Systrom is a tall, lean man with a modest bearing. His handshake is friendly, his demeanor calm. He’s now a billionaire, but he doesn’t seem to play the alpha male games of his peers. There is no yacht; there are no visits to the early primary states; there is no estranged former partner with an NDA. Systrom’s personal Instagram feed is basically dogs, coffee, bikes, and grinning celebrities. A few years ago, Valleywag described his voice as “the stilted monotone of a man reading his own obituary,” but he’s become much smoother of late. If he has a failing, his critics say, it’s that he’s a sucker: He and his cofounder, Mike Krieger, sold Instagram to Facebook too soon. They’d launched it a few years after graduating from Stanford, and it went into orbit immediately. They got $1 billion for it. Snap, which spurned an offer from Facebook, is now worth roughly $17 billion.

Systrom takes pride in this reputation for kindness and considers it a key part of Instagram’s DNA. When the service launched in 2010, he and Krieger deleted hateful comments themselves. They even personally banned users in an effort Systrom called “pruning the trolls.” He notes that Krieger “is always smiling and always kind,” and he says he tries to model his behavior after that of his wife, “one of the nicest people you’ll ever meet.” Kevin Systrom really does want to be the sunny person on display in @kevin’s feed.

So when Systrom returned from VidCon to Instagram’s headquarters, in Menlo Park, he told his colleagues that they had a new mission. Instagram was going to become a kind of social media utopia: the nicest darn place online. The engineers needed to head to their whiteboards. The next image he posted on Instagram, just before Independence Day, was of some sumptuous homemade pretzels.

“Nice buns like yur mum,” @streamlinedude commented. @Juliamezi added, “If you stop reading this you will die.” She, or it, then added, oddly, “If u don’t post this on 20 photos I will sleep with you forever.”

Technology platforms, the conventional wisdom now goes, are not neutral. Their design and structure encourage certain behaviors, and then their algorithms control us even more. We may feel like we’re paddling our own boats, but the platform is the river and the algorithms are the current.

As the CEO of a service with 700 million users, Systrom recognizes that he’s something like the benevolent dictator of a country more than twice the size of the US. The choices he makes affect the lives of all his users—some of whom are insecure teens, some of whom are well-adjusted adults, some of whom are advertisers, and some of whom are pop singers dealing with an infestation of snakes.

In mid-July 2016, just after VidCon, Systrom was faced with just such an ophiological scourge. Somehow, in the course of one week, Taylor Swift had lost internet fights with Calvin Harris, Katy Perry, and Kim Kardashian. Swift was accused of perfidy, and her feed quickly began to look like the Reptile Discovery Center at the National Zoo. Her posts were followed almost entirely by snake emoji: snakes piled on snakes, snakes arranged numerically, snakes alternating with pigs. And then, suddenly, the snakes started to vanish. Soon Swift’s feed was back to the way she preferred it: filled with images of her and her beautiful friends in beautiful swimsuits, with commenters telling her how beautiful they all looked.


This was no accident. Over the previous weeks, Systrom and his team at Instagram had quietly built a filter that would automatically delete specific words and emoji from users’ feeds. Swift’s snakes became the first live test case. In September, Systrom announced the feature to the world. Users could click a button to “hide inappropriate comments,” which would block a list of words the company had selected, including racial slurs and words like whore. They could also add custom keywords or even custom emoji, like, say, snakes.
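The mechanics behind that first feature are simple enough to sketch. What follows is a hypothetical illustration in Python, not Instagram’s actual code; the blocklist entries and function names are invented for the example. It checks each comment against a default list of blocked words plus whatever custom keywords or emoji a user has added:

```python
# Hypothetical sketch of a keyword/emoji comment filter. Not Instagram's code;
# the default blocklist below is illustrative only.
import re

DEFAULT_BLOCKLIST = {"whore", "cuck", "succ"}

def build_filter(custom_terms=()):
    """Return a predicate that flags comments containing blocked terms."""
    terms = DEFAULT_BLOCKLIST | {t.lower() for t in custom_terms}
    # Plain words are matched on word boundaries (so "succeed" isn't caught
    # by "succ"); emoji and anything else are matched as substrings.
    words = [t for t in terms if t.isalpha()]
    substrings = [t for t in terms if not t.isalpha()]
    word_re = re.compile(r"\b(" + "|".join(map(re.escape, words)) + r")\b") if words else None

    def is_blocked(comment):
        text = comment.lower()
        if word_re and word_re.search(text):
            return True
        return any(s in text for s in substrings)

    return is_blocked

# A user who opted in and added the snake emoji as a custom term:
hide = build_filter(custom_terms=["🐍"])
print(hide("look at you 🐍🐍🐍"))  # True: comment hidden
print(hide("So fun!"))             # False: comment shown
```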

The engineers at Instagram were just getting started. In October, the service launched a series of tools that roughly model what would happen if an empathetic high school guidance counselor hacked your phone. If you type in the word suicide, you’ll be met first with a little box that says, “If you’re going through something difficult, we’d like to help.” The next screen offers support, including a number for a suicide-prevention hotline. Two months later, in December 2016, the company gave users the ability to simply turn off commenting for any given post. It’s for those times when you want a monologue, not a conversation.

A cynic may note that these changes are as good for business as they are for the soul. Advertisers like spending money in places where people say positive things, and celebrities like places where they won’t be mocked. Teenagers will make their accounts public if they feel safe, and if their parents don’t tell them to get off their phones.

Still, talking to people at the company, from Systrom on down, you get the sense that this is a campaign felt in the heart, not just in the pocket. Nicky Jackson Colaco, Instagram’s director of public policy, speaks of her own children and the many teenagers whose first experience with the swamp of social media is on Instagram. “I think what we’re saying is, we want to be in a different world,” she says.

But Instagram can’t build that world with relatively simple technical fixes like automated snake emoji deletion. So, even amid a bevy of product launches last fall, Instagram’s engineers began work on something much more complex.

Trying to sort rubbish from reason on the internet has long been a task for humans. But thanks to artificial intelligence, the machines are getting better at the job. Last June, around the time Systrom visited VidCon, Facebook announced that it had built a tool to help computers interpret language. The system, named DeepText, is based on a machine learning concept called word embeddings. When the system encounters a new word, it tries to deduce meaning from the other words around it. If a watermelon emoji is always surrounded by right-wing memes, that means something. The more data the classification engine analyzes, the smarter it gets. Like us, it learns over time; unlike us, it doesn’t get exhausted or depressed reading the word cuck 72 times in a row.
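DeepText’s internals aren’t public, but the distributional idea behind word embeddings—a word is characterized by the company it keeps—can be shown with a toy example. The sketch below, in Python with NumPy, is an assumption-laden simplification rather than anything resembling Facebook’s system: it builds a co-occurrence matrix from a handful of made-up comments and shows that words used in similar contexts end up with similar vectors.

```python
# Toy illustration of the idea behind word embeddings, not DeepText itself:
# words that appear in similar contexts end up with similar vectors.
import numpy as np

corpus = [
    "gimme ze succ",
    "succ4succ follback please",
    "cuck memes everywhere",
    "watermelon memes everywhere",
    "so fun love this photo",
    "love this photo so much",
]

# Build a word-by-word co-occurrence matrix within each comment.
vocab = sorted({w for line in corpus for w in line.split()})
index = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))
for line in corpus:
    words = line.split()
    for w in words:
        for c in words:
            if w != c:
                counts[index[w], index[c]] += 1

def similarity(a, b):
    """Cosine similarity between two words' context vectors."""
    va, vb = counts[index[a]], counts[index[b]]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9))

# "cuck" and "watermelon" share contexts ("memes", "everywhere"), so their
# vectors are close; "cuck" and "fun" share no context, so they are far apart.
print(similarity("cuck", "watermelon"))
print(similarity("cuck", "fun"))
```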

One way to think of DeepText is that it’s like the brain of an adult whose entire memory has been wiped and who will now devote themself to whatever linguistic task you assign. Facebook essentially has an icebox filled with these empty brains, which it gives to its engineering teams. Some are taught to recognize whether a Messenger user needs a taxi; others are taught to guide people selling bikes on Marketplace.


After learning about DeepText, Systrom realized that his engineers could train it to fight spam on Instagram. First, though, like a child learning language, it would need some humans to teach it. So Systrom gathered a team to sort through massive piles of bilge, buffoonery, and low-grade extortion on the platform.

They labeled comments as spam or not spam and then fed everything into DeepText. The machines studied the categories and came up with rules to identify genuine economic offers in Dubai and whether it’s a friend or a bot who wants a follow-back. Once DeepText was able to classify spam with sufficient accuracy, the engineers signaled the go-ahead, and the company quietly launched the product last October.
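Instagram hasn’t published that training pipeline, but the workflow described here—humans label comments, a model learns from the labels, new comments get scored—can be approximated with off-the-shelf tools. The sketch below swaps DeepText for a far simpler scikit-learn text classifier and uses a made-up handful of labeled comments; it is a stand-in, not the company’s method.

```python
# A simplified stand-in for the workflow described above: humans label
# comments, a classifier learns from them, and new comments are scored.
# scikit-learn's TF-IDF + logistic regression replaces DeepText here,
# which is a deliberate simplification.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

labeled_comments = [
    ("Follback @kevin", "spam"),
    ("Great economic opportunities in Dubai, click here", "spam"),
    ("Succ4succ", "spam"),
    ("So fun!", "not_spam"),
    ("Love this photo", "not_spam"),
    ("BRING BACK THE CHRONOLOGICAL ORDER!", "not_spam"),
]

texts, labels = zip(*labeled_comments)
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# In practice the filter would only ship once accuracy on held-out,
# human-labeled comments cleared an agreed threshold.
for comment in ["follback please", "what a fun day"]:
    print(comment, "->", model.predict([comment])[0])
```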

Then Systrom had an even more complicated idea: What if Instagram could use DeepText to knock out mean comments? Forget about the succs and the follbacks. Could the AI learn to filter out more ambiguous content? “Go to the window and take a big L E A P out of it” is definitely hostile, but it doesn’t include any particularly hostile words. “Don’t close wait just wait OPEN them leg baby” is gross. But can a computer tell? “Nice buns like yur mum” is rude and off topic. But it could be charming if it came from a childhood friend who truly appreciated your mother’s German pretzels.

Other social media companies had worked to filter spam, but Instagram’s new plan to make the whole platform kinder was vastly more ambitious. Systrom told his team to press ahead.
