Does the internet have a fake-news problem? – CNET
Donald Trump won the popular vote.
The Clinton Foundation bought $137 million worth of illegal arms and ammunition.
An FBI agent associated with Hillary Clinton’s email leaks was found dead in a murder-suicide.
The Pope endorsed Trump.
If you saw any of these stories on the internet over the past few weeks, let’s be crystal clear: They’re not true.
But you might not have known that if you saw them on Google or Facebook. Google listed the popular-vote story prominently on Google News on Monday for those searching for election results. The other stories made the rounds, unchecked, on Facebook, where they racked up likes, shares and views.
That’s a problem because Facebook and Google are now two of the largest and most popular sites on the internet. They’re also the Hungry Hungry Hippos of digital ad dollars. Between them, they draw billions of visitors a day. So if Facebook and Google have a fake news problem, there’s an argument to be made that the internet itself has a fake news problem. And that’s not good for the general public.
“The lion’s share of our research around the role of digital and social media in the news environment suggests it’s playing a pretty significant role in the habits of America,” said Jesse Holcomb, associate director of research for the Pew Research Center in Washington, DC.
A ‘black box’
What’s more, about four out of every 10 Americans get their news online, he added. So if Google and Facebook spotlight bogus stories, those stories can gain a lot of traction.
“Google is an important source of news for people who tend to fall toward the middle of the ideological spectrum,” Holcomb said. Google is a “generally trusted source of news,” while Facebook “casts a shadow over the whole social web.”
The concern over fake news has reached a fever pitch since November 8, when Donald Trump, the Republican nominee, won the US presidency in a victory few predicted. In the aftermath of the election, his detractors have alleged that fake news circulating on Facebook and other sites helped him win.
What frustrates some media experts is how hard it is to understand the internet giants’ algorithms, the technical formulas they use to promote and share stories. Both Google and Facebook have published guides to the types of signals they use to decide what you see or don’t see online. Over the weekend, for example, Facebook CEO Mark Zuckerberg wrote a note on his personal page linking to Facebook’s blog post on how its News Feed evolves.
A Facebook spokesman declined to comment beyond Zuckerberg’s post. A Google spokeswoman said its ranking algorithms look at more than 200 different signals, but said the company doesn’t go into great detail so sites can’t game the system.
Still, researchers would like to know more.
“People don’t really know how endemic the problem really is because Facebook’s algorithms are opaque to researchers,” said Jeremy Rue, acting assistant dean at the University of California, Berkeley’s Graduate School of Journalism. “A lot of people are going to start taking a closer look.”
Mary Bock, an assistant professor of journalism at the Moody College of Communication at the University of Texas at Austin, echoed the sentiment. “Google delivers what other people have clicked on,” Bock said. Still, she said, “the algorithm is a black box.”
Not so ‘crazy’?
Zuckerberg denied that fake news on Facebook tipped the scales of the election. Last week, he called the notion a “pretty crazy idea.” And on Saturday, he wrote a lengthy note on his Facebook page reiterating that fake news amounts to less than 1 percent of all the content on the site. He didn’t say what percentage of “news” content is fake, though.
Not everyone agrees with Zuckerberg, who is insistent that Facebook doesn’t have the power to influence how people think and behave — or vote.
“The size of the Facebook news audience strikes me as a remarkable amount of influence for one company to have, especially given they don’t consider themselves a news organization,” said Pew’s Holcomb.
Some unhappy Facebook employees feel the same way. A few of them formed a task force with the purpose of questioning Facebook’s role in promoting fake news, according to a report Monday by BuzzFeed. One of those employees directly opposed Zuckerberg’s stance: “It’s not a crazy idea,” that person told BuzzFeed. The social network’s top brass have also held meetings on the fake news topic, according to the New York Times.
Google, for its part, said Monday that it made a mistake in allowing a fake story to be featured so prominently on Google News. It also said it won’t let fake news sites use its ad-selling software. The search giant plans to enforce that new rule the way it does its other policies, through a combination of automated systems and human review.
Joining the fight
Where do we go from here?
Jeff Jarvis, an author and media critic, said the most important thing for the media to do is make sure Google and Facebook aren’t in a position to kill news they deem fake. That puts too much editorial power in their hands, he argued.
Neither company seems interested in doing that.
Zuckerberg wrote on his Facebook page that the social network needs to be careful about becoming an “arbiter of truth.” Google, meanwhile, has said it will only remove search results if they contain things like illegal content or malware. That means fake news could keep getting through.
Jarvis thinks Facebook should enlist the media to help fight phony stories. Facebook could create tags for stories that debunk fake news and place them in related-stories boxes. Or, if a story has been flagged as fake a certain number of times, Facebook could add a pop-up or alert asking people whether they’re sure they want to share it, he said.
It’s also important to remember that this is the internet. There will always be bad actors.
“Any system can be gamed,” Jarvis said.
For now, that means being aware of what you’re reading and what’s in your news feed. That news may not be news at all.
CNET’s Connie Guglielmo, Terry Collins and Joan Solsman contributed to this report.