Russian content on Facebook, Google and Twitter reached far more users than first disclosed, testimony says – Chicago Tribune
Facebook plans to tell lawmakers on Tuesday that 126 million of its users may have seen content produced and circulated by Russian operatives, many times more than the company had previously disclosed about the reach of the online influence campaign targeting American voters.
The company previously reported that an estimated 10 million users had seen ads bought by Russian-controlled accounts and pages. But Facebook had been silent about the spread of free content, even though independent researchers suggested it was seen by far more users than the ads were.
Tuesday’s planned disclosure, contained in draft company testimony obtained by The Washington Post ahead of three Capitol Hill hearings this week, comes as Facebook and other tech giants face mounting pressure to fully investigate the Russian campaign to influence American voters and reveal their findings to the public.
Google acknowledged for the first time Monday that it had found evidence that Russian operatives used the company’s platforms to influence American voters, saying in a blog post that it had found 1,108 videos with 43 hours of content related to the Russian effort on YouTube. It also found $4,700 worth of Russian search and display ads.
Twitter also plans to tell congressional investigators that it has identified 2,752 accounts controlled by Russian operatives and more than 36,000 bots that tweeted 1.4 million times during the election, according to a draft of Twitter’s testimony obtained by The Post. The company previously reported 201 accounts linked to Russia.
Although the Russian effort sprawled across many U.S.-based technology platforms, attention has focused most heavily on Facebook, which has faced repeated calls from lawmakers and researchers to dig more deeply into its data and disclose more of what it has found.
There have been similar calls within the company, where debates over what to reveal publicly have yielded cautious compromises that have left members of the company’s security team frustrated, according to people familiar with private conversations among Facebook employees.
Such concerns have focused on forensic evidence the security team collected about Russia’s online influence campaign that was, after months of internal company wrangling, not included in a 13-page “white paper” issued publicly in April, according to people familiar with the negotiations. The report spoke in general terms about “information operations” but included only a single page on the U.S. election and did not at any point use the word “Russia” or “Russian.”
Several independent researchers also say Facebook likely has the ability to search for data that could substantiate allegations of possible collusion between the Russian disinformation operation and the Trump campaign’s social media efforts. The possible sharing of content, the timing of social media posts and other forensic information known only to the company could help answer questions central to congressional investigations and the probe led by Special Counsel Robert Mueller.
“If there was collusion in the social media campaign between the Russians and the Trump campaign, they would have that evidence,” said Philip Howard of Oxford University’s Computational Propaganda Project. “It is a needle in a haystack for us outside researchers.”
The president and his campaign officials have denied colluding in any way with the Russians.
The push for more information is likely to emerge as an important theme during the congressional hearings Tuesday and Wednesday, when lawmakers plan to press the companies for more details.
“I hope they will be more forthcoming,” said Sen. Mark Warner, D-Va., the top Democrat on the Senate Intelligence Committee, one of three committees holding hearings on these issues this week. “I think there’s a lot more that Americans deserve to know.”
Facebook’s chief security officer, Alex Stamos, said in a statement to The Post on Monday that the company is doing everything it can to assist investigators.
“By publicly describing our understanding of information operations in April, and by fully cooperating with the various investigations into Russian interference, I’m confident that we are doing everything we can to be helpful and contribute our piece of the broader picture,” Stamos said. He did not directly respond to a question about reports of frustrations on his team.
Facebook spokesman Jay Nancarrow acknowledged the importance of probing company data for the possibility of collusion. “We believe this is a matter that government investigators need to determine, which is why we are fully cooperating with them to help them make their assessment,” he said.
Facebook has said Russia’s efforts to influence the election involved 470 accounts and pages that spent more than $100,000 on 3,000 ads that reached 10 million users. But outside researchers have said for weeks that free posts almost certainly reached much larger audiences – a point that Facebook will concede in its testimony on Tuesday.
Facebook’s general counsel, Colin Stretch, plans to tell the Senate Judiciary Committee that between 2015 and 2017, a single Russian operation in St. Petersburg generated about 80,000 posts and that roughly 29 million people potentially saw that content in their news feeds.
Because those posts were also liked, shared and commented on by Facebook users, the company estimates that as many as 126 million people may have seen material in their news feeds that originated from Russian operatives, which was crafted to mimic American commentary on politics and social matters such as immigration, African American activism and the rising prominence of Muslims in the United States.
Stretch plans to characterize that content as a tiny fraction of what users see every day in their Facebook news feeds.
The company has long sought to play down the impact of manipulation of its platform during the 2016 campaign. Chief executive Mark Zuckerberg initially dismissed the importance of phony news reports spreading unchecked on Facebook, saying it was “a pretty crazy idea” to suggest that “fake news” could have affected the outcome of the election. He later apologized for the remark.
But from the first days after the election, many employees expressed frustration and dismay that a social media platform they had built helped elect a president many of them disliked deeply, according to current and former employees and others familiar with internal company conversations.
Some Facebook employees also expressed regret that the company had removed human editors from the “trending topics” feature seen in the news feeds of users after allegations surfaced several months before the November election about supposed liberal bias in how stories were selected and portrayed. Company officials, reluctant to be seen as favoring one part of the political spectrum, bowed to demands from conservatives for changes.