Facebook’s struggles with ‘fake news’ underscore the urgent need for a new social media legal framework, experts warn.
“It’s an enormous problem,” Keith Altman, a lawyer at 1-800 Law Firm, told FoxNews.com. “It’s the distribution, the infrastructure of these sites that allow the misinformation to be disseminated.”
Earlier this week Facebook CEO Mark Zuckerberg defended the social media network’s news algorithm against allegations that the company allowed ‘fake news’ to tilt the election.
“Personally I think the idea that fake news on Facebook—of which it’s a very small amount of the content—influenced the election in any way is a pretty crazy idea,” Zuckerberg said at a conference in Half Moon Bay, Calif., according to the Wall Street Journal.
Facebook’s Trending Topics fell prey to some high-profile fake stories after the social network implemented an algorithmic feed this summer. These included a false article claiming that Fox News had fired anchor Megyn Kelly and a hoax article about the Sept. 11 attacks. On another occasion, a seemingly innocent hashtag that appeared in Trending Topics linked to an inappropriate video.
While these incidents were clearly embarrassing for Facebook, social media companies are protected by Section 230 of the Communications Decency Act, which says that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
This means that online “intermediaries” that host or republish speech are protected against a number of laws that might otherwise be used to hold them legally responsible for what others say and do, according to the Electronic Frontier Foundation.
Altman, however, says the “unfettered” nature of social media sites such as Facebook, where content can be shared with vast numbers of people, necessitates a new legal framework.
“I think that [Section] 230 needs to be looked at and maybe more clearly defined,” Altman told FoxNews.com. “A better framework and accountability needs to be implemented to cause these companies to act responsibly.”
Altman is also representing the family of Nohemi Gonzalez, who was killed in the Paris terror attacks, in a lawsuit claiming that Google, Facebook and Twitter provided “material support” to Islamic State.
Eric Feinberg, a founding partner of deep web analysis company GIPEC, says that his firm has developed technology that can anticipate where fake news stories are likely lurking on the Internet.
“We deep dive into the web looking for bad patterns,” Feinberg told FoxNews.com. “We can anticipate what is bad, nefarious, on the web.”
Like Altman, Feinberg also cited the shortcomings of Section 230. “All of this is a huge problem,” he said.
With more than 1.7 billion monthly active users, Facebook’s role in society continues to be closely scrutinized. Last month, for example, civil rights groups including the American Civil Liberties Union and Black Lives Matter signed a letter to the company’s CEO, Mark Zuckerberg, urging him to clarify Facebook’s policy on content removal.
Earlier this month Facebook said that it will block a British car insurer from profiling users of the social network to decide whether they deserve a discount on their insurance.
Facebook has not yet responded to a request for comment on this story from FoxNews.com.
Follow James Rogers on Twitter @jamesjrogers