In spring 2019, Facebook researchers looked into whether the Share button helped amplify misinformation. In a report called “Deep Reshares and Misinformation,” they confirmed their suspicions.
The report noted that people are four times more likely to see misinformation when they encounter a post via a share of a share — kind of like a retweet of a retweet — compared to a typical photo or link on Facebook. Add a few more shares to the chain, and people are five to ten times more likely to see misinformation. It gets worse in certain countries. In India, people who encounter “deep reshares,” as the researchers call them, are twenty times more likely to see misinformation.
“Our data,” the researchers concluded, “reveals that misinformation relies much more on deep reshares for distribution than major publishers do.”
A simple product tweak, the research indicated, would likely help Facebook constrain its misinformation problem more than an army of content moderators — all without removing a single post. Adding some friction after the first share, or blocking resharing altogether beyond that point, could help mitigate the spread of misinformation on Facebook.
“I’m an advocate of significant friction around sharing,” Aviv Ovadya, a misinformation researcher and founder of the Thoughtful Technology Project, told me. “This analysis supports that conclusion, that sharing as functionality, especially beyond one’s Friends of Friends, helps lower quality content more than it helps higher quality content.”
Like Twitter’s Retweet and WhatsApp’s Forward, the Facebook Share button encourages people to thoughtlessly pass along posts that exploit their emotions and biases. The WhatsApp Forward amplified so much bad information that the company slowed it down. The Twitter Retweet amplified so many links people didn’t click that the company now asks users to “read before you retweet.” Facebook put some sharing friction in place — including interstitials if you’re about to share an old article — but the research makes the need for more aggressive action clear.
What Facebook did with its research is unclear, however. The company, now Meta, didn’t respond to a request for comment. Twitter, too, did not respond to an email asking whether it had similar research. In the internal comments discussing the study, Facebook employees discussed the merits of blocking shares after a certain depth or simply using this data to assist Facebook’s teams looking for misinformation. It doesn’t appear the company took meaningful action.
This lack of action prompted Haugen’s lawyer, Lawrence Lessig, to suggest that Apple should threaten Facebook with removal from the App Store if it didn’t put limits on reshares. This action would raise all manner of anticompetitive issues and likely set a bad precedent. But Lessig, speaking on Big Technology Podcast, advocated for it nonetheless.
“Facebook regulating itself? We tried that, turns out that that doesn't work. But we can see other companies stepping up and trying to create a standard of safety,” Lessig said. “They have an extraordinary opportunity to leverage their brand around safety in a way that could actually help make the internet platform safer.”
The research report comes from Frances Haugen’s disclosures to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were obtained by a consortium of news organizations, including Big Technology. This study became available to the consortium on Wednesday.
After some delay, it appears Haugen’s documents are on their way to being made available more widely. “The plan is to get these documents in every place around the world where Facebook is affecting them,” said Lessig.
Though wonky, these less explosive documents reveal a detailed picture of how Facebook operates behind the scenes. They’re valuable tools to understand how the company works, and the ways it might ameliorate its problems. In this case, the solution is clear: rein in the share button.