How information spread on Facebook during and after the 2020 election

Annenberg School for Communication’s Sandra González-Bailón and colleagues analyzed the spread of over one billion Facebook posts to reveal how information flowed on the social network.

In a study published in the journal Sociological Science, Sandra González-Bailón and colleagues analyzed over one billion Facebook posts published or reshared by more than 110 million users during the months preceding and following the 2020 election.

Sandra González-Bailón, the Carolyn Marvin Professor of Communication in the Annenberg School for Communication, director of the Center for Information Networks and Democracy, and professor of sociology in the School of Arts & Sciences. (Image: Courtesy of the Annenberg School for Communication)

“Social media creates the possibility for rapid, viral spread of content,” says González-Bailón, the Carolyn Marvin Professor of Communication in the Annenberg School for Communication, director of the Center for Information Networks and Democracy, and professor of sociology in the School of Arts & Sciences. “But that possibility does not always materialize. Understanding how and when information spreads is essential because the diffusion of online content can have downstream consequences, from whether people decide to vaccinate to whether they decide to join a rally.”

The research team paid particular attention to whether political content and misinformation spread differently from other content on the platform. They also examined whether Facebook’s content moderation policies significantly affected the spread of information.

They found that, overall, Facebook Pages, rather than users or Groups, were the main spreaders of content on the platform because they broadcast posts to many users at once.

Misinformation, however, spread primarily from user to user, suggesting that the platform’s content moderation left an enforcement gap for messages passed along by individual users.

“A very small minority of users who tend to be older and more conservative were responsible for spreading the most misinformation,” González-Bailón says. “We estimate that only about 1% of users account for most misinformation re-shares. However, millions of other users gained exposure to misinformation through the peer-to-peer diffusion channels this minority activated.”

Read more at Annenberg School for Communication.