How unflagged, factual content drives vaccine hesitancy

A new paper co-authored by computational social scientist Duncan Watts finds that factual but vaccine-skeptical content on Facebook had a far greater overall effect than ‘fake news,’ discouraging millions of people from getting the COVID-19 vaccine.

Which threatens public health more: a deliberately false Facebook post about tracking microchips in the COVID-19 vaccine that is flagged as misinformation, or an unflagged, factual article about the rare case of a young, healthy person who died after receiving the vaccine?

According to Duncan J. Watts, Stevens University Professor in Computer and Information Science at Penn Engineering and director of the Computational Social Science (CSS) Lab, along with MIT’s David G. Rand and incoming CSS postdoctoral fellow Jennifer Allen, the latter is much more damaging. “The misinformation flagged by fact-checkers was 46 times less impactful than the unflagged content that nonetheless encouraged vaccine skepticism,” they conclude in a new paper in Science.


Historically, research on “fake news” has focused almost exclusively on deliberately false or misleading content, on the theory that such content is much more likely to shape human behavior. But, as Allen points out, “When you actually look at the stories people encounter in their day-to-day information diets, fake news is a minuscule percentage. What people are seeing is either no news at all or mainstream media.”

One of the paper’s key findings is that “fake news,” or articles flagged as misinformation by professional fact-checkers, has a much smaller overall effect on vaccine hesitancy than unflagged stories the researchers describe as “vaccine-skeptical,” many of which focus on statistical anomalies that imply COVID-19 vaccines are dangerous.
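Why would less extreme content have the larger aggregate effect? In rough terms, a story’s total impact depends on both how persuasive it is per view and how many people actually see it. The sketch below illustrates that arithmetic with hypothetical numbers (they are not figures from the paper): a rarely seen but highly persuasive false post can be outweighed many times over by a widely viewed, only mildly persuasive skeptical headline.

```python
# Illustrative sketch of the reach-times-persuasion arithmetic behind the
# finding. All numbers here are hypothetical, not data from the Science paper.

def total_impact(views: float, per_view_effect: float) -> float:
    """Rough estimate: people dissuaded = exposures * persuasive effect per exposure."""
    return views * per_view_effect

# A flagged "fake news" post: strongly persuasive per view, but rarely seen.
flagged = total_impact(views=1_000_000, per_view_effect=1e-4)

# An unflagged vaccine-skeptical headline: weaker per view, but seen far more widely.
unflagged = total_impact(views=200_000_000, per_view_effect=2e-5)

print(f"flagged misinformation: {flagged:,.0f} people dissuaded")
print(f"unflagged skeptical content: {unflagged:,.0f} people dissuaded")
print(f"ratio (unflagged / flagged): {unflagged / flagged:.0f}x")
```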

“Since the 2016 U.S. presidential election, many thousands of papers have been published about the dangers of false information propagating on social media,” says Watts. “But what this literature has almost universally overlooked is the related danger of information that is merely biased. That’s what we look at here in the context of COVID vaccines.” 

As the researchers point out, quantifying the impact of misleading but factual stories exposes a fundamental tension between free expression and combating misinformation, since platforms like Facebook are unlikely to suppress mainstream publications outright. “Deciding how to weigh these competing values is an extremely challenging normative question with no straightforward solution,” the authors write in the paper.

This story is by Ian Schefler. Read more at Penn Engineering Today.