TV news top driver of political echo chambers in U.S.

Duncan Watts and colleagues found that 17% of Americans consume television news from partisan left- or right-leaning sources, compared to just 4% online. For TV news viewers, this audience segregation tends to persist month over month.


In the lead-up to the 2016 presidential election, many maligned the echo chambers they believed they saw taking over social media: People with like-minded friends were all sharing content with the same political bent, amplifying a singular set of messages, and leading to greater polarization overall.

But according to research led by Penn Integrates Knowledge University Professor Duncan Watts, just 4% of Americans actually fall into such echo chambers online. The number for television, however, is much higher, with 17% of people in the United States consuming TV news from partisan left- or right-leaning sources alone, news diets they tend to maintain month over month.

The team, which included Watts, Homa Hosseinmardi, and doctoral student Baird Howland of Penn, Daniel Muise of Stanford University, and Markus Mobius and David Rothschild of Microsoft Research, shared their findings in Science Advances. The researchers say the results point to television as the top driver of partisan audience segregation among Americans.

“If you think about where people are getting their news, it’s five to one from TV versus online,” says Watts, who has appointments in Penn’s School of Engineering and Applied Science, Annenberg School for Communication, and Wharton School. “If you’re worried about why people believe what they believe, you really need to look at what they’re watching on television.”

Comparing TV and online news

The work is part of the Penn Media Accountability Project from Watts’ Computational Social Science Lab. “One of our main goals is to take a broad view of the information ecosystem and ask questions about the relationship between information and misinformation,” Watts says. “The media produces information, people encounter and consume it, and it has some effect on them, which may or may not result in increasing polarization or diminishing trust in institutions.”

Penn Integrates Knowledge Professor and Stevens University Professor Duncan Watts.

A paper he and colleagues published in 2020 in Science Advances showed that television news viewing far outpaces news consumption online. Yet no one had closely analyzed the former on its own or in comparison to social media and other Internet-based news sources. So, the Penn team decided to look at source diversity of TV news viewing.

They wanted to understand whether people watched a mix of programs with a range of political angles or only those that aligned with their own beliefs. In other words, did audiences fall into echo chambers? The researchers defined an echo chamber in this case as a news diet in which more than 50% of consumption comes from either left-leaning or right-leaning sources alone.
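The definition above is simple enough to sketch in code. This is a minimal, illustrative calculation only, assuming toy viewing data and hypothetical source labels; the study assigns its own partisan bias labels and works with billions of Nielsen viewing events.

```python
# Sketch of the echo-chamber definition used in the study: a viewer is in an
# echo chamber if more than 50% of their news consumption comes from sources
# sharing a single partisan lean. Labels below are illustrative, not the paper's.
from collections import Counter

SOURCE_LEAN = {
    "msnbc": "left",
    "cnn": "left",
    "fox": "right",
    "abc": "center",
    "cbs": "center",
    "nbc": "center",
}

def echo_chamber_lean(viewing_events):
    """Return 'left' or 'right' if >50% of events share that lean, else None."""
    leans = Counter(SOURCE_LEAN[source] for source in viewing_events)
    total = sum(leans.values())
    for lean in ("left", "right"):
        if leans[lean] / total > 0.5:
            return lean
    return None

# A viewer whose diet is mostly one partisan side counts as in an echo chamber.
print(echo_chamber_lean(["fox", "fox", "fox", "abc"]))    # 'right' (75% right)
print(echo_chamber_lean(["abc", "cbs", "fox", "msnbc"]))  # None (no majority lean)
```

Applied per person per month across a panel, a tally like this yields the population shares (17% on TV, 4% online) the researchers report.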

Though the 2016 presidential election had heightened general awareness of the concept, a debate within the research community had arisen several years prior, following publication of a book called “The Filter Bubble.” “The thesis of that was that online algorithms steer users into these filter bubbles where they only get information that confirms what they already believe,” Watts says. “This was a claim that struck many people as plausible, and the idea of filter bubbles became commonplace.”

Yet subsequent research found they weren’t actually all that common online. “Filter bubbles or echo chambers, to the extent they exist online, apply at most to a small percentage of the population,” Watts says. “But no one had compared online to TV.”

Conducting the analysis

The Nielsen Company collects data from hundreds of thousands of people each month. Using data sets from two Nielsen panels—a TV panel that monthly includes about 85,000 Americans and a computer Web-browsing panel with about 60,000—the researchers amassed more than 3 billion unique viewing and browsing events that took place between 2016 and 2019. They focused on national news programming, excluding local TV news but including syndicated content.

They then assigned what they called “partisan bias labels” to websites and television stations, with MSNBC and Fox falling on either end of the spectrum, and news content from ABC, CBS, and NBC falling mostly in the middle. Analyzing the data, the researchers came to four conclusions.


“First, TV looks much worse than online. About 17% of the population is in either a left-leaning or right-leaning filter bubble,” Watts says. “Roughly half are in the MSNBC/CNN world, and Fox definitely constitutes an echo chamber, with about 8.5% of the population.” The second result, an extension of the first, finds that although these echo chambers are generally transient, they stick around longer for television news than online; there’s about a 25% chance of a partisan TV news diet lasting six months, but just a 5% chance the same will hold for online news consumption.

Third, TV viewers’ news diets are far more concentrated on preferred sources than those of online audiences. “It’s not just that people are in these echo chambers, it’s that they’re consuming a lot more news than the people who are not in them. It’s only 21% of the population but it’s 60% of news consumption,” Watts says. “Online, it’s not such a big deal. Even the people who are in echo chambers are not as echo chamber-y as the TV ones.” Finally, although the television news audience is contracting, those who remain are moving from the center to the extremes.

Implications of the findings

Watts argues that nothing can change and no interventions can be created without a better understanding of the actual problem. “All of the attention focused on social media algorithms looks somewhat overblown in light of these findings,” he says. “Millions of people in America consume a lot of Fox and not very much of anything else. If Fox is serving up propaganda and misleading content, that seems like a bigger thing to worry about to me than Facebook’s newsfeed algorithm. What we’re trying to say is, ‘Look over here.’”

That doesn’t mean ignoring social media altogether; 4% of the American population still equates to millions of people.

Yet Watts and his colleagues see that as a small number, relatively speaking. “Seventeen percent is a lot bigger than 4%, and 17% of the population is a pretty substantial voting bloc. But compared to 100%, even 17% is not a big number. The majority of people in America still get a somewhat balanced portfolio of news.”

In the future, Watts says he hopes to train the spotlight on talk radio and local news, two areas where little research has been done, mostly because they’re challenging to study. “We need a much better way to measure what’s being said in all of these environments and how it’s affecting people’s beliefs,” he says. “We’re saying we should be asking a different question.”

Duncan Watts is the Stevens Penn Integrates Knowledge University Professor at the University of Pennsylvania. He holds faculty appointments in the Annenberg School for Communication, the Department of Computer and Information Science in the School of Engineering and Applied Science, and the Department of Operations, Information, and Decisions in the Wharton School, where he is the inaugural Rowan Fellow. He also runs the Computational Social Science Lab.

Other contributors to the research included Homa Hosseinmardi, an associate research scientist in the Computational Social Science Lab, and Baird Howland, a doctoral student in the Annenberg School for Communication, both of Penn; Daniel Muise of Stanford University; and Markus Mobius and David Rothschild of Microsoft Research.

Funding for this research came from Carnegie Corporation and Wharton alum Richard Mack.