Is social media making America more polarized?

In his 2020 victory address, President Biden called for an end to what he termed this “grim era of demonization.” He forcefully urged Congress and fellow Americans to overcome their political differences.

“The refusal of Democrats and Republicans to cooperate with one another is not due to some mysterious force beyond our control,” Biden said. “It’s a decision. It’s a choice we make.”

And yet, that choice isn’t just up to US citizens or individuals on Capitol Hill. It’s also a decision for today’s largest social media company, according to a paper in the American Economic Review.

Drawing on a large field experiment, author Ro’ee Levy found rigorous evidence that Facebook’s algorithm exposes people to more news matching their own opinions, and that it may be increasing polarization.

Polarization in the United States has been on the rise for some time. As of 2014, Republicans and Democrats were more divided than at any point in the previous two decades.

Other studies have argued that this growing division drives dysfunction in Congress and undermines trust in important institutions.

Meanwhile, Facebook has emerged as a dominant source of news.

As recently as 2008, fewer than one in eight Americans consumed news on any social media site at all. By 2019, 52 percent of Americans were receiving at least some of their news on Facebook, which was more than the share getting news on all other social media platforms combined.

“In the past, everyone was really concerned about what the editor of The New York Times put above the fold. Now, we should be concerned about what Facebook’s algorithm decides to rank higher,” Levy told the AEA in an interview.

This outsized role in the news landscape hasn’t been properly reflected in recent research on media polarization, according to Levy. His work helps fill that gap with one of the largest field experiments conducted on Facebook to date.

In early 2018, Levy recruited over 30,000 people to take a survey about their political beliefs and to give access to the Facebook posts they liked, clicked, or shared.

Toward the end of the survey, Levy randomly assigned the participants to one of three groups and asked them to subscribe to up to four news outlets. A third of the participants were offered liberal outlets, such as The New York Times and MSNBC; another third were offered conservative outlets, such as The Wall Street Journal and Fox News; and the final third served as a control group that was not offered any outlets.

Nearly 2,000 participants also installed a browser extension collecting data on their Facebook feed and news-related browsing behavior.
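The random-assignment step described above can be sketched in a few lines. This is a hypothetical illustration, not the paper’s actual procedure: the arm names, seed, and per-participant seeding scheme are assumptions made for the example.

```python
import random

# Hypothetical sketch of randomly assigning ~30,000 survey participants
# to one of three equally likely arms. Names and seed are illustrative.
ARMS = ["liberal_outlets", "conservative_outlets", "control"]

def assign_arm(participant_id: int, seed: int = 2018) -> str:
    # Seed per participant so each assignment is reproducible.
    rng = random.Random(f"{seed}-{participant_id}")
    return rng.choice(ARMS)

# With 30,000 participants, each arm should end up with roughly a third.
counts = {arm: 0 for arm in ARMS}
for pid in range(30_000):
    counts[assign_arm(pid)] += 1
```

Seeding by participant ID rather than drawing sequentially means the assignment can be re-derived for any individual without replaying the whole sequence.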

Levy discovered that individuals were quite willing to subscribe to news outlets with political leanings opposite to their own and, more surprisingly, that they were much more likely than the control group to visit the websites of outlets with opposing views.

While this extra exposure to news from the other side of the political aisle didn’t change political opinions, it did change people’s attitudes toward the other side. In particular, users were less likely to report feeling hostile toward an opposing party. Levy hypothesized that participants developed a better grasp of the other side’s arguments, and so better understood why an opposing party supported certain positions.

Overall, the research indicates that people are far more willing to engage with counter-attitudinal news than previously expected.

How much does Facebook’s algorithm filter?

The gray bars in the chart below show the number of posts seen in a user’s feed in pro- and counter-attitudinal treatments. The gap between the two is broken down into the share of extra posts from Facebook’s algorithm (Algorithm), participants’ tendency to subscribe to more outlets in the pro-attitudinal treatment (Subscriptions), participants’ tendency to use Facebook less often in the counter-attitudinal treatment (Usage), and interactions between these factors (Combinations).
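The decomposition described above is, at bottom, simple accounting: the gap in posts seen between the pro- and counter-attitudinal treatments splits into the three named channels, with the interaction term as the residual. The numbers below are made up purely for illustration and are not the paper’s estimates.

```python
# Illustrative accounting for the gap decomposition described above.
# All figures are hypothetical; the paper reports the actual estimates.
pro_posts = 40.0      # avg. posts seen, pro-attitudinal treatment
counter_posts = 25.0  # avg. posts seen, counter-attitudinal treatment

gap = pro_posts - counter_posts

# Hypothetical contributions of each channel:
algorithm = 8.0       # extra posts ranked higher by Facebook's algorithm
subscriptions = 4.0   # more outlets subscribed to in the pro arm
usage = 2.0           # less Facebook use in the counter arm

# Whatever the three channels don't account for is their interaction.
combinations = gap - (algorithm + subscriptions + usage)

shares = {name: value / gap for name, value in {
    "Algorithm": algorithm,
    "Subscriptions": subscriptions,
    "Usage": usage,
    "Combinations": combinations,
}.items()}
```

By construction the four shares sum to one, which is what lets the chart present them as an exhaustive breakdown of the gap.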

However, Levy’s study also provides strong evidence that Facebook’s algorithm currently tailors users’ feeds in a way that filters out differing views—even if a user subscribes to a counter-attitudinal news page—creating a so-called “filter bubble.”

For instance, a liberal subscribing to MSNBC’s Facebook page will see more posts from MSNBC than a liberal subscribing to Fox News will see from Fox News. This is in stark contrast to traditional media outlets; cable providers don’t cut Fox News programming for liberals who watch MSNBC.

Levy’s work suggests that Facebook may be exacerbating polarization. But it also shows how to help social media companies and users choose a more balanced news feed.

“I didn’t force anyone to read anything. All I did was kind of nudge them . . . and they developed positive attitudes toward the other party,” Levy said. “This is something that can easily be replicated in other settings. I find that encouraging.”

“Social Media, News Consumption, and Polarization: Evidence from a Field Experiment” appears in the March issue of the American Economic Review.