Do Social Media Algorithms Polarize Us? Maybe Not.

By Elizabeth Nolan Brown

A safer, saner social media world is possible, former Facebook product manager Frances Haugen told members of Congress in 2021. Instead, she said, leaders at the social media company chose engagement over democracy, using algorithms that kept people glued to the site but also angry, anxious, and ill-informed.

Haugen's diagnosis of the cause of our current political dysfunction (social media algorithms) and her cure ("get rid of the engagement-based ranking" of content and return to displaying posts in simple chronological order) have become dogma for many politicians, members of the press, and would-be change-makers. Doing away with algorithms would also halt hate speech and misinformation, these groups insist.

But a growing body of research is casting doubt on such claims. The latest comes from a collaboration between academics and Facebook parent company Meta, which set out to explore the impact of algorithms in the lead-up to the 2020 election.

The first results from this project were published in four papers in July. Michael W. Wagner, the project's independent rapporteur, called the studies works of "independent," "rigorous," and "ethical" research.

With users' consent, researchers tweaked various elements of their Facebook feeds to see what effect the changes would have on things like political outlook and media diet. The interventions differed by study, but all revolved around assessing how Facebook's algorithms affected user experiences.

What they found cuts to the heart of the idea that Facebook could produce kinder, better informed citizens by simply relying less on engagement metrics and algorithms to determine what content gets seen. "Both altering what individuals saw in their news feeds on Facebook and Instagram and altering whether individuals encountered reshared content affects what people saw on the platforms," reported Wagner in the July 28 issue of Science. But "these changes did not reduce polarization or improve political knowledge during the 2020 US election. Indeed, removing reshared content reduced political knowledge."

In one study, led by Princeton University's Andy Guess and published in Science, select users were switched from algorithm-driven Facebook and Instagram feeds to feeds that showed posts from friends and pages they followed in reverse chronological order—just the sort of tweak social media critics like Haugen have pushed for. The shift "substantially decreased the time they spent on the platforms and their activity" and led to users seeing less "content classified as uncivil or containing slur words," concludes the study (titled "How do social media feed algorithms affect attitudes and behavior in an election campaign?").

But "the amount of political and untrustworthy content they saw increased on both platforms." Despite shifting the types of content users saw and their time spent on Facebook, the switch "did not cause detectable changes in downstream political attitudes, knowledge, or offline behavior" during the three-month study period.

Despite some limitations, "our findings rule out even modest effects, tempering expectations that social media feed-ranking algorithms directly cause affective or issue polarization in individuals or otherwise affect knowledge about political campaigns or offline political participation," the team concludes. They suggest more research focus on offline factors "such as long-term demographic changes, partisan media, rising inequality, or geographic sorting."

In another study—also led by Guess—researchers excluded re-shared content from some users' news feeds. These users wound up seeing substantially less political news ("including content from untrustworthy sources"), clicking on less partisan news, and reacting less overall. Yet "contrary to expectations," the shift didn't significantly alter "political polarization or any measure of individual-level political attitudes." Those in the experimental group also ended up less informed about the news.

"We conclude that though re-shares may have been a powerful mechanism for directing users' attention and behavior on Facebook during the 2020 election campaign, they had limited impact on politically relevant attitudes and offline behaviors," write Guess and colleagues in "Reshares on social media amplify political news but do not detectably affect beliefs or opinions," also published in Science.

In another experiment, researchers tweaked some Facebook feeds to reduce exposure to "like-minded sources" by about a third. As a result, these users indeed saw content from a more "cross-cutting" range of sources and less "uncivil" content. But this failed to alter their political attitudes or belief in misinformation.

Ultimately, the results "challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy," writes a research team led by Brendan Nyhan, Jaime Settle, Emily Thorson, Magdalena Wojcieszak, and Pablo Barberá in "Like-minded sources on Facebook are prevalent but not polarizing," published in Nature. "Algorithmic changes…do not seem to offer a simple solution for those problems."

For years, politicians have been proposing new regulations based on simple technological "solutions" to issues that stem from much more complex phenomena. But making Meta change its algorithms or shifting what people see in their Twitter feeds can't overcome deeper issues in American politics—including parties animated more by hate and fear of the other side than ideas of their own. This new set of studies should serve as a reminder that expecting tech companies to somehow fix our dysfunctional political culture won't work.
