
Social media and the 2020 election

The 2016 U.S. presidential election raised significant concerns among academics, political observers, policymakers, and others about the role social media played in the spread of misinformation during the campaign and the concurrent rise in polarization. So when Andrew Guess of Princeton’s School of Public and International Affairs was given the chance to use de-identified primary data from two of the world’s most influential platforms to study their effects on the 2020 election, he leapt at it.

“I've long believed that conducting on-platform experiments to study the impact of feature changes and other interventions on real-user behavior is the holy grail for research on social media and politics,” said Guess, an assistant professor of politics and public affairs at SPIA whose research bridges political communication, public opinion and political behavior. “To that end, for several years I've been involved with efforts to facilitate data-sharing between researchers and platforms, so when this unique opportunity arose, I was fortunate to be asked to join the project team as it was coming together.”


Guess and his 16 academic collaborators joined with researchers from Meta, the parent company of Facebook and Instagram, to explore how changes in those platforms’ delivery of content affected users. The first results of the team’s work were published this week: three papers in Science and one in Nature.

Guess is the lead author on two of the Science papers. One looks at the effects of changing the feed-ranking algorithms on Facebook and Instagram, and the other explores what happens when reshared content is removed from users’ Facebook feeds.

“I was part of the academic research team that helped to develop the overarching study design, such as which data we'd collect, what questions we specifically wanted to ask and the design of the platform experiments that we'd conduct,” he said. “After the initial development period, we split into teams focusing on specific experiments. It was a very hands-on and collaborative experience.”

The U.S. 2020 Facebook and Instagram Election Study was launched when internal researchers at Meta partnered with researchers at the University of Texas at Austin and New York University. Meta’s involvement gave Guess and his collaborators unprecedented access to user data with no substantive restrictions.

The academic team proposed and selected specific research questions and study designs with Meta’s explicit agreement that the company could reject them only for legal, privacy or infeasibility reasons. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions. “Due in large part to the fact that the relevant data about users’ experiences and behaviors are locked up inside the platforms themselves, comprehensive evidence of these impacts has been scarce,” Guess said.

In “How Do Social Media Feed Algorithms Affect Attitudes and Behavior in an Election Campaign?,” Guess and his coauthors investigated the effects of Facebook’s and Instagram’s feed-ranking algorithms on 23,000 consenting Facebook users and 21,000 consenting Instagram users from October through December 2020. The team assigned a sample of users to reverse-chronological feeds, in which the newest content appears at the top, while the control group’s feeds were governed by the platforms’ default algorithms.

They found that moving users out of the algorithmic feeds “substantially decreased the time they spent on the platforms and their activities.” The move also affected the kind of content users saw, with “political and untrustworthy content” increasing on both platforms, “uncivil and hateful content” decreasing on Facebook, and “content from moderate friends and ideologically mixed audiences” rising on Facebook.

Despite these changes, the users who received chronologically ordered feeds did not report significant changes in their levels of issue polarization, affective polarization, political knowledge or other key attitudes over the course of the study.

“Algorithmic ranking systems and platform features that enable virality substantially shape people's experience on these platforms, such as the kinds of content they encounter and how they engage with it,” Guess said. “But we find limited evidence that these effects impacted people's political attitudes and behavior off-platform.”

Guess’ other Science paper, “Reshares on Social Media Amplify Political News But Do Not Detectably Affect Beliefs or Opinions,” studied the effects of exposure to reshared content during the 2020 election. A random subset of 23,000 Facebook users was assigned feeds that contained no reshared content over the study’s three months.

Removing reshares, the researchers report, greatly reduced “the amount of political news, including content from untrustworthy sources to which users [were] exposed; decrease[d] overall clicks and reactions; and reduce[d] partisan news clicks.” As in the first Science study, the team found no changes in political polarization or individual-level political attitudes in the subset who saw no reshared content. Those users did, however, display marked reductions in accurate news knowledge.

“This implies that reshares make people more knowledgeable,” said Guess, “which may seem counterintuitive since potentially viral reshared content is thought to promote misinformation — and, in fact, we present evidence that it does to some extent. But more reshared content is actually from trustworthy than untrustworthy sources, so on net people are more informed.”

The three papers published this week in Science and the one in Nature are the first from the collaboration; at least 10 more are in the pipeline. In addition, the team is giving other academic researchers access to de-identified data from the studies so they can explore their own research questions.