Changes in social media algorithms did not reduce political polarisation, study finds
A group of researchers, with cooperation from Meta, looked at content that millions of Facebook and Instagram users were shown during and after the 2020 US presidential election to analyse its impact on political views.
The research found that tweaking the platforms’ features changed the content users were shown in their feeds, but did not alter their political ideology or their opinions of the candidates.
This was one of the conclusions of the first four papers published as part of the US 2020 Facebook and Instagram Election Study, a partnership between Meta researchers and independent external academics prompted by accusations that Facebook’s and Instagram’s algorithms amplified misinformation and political polarisation.
To shed light on the topic, the researchers analysed data from millions of Facebook and Instagram users and experimented with the feed algorithms of those who agreed to take part in the project.
The team’s conclusions pointed to the existence of political echo chambers on Facebook, where conservatives and liberals are exposed to vastly different opinions and sources of information. These echo chambers could be altered by tweaking the algorithms to block re-shares, switch to a chronological feed or reduce exposure to like-minded content, the researchers found.
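To make those three interventions concrete, here is a minimal illustrative sketch of the difference between an engagement-ranked feed and a chronological, re-share-free one. The Post fields and the engagement score are hypothetical stand-ins for this example, not a description of Meta’s actual ranking system.

```python
# Illustrative sketch only: contrasts an engagement-ranked feed with the
# chronological / no-re-shares experimental conditions described above.
# All fields and scores are assumptions, not Meta's real ranking system.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int           # seconds since epoch (assumed)
    engagement_score: float  # hypothetical predicted-engagement score
    is_reshare: bool

def ranked_feed(posts):
    """Engagement-ranked feed: highest predicted engagement first."""
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts, drop_reshares=False):
    """Chronological feed: newest first; optionally drop re-shared posts,
    mirroring the 'block re-shares' experimental condition."""
    pool = [p for p in posts if not (drop_reshares and p.is_reshare)]
    return sorted(pool, key=lambda p: p.timestamp, reverse=True)

posts = [Post("a", 100, 0.9, True), Post("b", 200, 0.1, False)]
print([p.author for p in ranked_feed(posts)])                          # ['a', 'b']
print([p.author for p in chronological_feed(posts, drop_reshares=True)])  # ['b']
```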
However, the changes did not have a direct impact on the users’ political beliefs.
“We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes,” said Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University, the project’s co-leaders.
In addition to testing the algorithms, the researchers analysed the news consumption of 208 million US Facebook users. They found that many political news URLs were seen, and engaged with, primarily by either conservatives or liberals, but not both.
The findings suggest that social media users look for content that aligns with their political views and that the algorithms help by “making it easier for people to do what they’re inclined to do,” according to David Lazer, a Northeastern University professor who worked on all four papers.
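As an illustration of the kind of audience-segregation measure described above, the sketch below computes, for each news URL, the share of its audience classified as conservative. The data, labels and function names are hypothetical and stand in for the study’s actual pipeline.

```python
# Illustrative sketch: per-URL audience lean. A share near 1.0 or 0.0
# indicates a URL seen almost exclusively by one side - the echo-chamber
# pattern the researchers report. Data and labels are hypothetical.
from collections import defaultdict

# (url, viewer_leaning) pairs; leaning in {"conservative", "liberal"}
views = [
    ("news-a.example/story1", "conservative"),
    ("news-a.example/story1", "conservative"),
    ("news-b.example/story2", "liberal"),
    ("news-b.example/story2", "conservative"),
]

def audience_lean(views):
    """Return each URL's conservative audience share in [0, 1]."""
    counts = defaultdict(lambda: {"conservative": 0, "liberal": 0})
    for url, leaning in views:
        counts[url][leaning] += 1
    return {
        url: c["conservative"] / (c["conservative"] + c["liberal"])
        for url, c in counts.items()
    }

for url, share in audience_lean(views).items():
    print(f"{url}: conservative share = {share:.2f}")
```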
However, the authors did find that 97 per cent of the political news sources on Facebook identified by fact-checkers as having spread misinformation were more popular with conservatives than liberals.
“Most misinformation, as identified by Meta’s Third-Party Fact-Checking Program, exists within this homogeneously conservative corner,” the study authors said.
Meta’s global affairs president Nick Clegg said that these new studies added “to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarisation, or have meaningful effects on key political attitudes, beliefs or behaviours”.
However, the US non-profit Free Press said Meta was misrepresenting the studies.
“Meta execs are seizing on limited research as evidence that they shouldn’t share blame for increasing political polarisation and violence,” said Nora Benavidez, Free Press’ senior counsel and director of digital justice and civil rights.
“This calculated spin of these surveys is simply part of an ongoing retreat from liability for the scourge of political disinformation that has spread online and undermined free, fair and safe elections worldwide.”
The University of Konstanz’s David Garcia, who was part of the research team, stressed that the studies’ results “do not show that the platforms are not the problem, but they show that they are not the solution”.
The researchers said Meta exerted no control over their findings, but they noted that the limited time frame could have skewed the results.