
How Social Media Algorithms Are Increasing Political Polarisation

Erica Bell

Over the last two decades, Republicans and Democrats in the United States (U.S.) have become more politically divided than at any other point in recent history. A 2014 survey found that polarisation between the two parties manifests in both politics and everyday life, with the greatest increase in division occurring between 2011 and 2014. This period was also when the social media giants Facebook, Instagram, Twitter, and YouTube experienced their highest user growth. Social media algorithms are a driving force of this polarisation: by curating a user's news feed and recommendations around content tailored specifically to them, they create an online echo chamber that isolates the user from opposing viewpoints.


Social media algorithms are persuasive, influencing what information users see and, ultimately, how they vote. This is referred to as persuasive technology: technology designed with the underlying motive of modifying behaviour by exploiting users' emotional and psychological responses. Facebook's algorithm has been shown to favour content in news feeds that elicits the strongest emotional response from the user, in an effort to keep them browsing on the platform for as long as possible. Many of the unsupervised machine learning algorithms used by social media companies have likewise been designed to prioritise high-engagement content with no regard for whether it is misinformation. This helps explain why fake news has been found to spread on Twitter six times faster than legitimate news.
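To make that mechanism concrete, the core of an engagement-optimised ranker can be sketched in a few lines of Python. This is a minimal illustration only: the reaction weights, the `Post` fields, and the `rank_feed` function are assumptions invented for the sketch, not any platform's actual code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    angry_reactions: int  # proxy for a strong emotional response

# Hypothetical weights: reactions signalling strong emotion count for
# more, because they predict longer time spent on the platform.
WEIGHTS = {"likes": 1.0, "comments": 2.0, "shares": 3.0, "angry_reactions": 4.0}

def engagement_score(post: Post) -> float:
    """Score a post purely by predicted engagement.

    Note what is absent: no term for accuracy or source reliability,
    which is how high-engagement misinformation can outrank
    legitimate news.
    """
    return (WEIGHTS["likes"] * post.likes
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["shares"] * post.shares
            + WEIGHTS["angry_reactions"] * post.angry_reactions)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-engagement content first, regardless of veracity.
    return sorted(posts, key=engagement_score, reverse=True)
```

The design choice to optimise a single engagement objective, with no accuracy term, is the crux of the problem the article describes.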


A 2017 Japanese study looked at Twitter users who engaged with or followed people of differing viewpoints. It found that the two sides rarely discussed overlapping issues; because each side is concerned with different issues, users seldom cross community lines, creating an echo chamber that is only reinforced by Twitter's 'who to follow' feature. Similar dynamics exist across other platforms, such as YouTube's recommendations based on watch history and Instagram's 'Explore' page.
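The community-reinforcing effect of a feature like 'who to follow' can be illustrated with a toy friend-of-friend recommender. The follow graph and the `suggest_follows` function below are invented for illustration; they are not Twitter's actual system, which is far more complex.

```python
from collections import Counter

# Toy follow graph: user -> set of accounts they already follow.
follows = {
    "alice": {"left_news", "left_pundit"},
    "bob": {"left_news", "left_pundit", "left_blog"},
    "carol": {"right_news", "right_pundit"},
}

def suggest_follows(user: str, graph: dict[str, set[str]], k: int = 3) -> list[str]:
    """Recommend accounts followed by users with similar follow lists.

    Because similarity is computed from existing follows, suggestions
    stay inside the user's own community, reinforcing the echo chamber.
    """
    mine = graph[user]
    candidates: Counter[str] = Counter()
    for other, theirs in graph.items():
        if other == user:
            continue
        overlap = len(mine & theirs)  # shared follows = similarity
        if overlap:
            for account in theirs - mine:
                candidates[account] += overlap
    return [account for account, _ in candidates.most_common(k)]

print(suggest_follows("alice", follows))  # ['left_blog'] -- same side only
```

Alice is never shown accounts from Carol's side of the graph, because nothing in the similarity calculation rewards crossing community lines.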


Individuals who participate in homogeneous discussion groups tend to adopt more extreme positions after frequently interacting with like-minded peers. Extremist groups such as the Proud Boys, Generation Identitær, and QAnon all proliferated online between 2014 and 2017, a peak period in which social media algorithms recommended extremist and alt-right content that users were circulating frequently. Computer scientist Jaron Lanier has also proposed that Donald Trump's extremism developed in part from his Twitter addiction, arguing that the platform's algorithm is designed to modify user behaviour, making users more "hostile [and] paranoid" towards opposing sides.


Distrust of facts and journalism, both of which are necessary to protect democracy, has become an ongoing consequence of social media algorithms prioritising engagement over accuracy. A series of Pew Research polls shows that Republicans trust fewer news sources than they used to, with Fox News, Trump's speeches, and posts on social media among the few sources they regularly read and believe. Hyper-partisanship and divergent interpretations of the news are then fed back through social media, amplifying these divisions. This has also fuelled more online disinformation campaigns and movements, as was evident in the 2020 U.S. election and continues today with COVID-19.


Social media companies are now experimenting with how their algorithms can more effectively identify misinformation online. In the lead-up to the 2020 U.S. election, Twitter and Facebook significantly limited the spread of misinformation on their platforms, using enhanced machine learning to take down offending content and accounts more actively. Facebook also recently announced that it would no longer recommend political groups to users. However, algorithmic misinformation detection is far from perfect, and online extremists have circumvented content moderation by adopting more cryptic language and subtle dog whistles. The 2020 House hearing on 'Americans at Risk: Manipulation and Deception in the Digital Age' presents a strong case for a collaborative solution between governments and social media companies: cracking down on disinformation by working towards more ethical technology.
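At its simplest, the kind of text classifier such moderation systems build on can be sketched as follows. This is a toy example assuming scikit-learn and a handful of invented labelled posts; real systems train on vast fact-checked datasets and many additional signals. It also shows why coded language evades detection: the model only recognises phrasings resembling those it was trained on.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented labelled data (1 = misinformation). Production systems use
# millions of fact-checked examples plus account-level signals.
texts = [
    "miracle cure suppressed by doctors",
    "secret ballot dumps flip the election",
    "health department releases vaccination schedule",
    "election officials certify final vote count",
]
labels = [1, 1, 0, 0]

# Bag-of-words classifier: it learns which phrasings correlate with
# known misinformation, which is why cryptic wording and dog whistles
# it has never seen can slip past it.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["officials suppressed the real vote count"]))
```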


Many have suggested that people would become less polarised if they broke out of their echo chambers and followed people with opposing viewpoints. However, research shows that once group polarisation has taken hold, people tend to regard the expression of opposing viewpoints as an attack on their identity, which only affirms their negative attitude towards their political opposition. The solution to social media-induced political polarisation will likely involve major structural changes in our society: addressing inequality between groups, demanding accountability from social media companies, and fostering more mediated, productive discussions between individuals with opposing viewpoints.



Erica Bell recently graduated with a Bachelor of International Studies (Honours) from the University of Wollongong. She is currently an intern at the Australian Institute of International Affairs and has a passion for cyber security and Asia-Pacific diplomacy.


