
From Echo Chambers to Extreme Views: The Mechanics of Online Polarization

  • Writer: Vusi Kubheka
  • Jun 27, 2024
  • 10 min read

Updated: Jul 5, 2024


Research has extensively explored news consumption on social media and its polarizing effect on public opinion. The Reuters Institute reported that 30% of participants access news through social media, while Garrett's (2009) study found that about 63% of internet users did the same [1]. Evidence shows that social media users increasingly inhabit echo chambers, driven by selective content consumption and platform algorithms. This has eroded civility, as seen in events like the July 2021 unrest in Gauteng and KwaZulu-Natal and in the politicization of public health issues such as Covid-19 vaccine hesitancy, fuelled by anti-vaccination conspiracy theories. Research asserts that a lack of diverse, multi-perspective, evidence-based information can have dangerous ramifications for society: it incites the spread of misinformation, facilitates political polarization, and promotes social extremism.




"The lack of exposure to differing opinions also generates a false perception of consensus, thus creating a different perception of reality across groups."


With the internet providing users unprecedented access to information, and with research showing that individuals' decisions about which news to consume are informed by their political beliefs, scholars warn that this increased choice of information and opinions leads individuals to exclude exposure to alternative viewpoints [2, 3]. When people can choose how and where they obtain information, they tend to prefer information that closely aligns with their opinions and ideological beliefs and to avoid information that challenges their views [2, 3]. This fragmentation, enabled by social media, means that political communication among active internet users has succumbed to 'insular', homogeneous communities divided along partisan lines – commonly described as 'echo chambers'.



What Are Echo Chambers and How Do They Form?


Echo chambers can be understood as environments in which an individual's belief, opinion or political sentiment about a specific topic is reinforced through repeated interactions with peers who hold similar perspectives. Two conditions are needed for an echo chamber to emerge. Firstly, there must be a group of individuals who share a common opinion or attitude, in opposition to other individuals, regarding the same topic. Secondly, there must be social interactions that disseminate information about the topic among these individuals and can influence their beliefs on the subject. Such interactions are more likely between individuals who share similar opinions – where there is a certain degree of homophily. Therefore, echo chambers are defined by the coexistence of two elements: 1) opinion polarization with respect to a particular topic, and 2) a degree of homophily in individuals' social interactions (homophily being the preference to form strong social connections with people who share one's defining characteristics) [4].
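These two defining elements can be made concrete in a toy sketch. Everything below – the network size, opinion values and connection probabilities – is an illustrative assumption rather than an empirical model: agents hold an opinion on a single topic, homophilous wiring makes ties between like-minded agents far more likely, and we then measure both elements directly.

```python
import random
from statistics import mean

# Hypothetical toy network: each agent holds an opinion in [-1, 1] on one
# topic, and edges are the social interactions through which information
# spreads. (All numbers here are illustrative assumptions.)
random.seed(1)
opinions = {i: random.choice([-0.8, -0.6, 0.6, 0.8]) for i in range(20)}

# Homophilous wiring: agents connect mostly to others on the same side.
edges = []
for i in opinions:
    for j in opinions:
        if i < j:
            same_side = opinions[i] * opinions[j] > 0
            if random.random() < (0.5 if same_side else 0.05):
                edges.append((i, j))

# Element 1: opinion polarization -- two camps, both far from neutral.
polarization = mean(abs(o) for o in opinions.values())

# Element 2: homophily -- the share of ties joining like-minded agents.
homophily = mean(1 if opinions[i] * opinions[j] > 0 else 0 for i, j in edges)

print(f"mean |opinion| = {polarization:.2f}, "
      f"like-minded tie share = {homophily:.2f}")
```

Both numbers come out high under this wiring: the population is split into two distant camps, and most social ties stay inside a camp – the coexistence that defines an echo chamber.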


Selective exposure, confirmation bias and cognitive dissonance are the dominant patterns shaping how individuals consume content on social media [4]. This has created an environment in which individuals are surrounded by people whose opinions agree with their own. Group polarization theory tells us that an echo chamber can operate as a mechanism that reinforces existing opinions within a group, consequently steering the entire group towards more extreme positions [4]. The lack of exposure to differing opinions also generates a false perception of consensus, thus creating a different perception of reality across groups. Without common ground on which to operate, democratic debate is likely to be inhibited.



Understanding Echo Chambers through Selective Exposure, Confirmation Bias, and Cognitive Dissonance


These concepts are not new and existed before the prominence of the internet. Even before modern social media algorithms, Garrett's (2009) study showed that when individuals could select among news items representing a range of political opinions, they consistently chose items that supported their own positions and spent more time reading them. These results align with the many studies on selective exposure theory – the claim that individuals tend to intentionally prefer information and opinions that confirm or reinforce their perspectives, and to ignore or take weaker interest in information or opinions that refute them.





This behaviour is rooted in cognitive dissonance theory: the idea that individuals are naturally inclined to achieve cognitive equilibrium, or consistency, by reducing cognitive dissonance. More simply, individuals avoid information or stimuli that are incongruent with their pre-existing worldview, because the inconsistency between their mental representation of the world and a contrasting environment is inherently uncomfortable. To avoid or reduce this discomfort, individuals seek to align any new information with their pre-existing beliefs. Bessi et al. (2015) suggest that this avoidance of cognitive dissonance may be the innate root cause of echo chamber formation. In addition to filtering which information they engage with, individuals assign more weight or validity to information that supports their pre-existing beliefs – otherwise known as confirmation bias. Together, these cognitive mechanisms direct social media users' decisions about whether to spread content.
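The asymmetry at the heart of confirmation bias – fully crediting confirming evidence while discounting dissonant evidence – can be sketched in a few lines. The weights and step size below are arbitrary assumptions chosen for illustration; the point is only that a perfectly balanced evidence stream still drives the belief toward certainty.

```python
# Hypothetical sketch of confirmation bias as asymmetric belief updating.
# The step size and weights are illustrative assumptions, not fitted values.
def update(belief, supports, weight_confirming=1.0, weight_dissenting=0.3):
    """Nudge a belief in [0, 1] up or down, discounting dissonant evidence."""
    step = 0.05
    if supports == (belief >= 0.5):   # evidence agrees with the current lean
        w = weight_confirming
    else:                             # dissonant evidence is under-weighted
        w = weight_dissenting
    direction = 1 if supports else -1
    return min(1.0, max(0.0, belief + direction * step * w))

belief = 0.55                         # a slight initial lean
evidence = [True, False] * 50         # perfectly balanced evidence stream
for supports in evidence:
    belief = update(belief, supports)

print(f"belief after balanced evidence: {belief:.2f}")
```

With equal weights, the same stream would leave the belief roughly where it started; the drift toward certainty comes entirely from the asymmetry.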


Even though advances in ICT and the growing prominence of social media enable, on paper, a "more robust and pluralistic form of public debate", the reality is that greater exposure to information and opinions is leading individuals to practise more selective exposure and confirmation bias [3]. Psychological accounts of ideological homophily suggest that dissonance reduction, identity maintenance, and motivated reasoning are highly general and prevalent across the population [5]. Decision science offers a further explanation: most people rely on heuristics to understand or solve problems, and heuristics are inherently prone to systematic biases in perceptions of fact. People tend to seek out and evaluate evidence in ways that support their existing positions, or those of others who share their ideological views. Some psychologists suggest that these effects are intensified by dogmatism, an aversion to complexity, and political conservatism (along with the trait of resisting revision of one's beliefs in the face of empirical evidence) [5].





Yet the formation of online communities is not inherently the problem. The problem arises when partisan individuals in these communities become impermeable to outside opinions and information, which intensifies those individuals' "opinions as a result of being exposed to more homogeneous viewpoints and fewer credible opposing opinions" [6].



How Algorithms Shape Echo Chambers and Polarize Users


By accounting for users’ preferences, social media algorithms mediate and facilitate content promotion and thus the spread of information [7]. In their comparative analysis of four social networking platforms (Facebook, Twitter, Reddit and Gab), Cinelli et al. (2020) found that platforms whose algorithms harness social feedback may intensify polarization and enable the emergence of echo chambers [8].


Social media algorithms and humans’ limited attention combine to shape what we are exposed to and what we select, by promoting content similar to what we have already seen – diminishing content diversity and, consequently, driving polarization [8]. The Pew Internet and American Life Project's startling finding that 26% more U.S. internet users chose to consume political information aligned with their political beliefs in 2008 than in 2004 hints at the potential extent of this polarization [6]. These algorithms promote content that adheres to users’ pre-existing worldviews and subdue dissenting information, contributing to the formation of polarized groups organised around shared narratives [7]. In these echo chambers, misinformation can proliferate more easily. With the World Economic Forum identifying digital misinformation as one of the main threats to society, this ‘infodemic’ is further muddied by the impact of bots and automated accounts [9]. All of this has a profound impact on policymaking, political communication and the evolution of public debate [7].
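The feedback loop just described – engagement-conditioned ranking feeding one-sided engagement – can be caricatured in a short simulation. This is a deliberately simplified assumption, not any platform's actual ranking code: content sits on an opinion axis in [-1, 1], the feed boosts items similar to what the user recently engaged with, and the user engages with whatever shown item sits closest to their own view.

```python
import random

# Toy feedback loop (an illustrative assumption, not a real ranking system).
random.seed(2)
USER_VIEW = 0.7
history = [random.uniform(-1, 1) for _ in range(5)]  # early, diverse exposure

def feed(history, pool=50, n=10):
    """Rank a random candidate pool by similarity to recent engagement."""
    center = sum(history[-5:]) / 5
    candidates = [random.uniform(-1, 1) for _ in range(pool)]
    return sorted(candidates, key=lambda x: abs(x - center))[:n]

for _ in range(30):
    shown = feed(history)
    # engagement: the user clicks the shown item nearest their existing view
    history.append(min(shown, key=lambda x: abs(x - USER_VIEW)))

print(f"early exposure centred near {sum(history[:5]) / 5:+.2f}, "
      f"latest engagement at {history[-1]:+.2f} (user view {USER_VIEW:+.2f})")
```

Under these assumptions the ranking and the engagement lock each other in: within a few dozen iterations the user engages almost exclusively with content at their own position, even though every candidate pool is drawn uniformly from the full opinion spectrum.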



Photo by Marten Bjork on Unsplash




The Influence of Echo Chambers on Mis/Disinformation and Public Perception


Schmidt et al.'s (2017) study found that the selective exposure created by Facebook’s algorithm plays a crucial role in shaping users’ engagement with posts from news outlets. Users tended to focus on a very limited set of pages, creating a distinct community structure around these outlets. The study suggested that the main driver of misinformation dissemination was the polarization of users around specific narratives, rather than the lack of fact-checking certifications.


This observation is shared by Del Vicario et al. (2016), who found that users exposed to a greater number of unsubstantiated rumours become more “credulous” (gullible; willing to believe or trust too readily, especially without adequate evidence). These authors suggest that the direct path from producers to consumers of content on social media contributes to users' confusion about causation, and thus encourages speculation, rumours, and mistrust [9, 10]. They argue that we have shifted from an environment mediated by journalists to a disintermediated selection process in which users exercise their agency to select information that fits their pre-existing belief systems and group themselves with like-minded individuals, where their opinions polarize further [11].


Research shows that within these echo chambers, confirmatory information is accepted even when it contains deliberately false claims, while dissenting information is mostly ignored – or may even increase group polarization. Users’ desire to maximise their engagement exacerbates and complicates the situation. In such a disintermediated environment, public debate is littered with misleading information that influences important decisions, hinders consensus on socially relevant issues, and fosters speculation, rumour, mistrust and conspiracy thinking [9, 11].




How Online Popularity Fuels Partisan Divides


Jiang, Ren, & Ferrara (2021) linked online prominence with partisanship. Their results showed that partisan right-leaning users stood out as more vocal, more active, and more impactful than their left-leaning counterparts. These findings accord with previous research exposing the “price of bipartisanship”: bipartisan users must relinquish some of their online influence if they choose to share information from both factions. Similarly, Garibay et al. (2019) showed that users who employ polarization tactics are able to maintain their influence (the phenomenon of “being edgy for clicks” actually works) [12]. Users are thus aware of the incentive to capitalize on their partisanship to maintain or increase their online popularity, which further propagates polarization. Information disseminated by highly polarized and influential users has a privileged ability to reinforce existing political predispositions, and any polarized misinformation spread by influencers is more likely to be amplified [13].



The Role of Echo Chambers in Shaping Public Health Debates


Even though COVID-19 is mostly in the domain of public health, several researchers have found strong evidence of echo chambers on this topic at both ends of the political spectrum. According to Jiang, Ren, & Ferrara’s (2021) study, tweets from right-leaning users were almost exclusively retweeted by users who were also right-leaning, while left-leaning and neutral users had a more balanced distribution of retweeter polarity. Through random simulations, the researchers found that information rarely passed into or out of the right-leaning echo chamber, sustaining a small but intense political bubble. By contrast, left-leaning and neutral users were far more receptive to information from each other. Accordingly, when comparing popular far-left and far-right users, the study showed that users who were popular among the right were popular only with the right, whereas users who were popular among the left were popular among all users [13].


Even with Twitter’s commendable efforts to tackle misinformation and promote fact-checking, this study shows that effective online conversation is hindered and manipulated by communication bubbles divided along lines of polarization and the popularity it sustains. In particular, Jiang, Ren, & Ferrara (2021) observed that it is especially difficult to get information through the right-leaning echo chamber. This presents unique challenges for public figures and health officials outside that echo chamber seeking to communicate information effectively [13].





The Connection Between Echo Chambers, Misinformation, and Xenophobic Violence


Chenzi (2021) argues that misinformation is increasingly becoming a core driver of rising xenophobia in South African urban settings. When faced with sensational societal risks, social media platforms are wielded as weapons to form online communities that voice concerns around nativeness, inclusion and exclusion, migration, and belonging. These communities consist of individuals who are more receptive to information confirming their concerns about foreigners [14]. The resulting tension between South Africans and foreign nationals has culminated in several cases of xenophobic violence – recognised by cabinet minister Lindiwe Sisulu, who responded that the ‘Government would like to caution that the spread of misinformation, fake pictures and videos on social media, as well as fake websites, may be fuelling tensions in our communities between South Africans and foreign nationals’.


Chenzi (2021) further argues that reprisal attacks against South African individuals and businesses were also born of misinformation during xenophobic episodes, citing Oneko (2019), who claims that:


“At the height of the most recent xenophobic violence in South Africa [September 2019], WhatsApp messages, announcing dates on which foreigners would be attacked and killed if they did not leave the country, circulated. Consequently, attacks on shops owned by people from other parts of Africa spread to different areas” (Oneko, 2019).

The paper details eleven cases where misinformation was shown to have incited xenophobic violence in South Africa.



Conclusion


Social media and the internet have transformed the way we interact, debate, and form opinions by providing unprecedented access to information. Initially, it was theorised that these platforms would enhance our ability to facilitate conversations and debates among diverse and widespread audiences. However, the democratic ideals of healthy debate on web-based social platforms are deteriorating (Anderson et al., 2014). This environment negatively impacts the democratic deliberative process by altering the way facts are perceived. A robust civic and democratic space relies on a well-informed public and a healthy ecosystem of competing ideas. When individuals are exposed exclusively to people or information that reinforces their pre-existing beliefs, democracy suffers.





 



References


  1. Schmidt, A.L., et al., Anatomy of news consumption on Facebook. Proceedings of the National Academy of Sciences, 2017. 114(12): p. 3035-3039.

  2. Garrett, R.K., Echo chambers online?: Politically motivated selective exposure among Internet news users. Journal of Computer-Mediated Communication, 2009. 14(2): p. 265-285.

  3. Barberá, P., et al., Tweeting From Left to Right: Is Online Political Communication More Than an Echo Chamber? Psychol Sci, 2015. 26(10): p. 1531-42.

  4. Bessi, A., et al., Science vs conspiracy: Collective narratives in the age of misinformation. PloS one, 2015. 10(2): p. e0118093.

  5. Kahan, D.M., Ideology, motivated reasoning, and cognitive reflection. Judgment and Decision Making, 2013. 8(4): p. 407-424.

  6. Conover, M., et al. Political polarization on twitter. in Proceedings of the international aaai conference on web and social media. 2011.

  7. Cinelli, M., et al., The COVID-19 social media infodemic. Scientific reports, 2020. 10(1): p. 1-10.

  8. Cinelli, M., et al., Echo chambers on social media: A comparative analysis. arXiv preprint arXiv:2004.09603, 2020.

  9. Bessi, A., et al., Trend of narratives in the age of misinformation. PloS one, 2015. 10(8): p. e0134641.

  10. Del Vicario, M., et al., The spreading of misinformation online. Proceedings of the national academy of Sciences, 2016. 113(3): p. 554-559.

  11. Del Vicario, M., et al., Echo chambers: Emotional contagion and group polarization on facebook. Scientific reports, 2016. 6(1): p. 37825.

  12. Garibay, I., A.V. Mantzaris, A. Rajabi, and C.E. Taylor, Polarization in social media assists influencers to become more influential: analysis and two inoculation strategies. Scientific Reports, 2019. 9(1): p. 18592.

  13. Jiang, J., X. Ren, and E. Ferrara, Social media polarization and echo chambers in the context of COVID-19: Case study. JMIRx med, 2021. 2(3): p. e29570.

  14. Chenzi, V., Fake news, social media and xenophobia in South Africa. African Identities, 2021. 19(4): p. 502-521.



