The ABCs of Political Polarization in the Digital Age: Algorithms, Bots & Content
Written by Nafisa Arya Alvita
Have you noticed how ubiquitous bots have become in your social media feeds? While endlessly scrolling through Instagram or X, have you seen post after post filled with replies from accounts whose usernames are strings of random letters and numbers, posting messages that sound deceptively human yet glaringly robotic? What about the flood of political content from public figures, tailored to influence their audiences? While seemingly innocuous, these bot accounts and this political content often carry darker implications. In a fast-evolving, interconnected world, technology and media have become pivotal in shaping public opinion and even swaying important political outcomes, such as gubernatorial and presidential elections, resulting in political polarization.
The Presence of Social Media?
As of January 2024, there were an estimated 5.04 billion social media users, or 62.3 percent of the world's population (Statista Search Department, 2024). Herman & Chomsky (1988) explain how mass media serve as an instrument for preserving hegemonic interests, shaping the consensus of the masses and aligning them with the status quo. This notion, known as 'manufacturing consent', maps neatly onto the context in which social media currently operates. So what are the implications when this consent is manufactured through bots and content controlled by actors with morally gray motives, propelled by a dubious algorithm?
Case Study: The 2016 US Presidential Election
One of the most well-researched cases of this phenomenon is the United States' 2016 presidential election, which resulted in Donald Trump's win. According to Howard (2018), both electoral campaigns were rife with computational propaganda, in which networks of bots act in concert to promote a candidate or message, to muddy political debate, or to disrupt support for an opponent. These political bots, fully or semi-automated accounts that produce content and engage with humans on political issues, took advantage of social media's weaknesses to polarize, amplify, or even silence participants in political conversations on X (then Twitter). This is sharply in line with Herman & Chomsky's 'propaganda model': for the government and dominant private interests to get their messages across to the public, they use their money and power to filter news and information, shaping the narrative and marginalizing dissent.
Too Close to Home?
This phenomenon is not limited to the United States; in other countries, including Indonesia, social media and bots have also been used to sway public opinion. The term 'political buzzers' was coined locally to describe accounts, sometimes run by real humans and sometimes by bots, that spread political propaganda to influence the masses. Under President Jokowi's regime, government, political, and even business actors often utilize these buzzers to delegitimize critics and to spread disinformation, for example by framing certain narratives about politicians and policies (Ufen, 2024).
Beyond that, this political year has seen an increased reliance not only on bots but also on public figures enlisted into political campaigns. Even semi-automated menfess accounts, which automatically publish community-submitted posts but are run and moderated by administrators, have been offered financial incentives to influence political opinion. Such was the case for @jawafess on X, which announced that an anonymous party had offered it financial incentives to influence public opinion regarding the reversal of the Constitutional Court's ruling.
Entrenching Political Polarization
As a result of biased algorithms, social media users who are first exposed to a certain political sentiment and continuously engage with it become ever more deeply entrenched in that very sentiment. This is commonly known as an echo chamber. Barberá (2020) notes that these have become breeding grounds for extremist beliefs. Political bots that flood hashtags with fake news are key drivers of disagreement and are thus associated with higher polarization (Azzimonti & Fernandes, 2022). Political polarization has negative implications for democracy, becoming a launching pad for sowing further seeds of discontent. It can tear people away from recognizing their common goals as citizens, making it easier for political actors to pursue the hegemonic interests described above.
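To make the echo-chamber mechanism concrete, consider the following toy sketch in Python of an engagement-biased feed. It is purely illustrative: the parameters, the update rule, and the feed logic are assumptions made for this sketch, not taken from Barberá (2020) or Azzimonti & Fernandes (2022).

```python
import random

# Toy model of an engagement-biased feed (all numbers are illustrative
# assumptions): the user starts with a mild leaning, the feed preferentially
# serves posts that match it, and each agreeable post nudges the leaning further.
random.seed(42)

leaning = 0.1          # user's political leaning, from -1 to +1
BIAS_STRENGTH = 0.8    # probability the feed serves content matching the leaning
NUDGE = 0.05           # how much each agreeable post reinforces the leaning

for _ in range(100):
    # The feed picks a post: usually one that matches the user's current side.
    if random.random() < BIAS_STRENGTH:
        post = 1 if leaning >= 0 else -1
    else:
        post = random.choice([-1, 1])

    # Engaging with agreeable content entrenches the existing sentiment.
    if post * leaning > 0:
        leaning = max(-1.0, min(1.0, leaning + NUDGE * post))

print(f"leaning after 100 posts: {leaning:+.2f}")
```

Even in this crude model, a small initial leaning is reinforced post after post until it sits at the extreme, which is the entrenchment dynamic the studies above describe at far greater scale.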
What to do?
Howard (2018) noted that protecting democracy from social media manipulation will require some form of public policy oversight. But what if the very ones manipulating social media are the policymakers and government officials themselves? This poses a daunting challenge for us all and requires ever greater vigilance when using social media. We need to be mindful of our algorithms, able to tell bot accounts from human ones, and critical of the political content we consume. The companies behind social media platforms also need to be wary of automated bot accounts being used to push political agendas, ensuring their platforms remain a safe space for free speech and healthy political debate, free of censorship and bot-driven disruption.
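For intuition on what telling bot accounts from human ones could look like in practice, here is a rough heuristic sketch in Python. The Account record and every threshold in it are made-up assumptions for illustration; real bot-detection systems used by researchers and platforms are far more sophisticated.

```python
import re
from dataclasses import dataclass

@dataclass
class Account:
    """Hypothetical, simplified record of a social media account."""
    username: str
    posts_per_day: float
    account_age_days: int

def looks_bot_like(account: Account) -> bool:
    """Flag accounts that trip at least two rough, made-up heuristics:
    a long numeric suffix in the username, an inhuman posting rate,
    or a very young account."""
    random_suffix = re.search(r"\d{6,}$", account.username) is not None
    hyperactive = account.posts_per_day > 100
    brand_new = account.account_age_days < 7
    return sum([random_suffix, hyperactive, brand_new]) >= 2

# Example: a days-old account with a numeric username, posting constantly.
suspect = Account(username="user83920174", posts_per_day=250, account_age_days=3)
print(looks_bot_like(suspect))  # True
```

Simple heuristics like these miss sophisticated bots and misflag real people, which is precisely why platform-level and policy-level oversight still matters.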
References
@jawafess. (2024). [Post]. X (formerly Twitter). https://x.com/jawafess/status/1826228022107894251
Aldayel, A., & Magdy, W. (2022). Characterizing the role of bots in polarized stance on social media. Social Network Analysis and Mining, 12(1). https://doi.org/10.1007/s13278-022-00858-z
Azzimonti, M., & Fernandes, M. (2022). Social media networks, fake news, and polarization. European Journal of Political Economy, 76, 102256. https://doi.org/10.1016/j.ejpoleco.2022.102256
Barberá, P. (2020). Social Media, Echo Chambers, and Political Polarization. In J. A. Tucker & N. Persily (Eds.), Social Media and Democracy: The State of the Field, Prospects for Reform (pp. 34–55). Cambridge University Press.
Guess, A., & Lyons, B. (2020). Misinformation, disinformation, and online propaganda. Cambridge University Press. https://doi.org/10.1017/9781108890960
Herman, E. S., & Chomsky, N. (1988). Manufacturing Consent: the Political Economy of the Mass Media. Pantheon Books.
Howard, P. N. (2018, October 18). How Political Campaigns Weaponize Social Media Bots. IEEE Spectrum. https://spectrum.ieee.org/how-political-campaigns-weaponize-social-media-bots
Howard, P. N., Woolley, S., & Calo, R. (2018). Algorithms, bots, and political communication in the US 2016 election: The challenge of automated political communication for election law and administration. Journal of Information Technology & Politics, 15(2), 81–93. https://doi.org/10.1080/19331681.2018.1448735
Statista Search Department. (2024). Internet and social media users in the world 2024. Statista. https://www.statista.com/statistics/617136/digital-population-worldwide/
Stella, M., Ferrara, E., & De Domenico, M. (2018). Bots increase exposure to negative and inflammatory content in online social systems. Proceedings of the National Academy of Sciences, 115(49), 12435–12440. https://doi.org/10.1073/pnas.1803470115
Ufen, A. (2024). The Rise of Digital Repression in Indonesia under Joko Widodo. GIGA Focus Asia, 1(1). https://doi.org/10.57671/gfas-24012
Woolley, S. C. (2020). Bots and Computational Propaganda: Automation for Communication and Control. In J. A. Tucker & N. Persily (Eds.), Social Media and Democracy: The State of the Field, Prospects for Reform (pp. 34–55). Cambridge University Press.