Can Algorithms Be Violent?
There is a common misconception that we live in two realities, a physical one and a virtual one. The ‘offline’ and the ‘online’ are often treated as diametrically opposed, as if the world were made up of two dimensions: a ‘real’ world and a ‘virtual’ online one. I think it’s safe to say that this “digital dualism” no longer applies to the world we live in (Jurgenson 2011), especially for those of us who have grown up with the internet, whose “real life” relationships have become almost as significant in virtual realms, or who have developed full-blown friendships and relationships solely online. Since the Covid-19 pandemic it has become clearer than ever that the digital world and the tangible physical environment we live in are interwoven. In fact, the digital world is made up of very physical, tangible infrastructures. We just seem to forget this, and rarely question where the internet, and all this data everyone seems so interested in, fits into our physical reality. Algorithms are a particularly tricky dimension of the digital world: you can’t see them or touch them; they appear invisible and intangible.
What even is an algorithm?
The simple definition: they are instructions for solving a problem or completing a task (Illing 2018).
The more complex definition: they are made up of “flows of purposeful, repetitive, programmable sequences of exchange and interaction between physically disjointed positions held by social actors in organizations and institutions of society” (Castells 2017).
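To make the simple definition concrete, here is a minimal sketch in Python (the language and the example are my own, purely for illustration) of one classic algorithm, binary search. It is nothing more than a fixed, repeatable recipe of steps that turns an input into an output:

```python
# A classic algorithm: binary search.
# A fixed sequence of steps that solves one well-defined task:
# finding where (if anywhere) a value sits in a sorted list.
def binary_search(items, target):
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2      # look at the middle element
        if items[mid] == target:
            return mid               # found it: return its position
        elif items[mid] < target:
            low = mid + 1            # discard the lower half
        else:
            high = mid - 1           # discard the upper half
    return -1                        # the value is not in the list

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # -> 3
```

Everything from a Google search to a Netflix recommendation is, at bottom, a vastly more elaborate version of this: a precise sequence of instructions executed on data.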
Algorithms are the backbone of our information society and they are much more complex and present in our lives than we realize.
First of all, they are everywhere. They are used when you type a Google search, go through your emails, swipe on dating apps, look up directions on a GPS, or scroll on social media. They are embedded in every aspect of society, yet the people who actually program these algorithms represent a very small percentage of the population that uses them. They are based on mathematical formulae meant to make our world more efficient and more connected, increasing the flow of information that passes through it (Castells 2017). Or so it seems.
How can we scope the social and cultural implications of living in a “code-dependent” world? These endless lines of code are actually poorly understood: they are usually viewed as rational, objective processes that help our society become more efficient and productive.
Algorithms have made our world more connected than ever, and they can simply do things we humans cannot. But that does not mean they can do no harm. We blindly rely on these ‘black boxes’, and we need to start properly questioning the power they hold (Illing 2018). Mimi Onuoha coined the term “algorithmic violence” in 2018, defining it as “the violence that an algorithm or automated decision-making system inflicts by preventing people from meeting their basic needs”.
These “black boxes” are created and operate behind the scenes, meaning we are also blind to their negative social, political, and economic impacts. They do not just serve humans on an individual scale; they are making impactful decisions on a societal scale, from healthcare to education, politics, and how our cities are designed. Yet we don’t even know how they work or who makes them.
Designing algorithms involves a lot of trial and error, but once they work they save humans enormous time and decision-making effort. NEWS FLASH: these black boxes have made our lives extremely easy in many ways. Perhaps too easy?
Our cars know when they are getting too close to an object, Netflix suggests what we should watch, our phones remind us whose birthday it is, and our social media feeds have created perfect little curated worlds for each of us to live in. The mind-fuck of it all is that they have managed to make us believe we are in charge of these decisions and of the content we consume. Social media recommender systems, for instance, are contributing to a decline in human judgment: they have “the capability to shape individuals’ decisions without them even knowing it, giving those who have control of the algorithms an unfair position of power” (Rainie & Anderson 2017).
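To give a sense of how this curation works mechanically, here is a deliberately toy sketch of a feed ranker (the post data, interaction history, and scoring rule are all invented for illustration; real platforms use far more complex machine-learned models). Because posts resembling what you already clicked on are scored higher, the feed quietly narrows toward your existing tastes:

```python
# A toy feed ranker (illustrative only): posts on topics the user has
# engaged with before are boosted, so the feed reinforces existing tastes.
posts = [
    {"id": 1, "topic": "politics", "engagement": 0.9},
    {"id": 2, "topic": "cooking",  "engagement": 0.4},
    {"id": 3, "topic": "politics", "engagement": 0.7},
]

# Invented interaction history: past clicks per topic.
user_history = {"politics": 5, "cooking": 1}

def score(post):
    # Weight a post's raw engagement by the user's affinity for its topic.
    affinity = user_history.get(post["topic"], 0)
    return post["engagement"] * (1 + affinity)

feed = sorted(posts, key=score, reverse=True)
print([p["id"] for p in feed])  # -> [1, 3, 2]: politics floats to the top
```

Run a loop like this millions of times a day and the curated world builds itself: each click feeds back into the history, which feeds back into the ranking.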
These digital infrastructures are fueled by data collected from millions of people around the world, which can tell us where people are most in need. In theory, then, algorithms should help us make the world a better place. But the question is not whether they are good or bad; it is who programs them, and how that defines the information we have access to. The idea that algorithms are rational decision-makers is merely an assumption.
These “black boxes” hide a dark side. Political polarization and extremism have been on the rise on all social media platforms. Algorithms tend to show you what matches your perspectives and outlooks on events occurring in the world. The scholar Adam Gismondi has stressed how important it is for humans to encounter contrasting viewpoints that challenge our critical thinking. If algorithms are suppressing contradictory information, this may have very significant negative impacts on our society and on our capacity for empathy. Algorithms, although complex, are not nuanced: they do not understand emotions and contexts the way humans do.
In addition, they are inherently biased because of the people who program them, who happen to be primarily White and Asian men (Rainie & Anderson 2017). The data they select tends to benefit the small percentage of the population they represent, perpetuating society’s existing power imbalances around race and class.
But when computers are causing this, whom do you blame? Algorithms are increasingly self-taught, further blurring who programs them and who is to blame when things do not go as planned. In the spring of 2016, Microsoft engineers unveiled their new creation, a bot named “Tay” that was meant to interact with people on Twitter. In the span of 24 hours, Tay shifted from spreading positive comments about the human race to agreeing with Holocaust deniers and spouting racist, Hitler-praising tweets. Tay was simply basing its tweets on algorithmic conversational skills, responding to what was being tweeted at it (Rainie & Anderson 2017). This violence does not remain online; it seeps into the real world, sometimes causing irreparable damage.
This violence is usually exercised through the spread of hate speech and misinformation on social networks. A recent example that illustrates this on a very significant scale is Facebook’s contribution to the violence against the Rohingya Muslims in Myanmar over the past year (Global Witness 2021). In February 2021 the Tatmadaw, Myanmar’s military, carried out a coup d’état, deposing the democratically elected leaders.
It is important to point out that Facebook matters here. Myanmar is a striking example: half the population is actively on the platform, and even the phones you buy come with it pre-installed. Many political conversations and discussions about the country’s current events therefore take place on Facebook.
Shortly after the coup, Facebook stated it was doing everything to “prevent online harm being linked to offline harm”. A little over a month later came Armed Forces Day, when the military reportedly killed more than a hundred civilian protesters in one of the deadliest days since the coup. Throughout February and March, Facebook in Myanmar was filled with hate speech and posts inciting violence, which shows how even those in charge of these programs are losing control over their algorithms. Research led by Global Witness revealed numerous instances of extreme, hateful, and violent posts violating Facebook’s guidelines; the Myanmar armed forces have a history of violating these guidelines by inciting violence. Indeed, Facebook’s algorithmic recommender systems were promoting military content at a time when people were being killed. Global Witness found a viral post displaying a woman on a ‘WANTED’ poster for participating in protests against the military. On March 13th a video was shared on a pro-military page documenting an airstrike that killed civilians, in which someone can be heard saying “you are getting what you deserve” amid laughter, glorifying the suffering of others. In April 2021, Facebook announced “it would remove praise, support and advocacy of a) violence by security forces and protestors and b) the arrest of civilians by the police or military on the platform” (Global Witness 2021). Although many pro-military pages were removed, the problem remained in how the algorithms are programmed: they continued to suggest pages and posts promoting the military and causing harm to the Rohingya people.
Anybody who has grown up with the internet as it has been since Facebook and YouTube were created has been exposed to traumatizing content. Popular and engaging content is often the most shocking and disturbing; we can’t help it, it’s what our brains are drawn to.
Social networks like Facebook and TikTok are companies that make money by keeping users “hooked” with their algorithmic “traps” (Seaver 2018). There are concrete consequences stemming from shares, likes, and reposts, which form “hard-to-escape cultural worlds” online (Seaver 2018). The Facebook whistleblower Frances Haugen revealed that “87% of the spending on combating misinformation is spent on English content”, whereas English speakers represent only 9% of Facebook users. This is what I mean when I point out that we live in little curated worlds on social media. We realize how big and vast the internet is, but we have no real grasp of how many of us are on it, from all over the world, every single day, liking, sharing, and posting.
Once again, this reveals that the issue with algorithmic violence lies in who programs algorithms and for whom they are intended, as algorithmic violence clearly targets specific disadvantaged populations (Guardian 2021). The same seems to be true in the planning of our own physical environment through “smart cities”.
As much as algorithms are improving our daily lives in ways we don’t even notice anymore, they undeniably can contribute to violence, whether online or offline. They have become so versatile in what they can achieve and what they are used for that it is impossible to box them into just one definition. This makes it extremely difficult to question these “black boxes” for those who have nothing to do with digital technologies, or who have become so accustomed to them. We must be curious about how these digital infrastructures are impacting our lives; there is an urgent need for algorithmic transparency and literacy (Rainie & Anderson 2017).
Not to mention that the power imbalances this causes clearly reinforce the need for more legislative regulation of these digital infrastructures. Whose power is it, anyway? It seems that “digital scientism” is placing more and more faith in increasingly self-sufficient AI decision-making, making it even harder to reach algorithmic transparency and accountability for the violence it may cause (Rainie & Anderson 2017). The researcher Andrew Tutt calls for an “FDA for Algorithms,” noting that “the rise of increasingly complex algorithms calls for critical thought about how to best prevent, deter and compensate for the harms that they cause” (Rainie & Anderson 2017). However, if the problem with algorithmic violence also stems from the inherent biases of the people creating algorithms, it makes us wonder whether digital democracy is a utopia or something one day achievable. Algorithmic violence is not temporary; it is one of the great challenges of the “code-dependent” age. As infrastructures of code underpin our digitalized contemporary world, it is crucial to break down these invisible infrastructures to prevent algorithmic violence.
-Gabrielle Grangie
Bibliography
Alang, Navneet. “Life in the Age of Algorithms.” The New Republic. Accessed January 15, 2022. https://newrepublic.com/article/133472/life-age-algorithms.
“Algorithm of Harm: Facebook Amplified Myanmar Military Propaganda following Coup.” Global Witness. Accessed January 15, 2022. https://www.globalwitness.org/en/campaigns/digital-threats/algorithm-harm-facebook-amplified-myanmar-military-propaganda-following-coup/.
Boellstorff, Tom. Coming of Age in Second Life: An Anthropologist Explores the Virtually Human. Princeton, NJ: Princeton University Press, 2008.
Graham, Stephen. “The Software-Sorted City: Rethinking the ‘Digital Divide’.” In The Cybercities Reader, edited by Stephen Graham, 324-32. London: Routledge, 2004.
Illing, Sean. “How Algorithms Are Controlling Your Life.” Vox, October 1, 2018. Accessed January 15, 2022. https://www.vox.com/technology/2018/10/1/17882340/how-algorithms-control-your-life-hannah-fry.
Jiménez, Alberto Corsín. “The Right to Infrastructure: A Prototype for Open Source Urbanism.” Environment and Planning D: Society and Space 32, no. 2 (April 2014): 342-62. https://doi.org/10.1068/d13077p.
Kirtiklis, Kestas. “Manuel Castells’ Theory of Information Society as Media Theory.” Lingua Posnaniensis 59 (2017): 65-77. doi:10.1515/linpo-2017-0006.
Onuoha, Mimi. “On Algorithmic Violence: Attempts at Fleshing out the Concept of Algorithmic Violence.” GitHub. Accessed January 15, 2022. https://github.com/MimiOnuoha/On-Algorithmic-Violence.
Rainie, Lee, and Janna Anderson. “Code-Dependent: Pros and Cons of the Algorithm Age.” Pew Research Center: Internet, Science & Tech, February 8, 2017. Accessed January 15, 2022. https://www.pewresearch.org/internet/2017/02/08/code-dependent-pros-and-cons-of-the-algorithm-age/.
“Rohingya Sue Facebook for £150bn over Myanmar Genocide.” The Guardian, December 6, 2021. Accessed January 15, 2022. https://www.theguardian.com/technology/2021/dec/06/rohingya-sue-facebook-myanmar-genocide-us-uk-legal-action-social-media-violence.
Safransky, Sara. “Geographies of Algorithmic Violence: Redlining the Smart City.” International Journal of Urban and Regional Research 44, no. 2 (2019): 200-18. doi:10.1111/1468-2427.12833.
Seaver, Nick. “Captivating Algorithms: Recommender Systems as Traps.” Journal of Material Culture 24, no. 4 (2018): 421-36. doi:10.1177/1359183518820366.
Star, Susan Leigh. “The Ethnography of Infrastructure.” American Behavioral Scientist 43, no. 3 (November 1999): 377-91. https://doi.org/10.1177/00027649921955326.