In 1969, Johan Galtung coined the phrase "structural violence" to refer to the ways social structures and institutions harm people by preventing them from meeting their fundamental needs.1 The forces that work together to inflict structural violence (things like racism, caste, colonialism, apartheid, transphobia, etc.) are often systemic, invisible, and intersectional. But crucially, they become embodied as individual experiences.
Along similar lines, it seems we're overdue for a term that allows us to easily (if imperfectly) articulate some realities of the moment we find ourselves in today. Specifically, we need a phrase that addresses newer, often digital and data-driven forms of inequity. I want to posit the phrase algorithmic violence as a first step toward articulating these realities.2 Algorithmic violence refers to the violence that an algorithm or automated decision-making system inflicts by preventing people from meeting their basic needs. It results from and is amplified by exploitative social, political, and economic systems, but can also be intimately connected to spatially and physically borne effects.
In my view, algorithmic violence sums up what we have experienced (particularly in the last five to ten years) as huge datasets have become available, computational power has advanced, fields like artificial intelligence and machine learning have leapt forward, and all of these have been incorporated into and leveraged by a hierarchical and unequal society.
Like other forms of violence, algorithmic violence stretches to encompass everything from micro occurrences to life-altering realities. It's that unsettling sensation you get when you look at a shirt online and then proceed to see that shirt advertised at you on every single website that you visit for the rest of the day. It's why the public reacts so strongly when companies like Instagram decide to spontaneously change the algorithms behind their content. It's at the core of the frustration that Uber/Lyft/Juno drivers feel when their apps tell them to make seemingly nonsensical pickups or to chase Surge/Prime Time deals that ultimately leave them receiving lower wages.
We should also group into algorithmic violence some of the failings of Facebook, like when the company dictated that users must sign up for accounts with their real names but deactivated accounts of people whose names weren't deemed legitimate.3 These users were removed from a site that for many of them represented a space for communication and connection, all due to the narrow classifications imposed by the company's algorithms. We could include the limitations imposed on job seekers whose resumes are rejected by automated Applicant Tracking Systems because they're missing the "right" keywords.4 Just as relevant are risk-assessment tools like COMPAS, which are used to decide which defendants should be sent to prison, and the algorithms behind predictive policing, which have been criticized for using biased data to determine the communities police should patrol.
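To make the mechanism concrete, here is a minimal sketch (in Python) of the kind of keyword screen described above. Everything in it (the keyword list, the threshold, the matching rule) is an illustrative assumption rather than any real system's logic; the point is how a simple, seemingly neutral rule quietly filters people out.

```python
import re

# Hypothetical sketch of a naive keyword screen; the keywords, threshold,
# and matching rule are illustrative assumptions, not any real ATS product.

REQUIRED_KEYWORDS = {"python", "agile", "stakeholder"}  # assumed employer inputs
THRESHOLD = 2  # assumed minimum number of keyword hits to advance

def screen_resume(resume_text: str) -> bool:
    """Return True if the resume clears the keyword screen."""
    words = set(re.findall(r"[a-z]+", resume_text.lower()))
    hits = REQUIRED_KEYWORDS & words
    return len(hits) >= THRESHOLD

# A qualified candidate who simply phrases their experience differently
# is filtered out before any human ever reads the resume:
resume = "Led cross-functional teams and built data pipelines in Python."
print(screen_resume(resume))  # False: only 'python' matches, and 1 < 2
```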
All of these are forms of algorithmic violence. They not only affect the ways and degrees to which people are able to live their everyday lives, but in the words of Mary K. Anglin, they "impose categories of difference that legitimate hierarchy and inequality."5 Like structural violence, they are procedural in nature, and therefore difficult to see and trace. But more chillingly, they are abstracted from the humans and needs that created them (and there are always humans and needs behind the algorithms that we encounter every day). Thus, they occupy their own sort of authority, one that seems rooted in rationality, facts, and data, even as they obscure all of these things.6
Finally, algorithmic violence does not operate in isolation. Its predecessors lie in the opaque black boxes of credit scoring systems and the schematization of bureaucratic knowledge.7 It's tied to the decades of imperialism—unfolding digitally as well as politically and militarily—that have undergirded our global economic systems. Its emergence is linked to a moment in time when corporate business models and state defense tactics meet at the routine extraction of data from consumers.8
As algorithms are increasingly used for civic, social, and cultural decision-making, it becomes that much more important that we name the reality we are seeing.9 Not because it is exceptional, but because it is ubiquitous. Not because it creates new inequities, but because it has the power to cloak and amplify existing ones. Not because it is on the horizon, but because it is already here.
A final note: One of the reasons I'm publishing this on GitHub is that this is a work in progress, with a more thorough follow-up piece to come. In the meantime, if you have feedback or opinions, catch me on Twitter (@thistimeitsmimi).
Author: Mimi Onuoha | Published: 2/7/2018 | Last updated: 2/8/2018
1: The phrase is commonly attributed to Johan Galtung, but has been expanded upon by a number of researchers. The idea of the intersectionality of these different forms of violence comes, of course, from Kimberlé Crenshaw.
2: Note here that I use violence in the prohibitive sense of the word, e.g. as something that (negatively) shapes the experiences and opportunities available to people. This is different from the definition of physical brute force that many think of when they hear the word. While I am well aware of the limitations of the comparison, I refer to definitions of structural violence such as the one from the aptly-named structuralviolence.org: "…the point of the term 'structural violence' is to act as an umbrella to encapsulate many different forms of various social and institutional failings that have real, if not always immediately appreciable consequences in peoples' lives."
3: This has been an ongoing situation that has flared up in numerous ways over the years. Some of the groups affected: indigenous people, trans people, victims of domestic violence.
4: Applicant Tracking Systems are notorious for requiring specific keywords that applicants' resumes must match. See articles like this one, which place the onus on job seekers to figure out the inputs for these HR systems.
5: Mary K. Anglin, "Feminist Perspectives on Structural Violence" (paywall)
6: See Nick Carr's work on automation bias, wherein humans are more likely to trust information coming from a machine because of its seemingly neutral positioning.
7: See David Graeber's work on bureaucratic documents and Lisa Jean Moore and Paisley Currah's work on the ways in which birth certificates attempt to fix in place mutable concepts.
8: See Shoshana Zuboff's concept of "surveillance capitalism". I'm writing this intentionally from a US-centric perspective, primarily because so many of the companies whose work intersects with this are based in the States, and I think that this is a crucial dimension of the issue that is necessary to address (there's much that could be said on the specificity and role the US plays in larger labor practices adjacent to this discussion). However, there are a number of global examples that could be pulled into this. See, for instance, writing about India's Aadhaar numbers, credit/social scores in China, Adrian Chen's work on moderators in the Philippines, etc.
9: In December 2017, the New York City Council passed a bill attempting to provide accountability and transparency for algorithms, the first of its kind.