Why We Need Intersectional AI
AI systems poorly represent us but make decisions that deeply impact us.
They reinforce a neutral 'view from nowhere,' which is actually just the most powerful, narrow perspective. It leaves out the global majority. It leaves out most people.
Take, for example, language models like ChatGPT.
What do language models say about people who do not fit in?
How can language models speak so that I recognize myself?
Unfortunately, what little these systems do say about us is harmful. For example, the word "bisexual" is overwhelmingly associated with pornographic content. The word "trans" gets associated with hate speech and trauma. This happens because their text comes unedited from social media sites like Reddit and (due to the averaging approach of the AI system) gets reduced to a lowest common denominator, erasing all but the most conservative results. AI amplifies the most popular and least nuanced position, turning a collection of guesses into a rubber-stamped truth.
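The averaging dynamic described above can be sketched as a toy frequency count. This is a deliberate simplification with invented names and numbers, not how any production language model is actually built, but it shows how frequency-based aggregation keeps only the loudest framing:

```python
from collections import Counter

# Hypothetical corpus: imagine thousands of scraped sentences "voting"
# on how an identity term gets described online. All counts are made up.
scraped_descriptions = (
    ["crude stereotype"] * 700      # the loudest, most repeated framing
    + ["nuanced account"] * 200
    + ["self-description"] * 100    # how people describe themselves
)

def averaged_view(descriptions):
    """Keep only the single most frequent description: the
    'lowest common denominator' that erases everything else."""
    return Counter(descriptions).most_common(1)[0][0]

print(averaged_view(scraped_descriptions))  # -> crude stereotype
```

Every minority framing, 30% of this toy corpus, simply vanishes from the output; repeated at scale, the most popular guess gets rubber-stamped as truth.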
Think of your search engine suggesting spelling corrections and shopping results: it's convenient at times, but it directs you toward a standardized option and removes other possibilities. This becomes dangerous when applied to worldviews and to fluid concepts like race and gender. AI is a force multiplier, re-encoding its own assumptions as accuracy at scale. This may not matter as much if it is planning a recipe, but it matters very much if it is deciding who gets citizenship or access to lifesaving services.
Technical 'solutions' just create even more categories, which create more assumptions. Quick fixes try to eliminate bias by manually blocking the worst results, but they still fail to offer better information.
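A quick fix of the kind described here can be sketched as a blocklist filter (the names and data are hypothetical, for illustration only): it removes the worst outputs but puts nothing better in their place.

```python
# Hypothetical manual blocklist: the kind of quick fix described above.
BLOCKLIST = {"harmful association"}

def quick_fix(results):
    """Filter out blocked results; note that nothing fills the gap."""
    return [r for r in results if r not in BLOCKLIST]

# If the system only ever learned the harmful framing, blocking it
# leaves the model with nothing meaningful to say at all.
print(quick_fix(["harmful association", "harmful association"]))  # -> []
```

The filter suppresses a symptom without adding any better information, which is exactly the failure the text describes.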
It is essential to understand that these issues cannot be resolved with more data or corrections. Bias is foundational to how AI works and cannot be optimized out. The lived experiences of diverse genders and varied cultures are too dynamic to be captured and extracted meaningfully by machine learning.
Varied intersectional approaches do need to be adopted, but this means rethinking how these systems work entirely. Even an ideal model should not be used in cases where it can cause harm, or for policies that impact lives. It is dangerous to think we can create diverse AI without reimagining their structures and goals. That would be a troubling step backward.
What we can do instead:
Understand what AI cannot and should not do. Choose NOT to use AI when the stakes are too high.
Don't let responsibility get deflected onto artificial agents. Remember humans design and operate all AI systems.
Support alternative approaches, like small dataset curation by and for the communities they represent, and interdisciplinary innovation by the artists, activists, and researchers working outside of big tech.
We miss a world of valuable perspectives in the current AI race, which wedges people toward a dangerous middle. To be fit for everyone's digital future, we need AI to support more people's goals, even when they don't align with the view from nowhere.
AI is broken. What is Intersectionality and how can it help?
Want more details about Intersectionality? Read on!
Ask: "How is AI shifting power?"
AI researcher Pratyusha Ria Kalluri says this is the most important question to ask. Rather than asking whether a technology is biased, fair, or good, ask how it is shifting power.[^1] Tools always wield and shift power, so the question is how and for whose benefit. Kalluri is a founder of the Radical AI Network, working on "ai and art that [are] antioppressive and queerly beautiful."[^2]
We see these unequal power distributions increasing as supposed 'automation' hides the very real human labor behind it. Research by Thomas Smits and Melvin Wevers suggests that "the agency of datasets is produced by obscuring the power and subjective choices of its creators and the countless hours of highly disciplined labour of crowd workers."[^3] As Christina zur Nedden and Ariana Dongus report, refugees entering Jordan are the subjects of involuntary biometric testing that becomes new AI technologies: "supposedly artificial intelligence is in fact animated by global production networks of click workers."[^4]
Intersectionality helps make sense of power
Intersectionality, as recently clarified by Dr. Kimberlé Crenshaw, who coined the term in 1989:
"deals with the interactive effects of multiple forms of discrimination: racism, patriarchy, sexism, misogyny, homophobia. [...] Most importantly, Intersectionality is a frame to tell us what's wrong with the way certain discriminations are thought about. [...] Our understanding of racism [and] sexism is incomplete if it doesn't attend to the intersection of both."[^5]
"Intersectionality marks power, as well as exclusion. If we take it seriously it gives us a clue about the ways we need to rethink how our institutions function, what it values, what it should value, in order to make more equitable institutional practices."[^5]
– Dr. Kimberlé Crenshaw
What Intersectionality Is NOT
Intersectionality is often misunderstood. Critics point out how this has allowed for even more granular marketing to identity categories as they become visible to and targeted by capitalism (Chun 2018, 85). But in fact, intersectionality is about power. It examines and critiques systems of power and how those systems structure themselves to impact groups and individuals unequally.[^6]
"Intersectionality is [...] not in the body; it's in how the body is situated in the society in which we live. And that's the piece that people often get confused about."[^5]
"Intersectionality is contextually specific. I say it again. When we're talking about the low wages of working class women, that often includes white working class women as well. [...] Whiteness isn't protecting them from class and gender inequality in the way that whiteness sometimes provides insulation from other forms of inequality. That is the point of intersectionality. It's not a one-size-fits-all, it's not a prism that always predicts who's on top and who's on bottom."[^5]
What Intersectionality Offers to Technology
Catherine D'Ignazio and Lauren Klein (2019) agree that intersectionality not only describes different aspects of an individual's position, but also "intersecting forces of privilege and oppression at work in a given society. Oppression involves the systematic mistreatment of certain groups of people by other groups. It happens when power is not distributed equally." Intersectional feminism insists on not just meeting the needs of the most privileged (white) women when arguing for feminism, or of only Black men when combating anti-Blackness. Instead, it recognizes that those with overlapping oppressions, such as Black women, face unique additional oppression, and that by addressing their circumstances, life improves for everyone.
It also entails a commitment to "resistant forms of knowing," which suggests different lenses through which to approach AI and technology in general. Intersectional AI as sketched out here is indebted to Crenshaw's original definition, as well as to the concept's long lineages within Black and indigenous thought, for its critical lenses to analyze technology and for its creative approach to redesigning it.
From inclusion to decentering
Guillermo Gómez-Peña says the high-tech world does not question itself as central, nor where it draws its borders: "We are no longer trying to persuade anyone that we are worthy of inclusion (we are de facto insiders/outsiders at the same time, or temporary insiders perhaps, and we know it). […] What we wish is to remap the hegemonic cartography of cyberspace; to "politicize" the debate; to develop a multicentric theoretical understanding of the cultural, political and aesthetic possibilities of new technologies; to exchange a different sort of information (mytho-poetical, activist, performative, imagistic) […]."[^Gomez-Pena]
Importantly, Christina Dunbar-Hester (2020) argues that "diversity and inclusion" fixes completely miss the problem, not only because they ignore the workers of the Global South[^7] but also because they reverse the problem's cause and effect: "to frame social inequality as a question of diversity in technological production, and to expect to change wider inequities by adding 'diverse' individuals to technical cultures, is to misunderstand how the distribution of various social identities in a given sector are outgrowths of differential social power, not the other way around" (16).[^Dunbar]
By a similar logic, removing bias from algorithmic systems, even if that were possible (which, no), does not remove bias from culture, nor does it replace it with anything. Often, even well-meaning efforts to "remove" bias from systems fall back on quantitative, computational methods or resort to representation, which mirror the problems they claim to address, potentially even producing tools that can be misused in the future:
"Representation as a goal may also result in accepting (and reproducing) notions of fixity in terms of social identity. This should raise skepticism" (236).[^Dunbar]
These approaches do not examine the structure of the system itself, the logic upon which it is founded, both materially and intellectually. They do not ask who is creating, contributing to, or benefiting from its operation. "Diversity [and, I argue, bias examination] is necessary, but not sufficient; it represents a shortcut in what should be a deeper conversation about values and justice" (241).[^Dunbar]
White supremacy culture in AI
The zine "Characteristics of White Supremacy Culture" by Tema Okun[^8] details traits that permeate broader culture (and tech culture). They don't feel good, and we can avoid them in life and when designing and implementing intersectional technology:
perfectionism, sense of urgency, defensiveness, quantity over quality, worship of the written word, only one right way, paternalism, either/or thinking, power hoarding, fear of open conflict, individualism, progress is bigger/more, objectivity, right to comfort
Do you see these values in the digital objects and services you use now? How can you imagine them operating differently? What values would you replace them with?
The reach of AI requires Intersectionality at every level
Intersectional AI needs tools we can access, understand, and actually use. In "Anatomy of an AI System," Kate Crawford and Vladan Joler (2018) map out the vast tangible effects of convenient, immaterial-seeming AI, which requires "a vast planetary network, fueled by the extraction of non-renewable materials, labor, and data. […] it is hard to 'see' any of these processes individually, let alone collectively." And understanding isn't enough, they argue: "without forms of real choice, and corporate accountability, mere transparency won't shift the weight of the current power asymmetries."
"To the casual observer, it looks like it has never been easier to build AI [creating] a false idea of the 'democratization' of AI. While 'off the shelf' machine learning tools, like TensorFlow, are becoming more accessible from the point of view of setting up your own system, the underlying logics of those systems, and the datasets for training them are accessible to and controlled by very few entities. In the dynamic of dataset collection through platforms like Facebook, users are feeding and training the neural networks with behavioral data, voice, tagged pictures and videos or medical data. In an era of extractivism, the real value of that data is controlled and exploited by the very few at the top of the pyramid." (Crawford/Joler 2018)[^9]
Dongus, A. "The Living Pixel: An Alternative Feminist-Materialist Genealogy of the Emergence of Computer Vision." SFKP Journal. (In German.) https://sfkp.ch/artikel/die-lebendigen-pixel
zur Nedden, C., and Dongus, A. 2017. "Tested on Millions of People Involuntarily." Zeit Online. (In German.) https://www.zeit.de/digital/datenschutz/2017-12/biometrie-fluechtlinge-cpams-iris-erkennung-zwang
Smits, T., & Wevers, M. (2021). The agency of computer vision models as optical instruments. Visual Communication, 1470357221992097. https://doi.org/10.1177/1470357221992097
Okun, Tema. 1999. "Characteristics of White Supremacy Culture." https://www.whitesupremacyculture.info/
Lisa Nakamura, Sarah T. Roberts, and others have pointed to the many female and BIPOC tech workers who go unrecognized because their work is not the glamor work of tech, from the physically taxing work of chip manufacturing on New Mexico reservations to the emotionally taxing work of content moderation at sites just off big tech campuses. Christina Dunbar-Hester (2020) argues that even "hacking has never been centered exclusively around white men in the Global North. Furthermore, some of what is required here is to simply shift the frame of what counts as hacking: to redraw boundaries to place social and historical analysis and infrastructural care work within the purview of hacking. In combination, these analytical adjustments can illuminate the 'others' of hacking, who are already here" (242).
Crawford, Kate, and Vladan Joler. 2018. "Anatomy of an AI System." SHARE Lab, SHARE Foundation and AI Institute.; Stark, Luke, and Kate Crawford. 2015. "The Conservatism of Emoji: Work, Affect, and Communication." Social Media and Society.
D'Ignazio, Catherine, and Lauren Klein. 2019. Data Feminism.
Dunbar-Hester, Christina. 2020. Hacking Diversity: The Politics of Inclusion in Open Technology Cultures.
Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal Forum, 1989(1). https://chicagounbound.uchicago.edu/uclf/vol1989/iss1/8
Crenshaw, K. (2021, March 29). What Does Intersectionality Mean? 1A, NPR. https://www.npr.org/2021/03/29/982357959/what-does-intersectionality-mean
Chun, Wendy Hui Kyong. 2016. Updating to Remain the Same.; 2009. "Race and/as Technology." Camera Obscura 70, 24:1.; Apprich, Clemens, Wendy Hui Kyong Chun, Florian Cramer, & Hito Steyerl. 2018. Pattern Discrimination.
Daniels, Jessie. 2009. "Rethinking Cyberfeminism(s): Race, Gender, and Embodiment." Women's Studies Quarterly.
Gaboury, Jacob. "Becoming NULL: Queer Relations in the Excluded Middle." Women & Performance: A Journal of Feminist Theory.
Gómez-Peña, Guillermo. "The Virtual Barrio @ The Other Frontier."
http://riakalluri.com/
https://www.nature.com/articles/d41586-020-02003-2
Nakamura, Lisa. 2014. "Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture." American Quarterly, 66:4.