Queering Digital Space: How queer bodies disturb the gender binary in facial recognition

by Qingyi Ren (they/them)

PhD Student at the University of Arts Linz & the Basel Academy of Art and Design FHNW

qingyi.ren (@fhnw.ch)

Anthways, 2023 ©Qingyi Ren

DOI:10.5281/zenodo.8352731

Abstract

This paper discusses the intersection of gender and artificial intelligence (AI) and how the binary concept of gender is deeply embedded in digital space. I examine how facial recognition systems use binary gender as one of their facial analysis tags and how gender is assigned externally in datasets and algorithmic models, masking heteronormativity with the illusion of “technical neutrality.” The paper also discusses the role of queer bodies in dismantling societal order and challenging the binary concept of gender. It concludes by asking how machines perceive our identities and what insights queer bodies can offer to the narrative of AI and gender.

Keywords: face recognition, data classification, gender, artistic practice-based research

Contexts

Digital technology has provided a brand-new space for people to explore their identities. In this space, bodies seem to be free, with digital skins, genders, and ages, allowing them to perform and transform themselves in infinite ways (Russell, 2020). Users can easily access portals, pop up as a young man from Southern California on a weekend morning, or effortlessly transform into a subculture lesbian on East London’s graffitied streets. We celebrate the extension of the body that digital technology brings (McLuhan, 2001) while being gazed upon, monitored, and subjected to violence (UNFPA, n.d.). As the post-pandemic concepts of digital twins, the metaverse, and hybrid collaboration, together with the normalisation of remote working, continue to be highlighted, technology is digitising not only the items in the physical world but also everyone who participates in it, whether as a data file or as a smooth-surfaced 3D character. We stare at our colleagues’ every move on the screen while collaborating remotely through software, observing each other and worrying about the laptop camera’s “gaze,” covering it with a bright sticker. Does technology bring us liberation of the body, or are we dancing in the matrix, inside the illusion of a painted prison?

“See” my Gender

The domains of image detection and analysis are arenas where machines possess the capability to ‘perceive’ human presence (Scheuerman et al., 2019). The outcomes of facial recognition systems effectively serve as visual representations of how machines ‘observe’ us. As someone whose identity does not fit within the binary female gender category, I have navigated a landscape of queries and perplexities in physical spaces. The challenges surrounding transgender individuals and communal facilities are not new: they include access to spaces affirming one’s gender identity, such as public restrooms (National Center for Transgender Equality, 2016; Bagagli et al., 2021), as well as prejudice and harassment experienced in public domains (NRCDV, n.d.; Trans PULSE, 2014).

However, when I first ran facial recognition on my own face in a workshop, most of the recognition outcomes were markedly skewed towards male identities, accompanied by confidence probabilities exceeding 90%. I thought I was free in this digital space, but unsettling results manifested. While everyone else discussed their recognition results and made different faces as they waited for the digital changes, I felt the fear and frustration of the violence and prejudice I had encountered in real life now being extended into digital space. This is how I began working as a queer researcher with AI.

How is binary gender deeply embedded in digital space? Scheuerman et al. (2019) examined commercial facial recognition systems from ten well-known companies, including Amazon, Microsoft, and IBM, and found that seven of the systems used binary gender (male/female) as one of their facial analysis tags. Facial databases are image datasets used to train and test facial processing algorithms, with each data subset carrying a set of associated annotations conceptualized as target variables. Take the well-known facial database LFW (Labeled Faces in the Wild) as an example: this dataset contains over 13,000 face images collected from the web, and its annotations include race, hair colour, age range, gender, and a series of similar attributes (Computer Vision Lab, 2018). These attribute values are used as given inputs for computer vision algorithms and serve as the baseline for algorithm model predictions, meaning that the dataset’s classification instantiates the computer vision task. In LFW, gender is one of the attributes, annotated with the labels “male” and “female” inferred from the names of the people pictured. LFW’s web page states, “Eventually, we manually reviewed each category of male and female images three times to completely eliminate any incorrect labels” (Computer Vision Lab, 2018). In face recognition technology, gender is not treated as a personal statement but is externally assigned. Technical practitioners manage and enforce gender binary divisions in the absence, or denial, of gender-related knowledge.
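
To make concrete how a dataset’s annotation scheme fixes what a model can ever say, the short sketch below trains a toy classifier on binary gender labels. It is illustrative only: the feature vectors are random stand-ins for face embeddings, and the column names merely echo, rather than reproduce, the LFW attribute files.

```python
# Illustrative sketch: a dataset's binary gender annotation becomes the only
# vocabulary the trained model can answer with. Random vectors stand in for
# face embeddings; column names are placeholders, not the real LFW files.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
annotations = pd.DataFrame({
    "image": [f"img_{i:04d}.jpg" for i in range(n)],
    "male_score": rng.normal(size=n),          # continuous annotator score
})
# The annotation is collapsed into exactly two classes before training:
annotations["gender"] = np.where(annotations["male_score"] > 0, "male", "female")

X = rng.normal(size=(n, 128))                   # stand-in for face embeddings
y = annotations["gender"].to_numpy()

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Whatever face it is later shown, the model can only answer from the label
# set baked into the annotations:
print(clf.classes_)                             # ['female' 'male']
print(clf.predict(X[:3]))
```

However the input is varied, the model’s class set never grows beyond the two labels written into the annotation file; the classification task is instantiated by the dataset, not by the faces themselves.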

When an algorithmic model is used to predict gender, its results are confined to the same classes; database classification therefore plays a crucial role in computer predictions. One of the main drawbacks of machine learning is that its functionality relies heavily on data, and as data is collected, the dominant power structure is often inscribed in it. Winner (1980) argues that certain technologies, like other artefacts, have political properties in themselves, asking whether technologies “can be used to enhance the power, authority and privilege of some over others.” Practitioners regard gender as the “gold standard” for designing and evaluating machine learning algorithms (Lockhart, 2022); they turn their lack of knowledge about gender into a lazy default of binary gender and carry it into data collection and algorithm design, masking heteronormativity behind the illusion of “technical neutrality” and the binary recognition results returned by the computer.

Furthermore, the application of machine learning techniques has deepened social biases, amplifying discriminatory practices and curbing diversity across numerous dimensions. Kleinberg et al. (2018) delve into the intricate ways algorithmic decision-making systems perpetuate bias and underscore the urgency of regulatory interventions. Noble (2018) reveals that search engine results and algorithms are rife with negative biases against women of colour. Buolamwini’s Gender Shades project found that commercial AI systems from leading technology companies misclassified women and darker-skinned people (MIT Media Lab, 2018). Keyes (2018) exposes the implicit or explicit use of ‘gender’ as a binary, immutable, and physiologically distinguishable concept, and discusses how automatic gender recognition neglects and excludes trans people. In using these technologies, a third-party machine intervenes to produce judgments that deepen the political violence of people’s existence under the guise of algorithmic neutrality (Wojcik et al., 2021).

A series of studies consistently underscores the severe repercussions of these biases, particularly for ethnic minorities (Chiusi, 2020; Eubanks, 2019). Amidst these revelations, we find ourselves enclosed within an enigmatic black box: one that seemingly extols the virtues of diverse gender options in various software registration forms. Paradoxically, we perceive our bodies as emancipated in the digital realm, unaware that the very opaqueness of this black box contributes to our continued entrapment.

We are all In Between

What insights can queer bodies offer regarding the narrative of artificial intelligence and gender? In “Trans* Architectures,” Jack Halberstam (2018) deftly deconstructs the notion of “the wrong body,” reframing it not as an assertion of correctness, but as a revolutionary act aimed at dismantling the societal constructs that uphold judgments based on divergent expressions. The landscape of transgender bodies stands in stark contrast to the “normative body,” subverting and unsettling the binary paradigm through its very fragmentation and inherent contradictions. Queer bodies, in their unapologetic nonconformity and propensity for disruption, redefine spatial dynamics meticulously engineered by heteronormativity itself (Ahmed, 2006). “Transgender/queer bodies provide a blueprint for deconstructing the binary,” Halberstam aptly asserts (2018).

Some researchers and artists have entangled the digital and physical realms to activate queer bodies in virtual space. Artist Emily Martinez trained an artificial intelligence model called Queer AI on texts incorporating erotica, feminism, and queer theory, reclaiming agency and countering bias through AI storytelling (Martinez, 2023). To counter voice stereotyping, sexism, and gender bias in the technology industry, a research team designed Q, a genderless voice assistant (Meet Q, 2019).

Facial recognition systems perform poorly for transgender individuals and cannot classify non-binary genders (Scheuerman et al., 2019). My face always confuses machines: when I cover my left eye, I am recognized as female, and when I cover my right eye, I become male. I conducted a series of experiments to verify this, and some of the recognition results can be summarized as the machine perceiving certain parts of my face as “male” or “female”; when I deliberately interfere with those parts, the system changes its recognition result. However, the results are influenced by many factors, such as lighting and shooting angle, and are not always consistent. In what follows, I discuss how I activate my queer body in facial recognition through media art to unveil the intricate choreography between the technologically constructed and the authentically lived. This work seeks not only to expose the profound dissonances and tensions that underscore the relationship between identity and algorithmic perception but also to illuminate the potential for loosening these digital confines through the embodiment of authenticity and the celebration of multifaceted existence.

In 2020, I made my first digital gender performance (see Figure 1), using various materials (including paper, metal, textiles, ropes, and makeup) to change my appearance and repeatedly asking facial recognition to identify and confirm me. The performance aimed to reflect on the tension between facial recognition and queer bodies. I primarily worked with Amazon’s facial recognition system, Amazon Rekognition. Rekognition uses binary gender as a label, and for each prediction the system also returns a confidence score (Amazon Web Services, 2023) that ranges from 50% to 100%, asymptotically approaching, but never reaching, 100% at either the male or the female end. Across countless submissions and identification processes, it is not hard to obtain results of 99.9% male or 99.9% female, but there has never been a result of 100%. The algorithm can account for this, yet these data also show that every face falls somewhere between 99.9% male and 99.9% female. Binary facial recognition systems trained on binary databases still seem unable to recognize a 100% binary gender, making us wonder what the remaining 0.1% means. We are all in between.

Figure 1. “See my gender”. (Video available at: https://renqingyi.com/see-my-gender)
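
For readers curious about the mechanics behind these submissions, the sketch below shows the kind of API call involved, using the AWS SDK for Python (boto3). It assumes configured AWS credentials; the region and file name are placeholders, and it is a minimal illustration rather than the exact pipeline used in the performance.

```python
# Minimal sketch of submitting a photo to Amazon Rekognition and reading back
# its binary gender label and confidence score. Assumes AWS credentials are
# configured; the region and file name are placeholders.
import boto3

client = boto3.client("rekognition", region_name="eu-west-1")

with open("my_face.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],          # request the full set of facial analysis tags
)

for face in response["FaceDetails"]:
    gender = face["Gender"]      # only two possible values: "Male" or "Female"
    print(gender["Value"], f'{gender["Confidence"]:.1f}%')
    # Confidence is reported between 50 and 100; the submissions described
    # above never returned a full 100%.
```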

In 2021, I embarked on a transformative journey, marked by the deliberate shearing of my hair. This symbolic act propelled me into the realm of performance and allowed me to amass a database of one hundred images of my face. To annotate this repository with gender labels, I enlisted Amazon Rekognition, which yielded an intriguing blend of attributions oscillating between 51% and 99% male or female.
Using this database, I trained a generative adversarial network (GAN) to generate images of my face carrying arbitrary gender labels (see Figure 2, Figure 3). The resulting images cast me in a role analogous to an avatar, boldly interrogating the performative nature of digital gender and challenging prevailing notions by exposing the disparities in the gender recognition data procured from Amazon Rekognition. The experiment calls into question the conventional practice of pigeonholing identities within binary gender constructs through face recognition systems.
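
To give a sense of how such a label-conditioned generator can be wired, the sketch below shows only the conditioning idea: a generator that takes random noise plus a continuous gender-confidence value and produces an image tensor. The architecture is an illustrative stand-in, not the network used in this work, and the discriminator and training loop are omitted.

```python
# Illustrative sketch of conditioning a GAN generator on a continuous
# gender-confidence value (as returned by Rekognition). Not the model used
# in the work; the discriminator and training loop are omitted.
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    def __init__(self, noise_dim=100, img_size=64):
        super().__init__()
        self.img_size = img_size
        self.net = nn.Sequential(
            nn.Linear(noise_dim + 1, 256),   # +1 for the gender-confidence scalar
            nn.ReLU(),
            nn.Linear(256, 1024),
            nn.ReLU(),
            nn.Linear(1024, img_size * img_size),
            nn.Tanh(),                       # pixel values in [-1, 1]
        )

    def forward(self, noise, gender_confidence):
        # Concatenating the label with the noise lets the output be steered
        # anywhere along the male/female confidence axis.
        x = torch.cat([noise, gender_confidence.unsqueeze(1)], dim=1)
        return self.net(x).view(-1, 1, self.img_size, self.img_size)

generator = ConditionalGenerator()
noise = torch.randn(4, 100)
labels = torch.tensor([0.51, 0.70, 0.90, 0.99])  # Rekognition-style confidences
fake_faces = generator(noise, labels)
print(fake_faces.shape)                          # torch.Size([4, 1, 64, 64])
```

Because the conditioning value is continuous rather than a binary flag, faces can in principle be sampled from anywhere between the system’s “male” and “female” poles, which is precisely the in-between space the performance occupies.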

I worked with Amazon Rekognition for seven months between 2020 and 2021. During this time, I submitted my diverse visages countless times, perpetually subjected to Rekognition’s relentless scrutiny. These interactions yielded thousands of outcomes as the system tirelessly identified, classified, and defined me. Regrettably, amidst this ceaseless assessment, I remained conspicuously voiceless, never afforded the opportunity to present my perspective or defend my identity.

Moreover, the opaqueness of Amazon Rekognition’s processes raises disconcerting queries about the fate of my submitted photos. These images, imbued with elements of my identity, now reside within unknown realms. The conditions under which they were classified, the categorical labels assigned, and the repositories in which they were ensconced remain shrouded in uncertainty.

Figure 2. GANs generate faces.
Figure 3. GANs generate faces with gender labels.

Gender bias in AI applications

The gender binary, deeply rooted in societal constructs, has long influenced our perceptions of identity and roles. As artificial intelligence (AI) becomes increasingly integrated into various aspects of our lives, its impact on shaping and reinforcing these binary notions of gender is becoming evident. In addition to the face recognition discussed earlier, this section briefly examines gender bias in other AI applications to show how pervasive the problem is and to prompt critical reflection on the transformative potential of these technologies.

First, voice assistants are commonly assigned feminine names like Alexa and Cortana and default to female voices, perpetuating traditional gender stereotypes and reinforcing gender biases (Hwang et al., 2019; Baraniuk, 2022). I also examined some of the applications I frequently use. The translation software DeepL uses AI to enhance machine translation accuracy (DeepL, 2023), yet when translating from Chinese to English it fails to translate the term “Queer” (see Figure 4, Figure 5). For instance, when I provided the Chinese sentence “A person is cooking in the kitchen, and the person puts the child’s sandwich into the child’s school bag,” which uses no gender-specific pronouns, DeepL’s translation automatically assigned a female subject to the sentence (see Figure 6). Interestingly, after adding an adjective before “person,” DeepL provided a different translation: “A successful person is cooking in the kitchen, and this person puts the sandwiches he has made for his child into his child’s school bag.” The subject of the sentence was changed to “he” (see Figure 7).

Figure 4. DeepL Translation Result 1.
Figure 5. DeepL Translation Result 2.
Figure 6. DeepL Translation Result 3.
Figure 7. DeepL Translation Result 4.
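
The same pronoun probe can be repeated programmatically. The sketch below uses DeepL’s official Python client (the deepl package); the authentication key is a placeholder, and the prompt strings stand in for the gender-neutral Chinese sentences described above rather than reproducing their exact wording.

```python
# Sketch of repeating the pronoun probe with DeepL's Python client
# (pip install deepl). The auth key is a placeholder, and the prompts stand
# in for the gender-neutral Chinese sentences described above.
import deepl

translator = deepl.Translator("YOUR_DEEPL_AUTH_KEY")   # placeholder key

prompts = [
    "…",  # the gender-neutral "a person is cooking ..." sentence, in Chinese
    "…",  # the same sentence with "successful" added before "person"
]

for text in prompts:
    result = translator.translate_text(text, source_lang="ZH", target_lang="EN-US")
    # Comparing the outputs shows which gendered pronoun DeepL assigns to the
    # unmarked subject in each version.
    print(result.text)
```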

Similar situations arise in some AI generation software. The text-to-image generator Bluewillow fails to recognize the term “lesbian.” Through multiple attempts, I found that Bluewillow strives to portray diversity (see Figure 8, Figure 9, and Figure 10). However, when I changed the prompt to “a successful human,” the generated image featured a white male as the main character (see Figure 11), and when I used the phrase “a successful woman,” the generated image depicted a white female (see Figure 12).

Figure 8, Figure 9, Figure 10. AI-Generated Image from Bluewillow 1, 2, 3.

Figure 11. AI-Generated Image of “a successful human” from Bluewillow.
Figure 12. AI-Generated Image of “a successful woman” from Bluewillow.

These observations underscore the urgent need for a comprehensive reassessment of AI technologies to ensure they are inclusive and representative of diverse gender identities. Recognizing the potential for AI to perpetuate gender biases, we must actively work to challenge and transform these systems to better reflect the realities and complexities of human identities.

Conclusion

In this exploration of the intricate interplay between technology and identity, we uncover a multifaceted landscape in which AI capabilities both illuminate and constrain nuances of gender expression. Queer communities are becoming powerful agents of insight, revealing the liberating potential of technology, but also the inherent biases it can perpetuate.

Our work with AI applications like Amazon Rekognition is a poignant reminder of the delicate balance between human agency and technological dominance. Despite tremendous advances in artificial intelligence, limitations and inaccuracies in recognizing and representing the complexity of identities persist. It is a stark reminder that as technology evolves, we must also remain aware of its potential biases and shortcomings. Our efforts to blend technology and humanity must be driven by a commitment to inclusivity, authenticity, and the relentless pursuit of a digital landscape that reflects our intricate realities.

Acknowledgements

First of all, I would like to express my deepest gratitude to the Anthways team for their support, especially the review and editorial team. This article summarises the most important work during my master’s study. I am very grateful to all my tutors and peers in MA Interaction Design at the London College of Communication for their help and support in discovering the most beautiful part of me, my Queerness. I grew up in a completely different culture and have always been confused about who I am. I am very grateful that MA ID 20_21 gave me a safe environment. Especially Dr Wesley Goatley, without whose help I would not be able to continue my doctoral research on gender and artificial intelligence.

Bibliography

Ahmed, S. (2006). Queer Phenomenology: Orientations, Objects, Others. Duke University Press.

Amazon Web Services (2023) Developer Guide: Detecting and analyzing faces. Available at: https://docs.aws.amazon.com/rekognition/latest/dg/faces.html (Accessed: 28 April 2023).

Bagagli, B.P., Chaves, T.V. and Zoppi Fontana, M.G. (2021) ‘Trans women and public restrooms: The legal discourse and its violence’, Frontiers in Sociology, 6. doi:10.3389/fsoc.2021.652777.

Baraniuk, C. (2022) Why your voice assistant might be sexist, Bbc.com. Available at: https://www.bbc.com/future/article/20220614-why-your-voice-assistant-might-be-sexist (Accessed: 14 August 2023).

Bluewillow AI Generator. (2023). AI-Generated Image [Generated Image]. Available at: https://www.bluewillow.ai/ (Accessed on April 15, 2023).

Butler, J. (1990). Gender Trouble: Feminism and the Subversion of Identity. New York; London: Routledge.

Brown, E. and Perrett, D. I. (1993) ‘What Gives a Face its Gender?’, Perception, 22(7), pp. 829–840.

Chiusi, F. (2020) ‘In Italy, an appetite for face recognition in football stadiums’, Algorithm Watch, 16 September. Available at: https://algorithmwatch.org/en/italy-stadium-face-recognition/ (Accessed: 23 April 2023).

Computer Vision Lab (2018) Labeled faces in the wild home. Available at: http://vis-www.cs.umass.edu/lfw/ (Accessed: 28 April 2023).

DeepL Translation Software. (2023). DeepL Translation Results [Screenshot]. Captured on April 15, 2023.

Eubanks, V. (2019) Automating Inequality. New York: Picador.

Halberstam, J. (2018) ‘Unbuilding Gender’, Places Journal. Available at: https://doi.org/10.22269/181003 (Accessed: 28 April 2023).

Keyes, O. (2018) ‘The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition’, Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), pp. 1-22. doi:10.1145/3274357.

Hwang, G. et al. (2019) ‘It sounds like a woman’, Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. doi:10.1145/3290607.3312915.

Kleinberg, J. et al. (2018) ‘Discrimination in the age of algorithms’, Journal of Legal Analysis, 10, pp. 113–174. doi:10.1093/jla/laz001.

Martinez, E. (2023) Queer AI. Available at: https://queer.ai/ (Accessed: 11 August 2023).

McLuhan, M. (1995) ‘The Playboy interview’, in Essential McLuhan, pp. 233-269.

Meet Q The First Genderless Voice (2019) Available at: https://www.genderlessvoice.com/ (Accessed: 11 August 2023).

MIT Media Lab (2018) Gender Shades. Available at: https://www.media.mit.edu/projects/gender-shades/overview/ (Accessed: 23 April 2023).

National Center for Transgender Equality (2016) ‘Transgender People and Bathroom Access’. Available at: https://transequality.org/issues/resources/transgender-people-and-bathroom-access (Accessed: 14 August 2023).

Noble, S.U. (2018) Algorithms of oppression: How search engines reinforce racism. New York: New York University Press.

NRCDV (no date) Violence against trans and non-binary people, VAWnet.org. Available at: https://vawnet.org/sc/serving-trans-and-non-binary-survivors-domestic-and-sexual-violence/violence-against-trans-and (Accessed: 14 August 2023).

Ren, Q. (2021) IN BETWEEN. Available at: https://renqingyi.com/in-between (Accessed: 27 April 2023).

Ren, Q. (2021) See my gender. Available at: https://renqingyi.com/see-my-gender (Accessed: 28 April 2023).

Russell, L. (2020) Glitch Feminism: A Manifesto. London: Verso.

Scheuerman, M.K., Paul, J.M. and Brubaker, J.R. (2019) ‘How Computers See Gender: An Evaluation of Gender Classification in Commercial Facial Analysis Services.’ Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), pp. 1–33. Available at: https://doi.org/10.1145/3359246 (Accessed: 24 April 2023).

Trans PULSE (2014) ‘Avoidance of Public Spaces by Trans Ontarians: The Impact of Transphobia on Daily Life’, Trans PULSE E-Bulletin, 4(1), pp. 1–3.

UNFPA (no date) ‘Technology-facilitated gender-based violence: A growing threat.’ United Nations Population Fund. Available at: https://www.unfpa.org/TFGBV (Accessed: 14 August 2023).

Why DeepL? (no date) Available at: https://www.deepl.com/en/whydeepl (Accessed: 14 August 2023).

Winner, L. (1980). ‘Do Artifacts Have Politics?’, Daedalus, 109(1), 121–136.

Wojcik, S., Remy, E. and Baronavski, C. (2021) How does a computer ‘see’ gender? Pew Research Center. Available at: https://www.pewresearch.org/interactives/how-does-a-computer-see-gender/ (Accessed: 27 April 2023).