
Second session: Facing FACETS (University of Turin)
31 March 2023
Giovanni Pennisi, University of Turin
In a paper published in 2018, Os Keyes investigated how the literature on Automated Gender Recognition (AGR) systems conceived of gender, finding that 94.8% of the papers treated it as binary, 72.4% as immutable, and 60.3% as a physiological component. In the author’s view, this is indicative of an operationalisation of gender, that is, the assumption that gender is a discrete and objectively applicable parameter. Keyes argues that such a vision is especially dangerous for transgender people. Here I will follow up on this observation, providing several examples that show how AGR systems’ failures to recognize the faces of transgender people can both perpetuate and amplify gender stereotypes and inequalities. Then, I will introduce the notion of intersectionality, which is the idea that humans “sit at the crossroads” of many physical, social, and political factors, whose combination generates dynamics of discrimination or privilege. This will serve a double purpose: on the one hand, it will allow me to focus on those cases in which the overlapping of identity aspects that are historically associated with severe oppression – e.g., being Black and transgender – produces particularly dreadful episodes of discrimination perpetrated by both human and technological agents; on the other, it will enable a comprehensive overview of the literature that advocates for the adoption of intersectional approaches to make AI more equitable.