Transgender teenager ripping up a sign showing male and female symbols

Facial recognition tech struggles to identify transgender people

Image credit: Sam Wordley | Dreamstime.com

According to new research by the University of Colorado (CU) Boulder, emerging facial recognition services often mischaracterise transgender and non-binary individuals.

With a brief glance at a single face, emerging facial recognition software can now categorise the gender of many men and women with high accuracy. However, a study by the University of Colorado Boulder, US, has found that if a face belongs to a transgender or non-binary person, such systems get their gender wrong more than one-third of the time.

“We found that facial analysis services performed consistently worse on transgender individuals and were universally unable to classify non-binary genders,” said Morgan Klaus Scheuerman, a PhD student in the university’s Information Science department.

“While there are many different types of people out there, these systems have an extremely limited view of what gender looks like.”

The study comes at a time when facial analysis technologies, which assess and characterise features of an individual's face, sometimes via hidden cameras, are becoming increasingly prevalent. The technology is used in applications ranging from smartphone dating apps to the more controversial law-enforcement surveillance systems.

Previous research has also suggested that such technology tends to be most accurate when assessing the gender of white men, but frequently misidentifies women of colour. An MIT study found error rates of 20.8 per cent, 34.5 per cent and 34.7 per cent across three commercial systems when determining the sex of dark-skinned women.

“We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender,” said senior author Jed Brubaker, an assistant professor of Information Science. “We set out to test this in the real world.”

As part of the study, the researchers collected around 2,450 images of faces from Instagram, each of which had been labelled by its owner with a hashtag indicating their gender identity.

The team then divided the pictures into seven groups of 350 images (#woman, #man, #transwoman, #transman, #agender, #genderqueer, #nonbinary), which were analysed by four of the largest providers of facial analysis services: IBM, Amazon, Microsoft and Clarifai.
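The paper's own evaluation harness is not reproduced here, but the per-group measurement it describes is straightforward to sketch. The Python snippet below queries one of the four services tested (Amazon Rekognition, via the boto3 client's detect_faces call, which only ever returns 'Male' or 'Female' labels); the directory layout and the mapping from hashtag group to expected label are assumptions made purely for illustration, not the researchers' actual pipeline.

```python
# Illustrative sketch only: measures per-hashtag-group gender-classification
# accuracy against Amazon Rekognition, one of the four services in the study.
# Directory layout and label mapping are assumptions, not the study's harness.
from pathlib import Path

import boto3

rekognition = boto3.client('rekognition')

# The service outputs only 'Male' or 'Female', so the non-binary groups can
# never be classified correctly - the core finding of the study.
EXPECTED = {
    'woman': 'Female', 'man': 'Male',
    'transwoman': 'Female', 'transman': 'Male',
    'agender': None, 'genderqueer': None, 'nonbinary': None,
}

def classify(image_path: Path) -> str | None:
    """Return Rekognition's binary gender label for the first detected face."""
    response = rekognition.detect_faces(
        Image={'Bytes': image_path.read_bytes()},
        Attributes=['ALL'],
    )
    faces = response['FaceDetails']
    return faces[0]['Gender']['Value'] if faces else None

def group_accuracy(group: str, image_dir: Path) -> float:
    """Fraction of one hashtag group's images labelled as expected."""
    images = sorted(image_dir.glob('*.jpg'))
    correct = sum(classify(p) == EXPECTED[group] for p in images)
    return correct / len(images)

for group in EXPECTED:
    acc = group_accuracy(group, Path('images') / group)
    print(f'#{group}: {acc:.1%} labelled as expected')
```

Because EXPECTED maps the three non-binary groups to None, the sketch reports roughly 0 per cent agreement for them regardless of what the service returns, mirroring the 100 per cent misclassification rate the study found.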

On average, all four systems were most accurate with photos of cisgender women (those assigned female at birth who identify as female), getting their gender right 98.3 per cent of the time. They also categorised cisgender men correctly 97.6 per cent of the time.

However, trans men were wrongly identified as women up to 38 per cent of the time, while those who identified as agender, genderqueer or nonbinary – indicating that they identify as neither male nor female – were misidentified 100 per cent of the time.

Commenting on the results, Brubaker said: “These systems don’t know any other language but male or female, so for many gender identities it is not possible for them to be correct.”

The study also suggests that such services identify gender based on outdated stereotypes. This was borne out when Scheuerman, who is male and has long hair, submitted his own picture to all four systems: half categorised him as female.

When researcher Morgan Klaus Scheuerman, who is a man, submitted his photo to several facial analysis services, half got his gender wrong

Image credit: Morgan Klaus Scheuerman/CU Boulder

Although the researchers could not get access to the training data – the image inputs used to 'teach' the system what male and female look like – previous research has suggested that these computer vision systems assess physical characteristics of an individual such as their eye position, lip fullness, hair length and even clothing.
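Since the commercial models are proprietary, the stereotype-driven decision-making described above can only be illustrated schematically. The deliberately naive sketch below is hypothetical throughout – the feature names, weights and threshold are invented for illustration and are not anything the tested services are confirmed to use – but it shows how appearance cues such as hair length can push a purely binary classifier towards a wrong answer.

```python
# Deliberately naive, hypothetical illustration of stereotype-driven gender
# classification. Commercial systems learn such associations from training
# data rather than hard-coding them, but the effect the article describes
# is similar: appearance cues mapped onto a binary label.
from dataclasses import dataclass

@dataclass
class FaceFeatures:
    hair_length: float   # 0.0 (shaved) to 1.0 (very long) - invented scale
    lip_fullness: float  # 0.0 to 1.0 - invented scale
    wearing_makeup: bool

def stereotyped_gender(face: FaceFeatures) -> str:
    """Binary decision from appearance cues - exactly the limitation the
    study highlights: long hair pushes the score towards 'female', and no
    input can ever yield a non-binary label."""
    score = face.hair_length              # long hair read as 'female'
    score += face.lip_fullness * 0.5
    score += 0.5 if face.wearing_makeup else 0.0
    return 'female' if score > 0.9 else 'male'

# A man with long hair is misgendered, mirroring Scheuerman's experience:
features = FaceFeatures(hair_length=0.9, lip_fullness=0.3, wearing_makeup=False)
print(stereotyped_gender(features))  # -> 'female'
```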

“These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognised as a man or a woman. And that impacts everyone,” Scheuerman said.

The market for facial recognition services is projected to double by 2024, as tech developers work to improve human-robot interaction and to target advertisements at shoppers more precisely.

“They want to figure out what your gender is, so they can sell you something more appropriate for your gender,” Scheuerman explained, referring to the example of a shopping centre in Canada which used a hidden camera in a kiosk to achieve this.

Brubaker noted that people engage with facial recognition technology every day, whether to unlock their smartphones or to log in to their computers. If such systems tend to misgender populations that are already vulnerable, he warned, there could be real consequences for the individuals affected.

For instance, a match-making app could set someone up on a date with a person of the wrong gender, leading to a potentially dangerous situation, Scheuerman argued. He added, however, that his greatest concern is that such systems reaffirm the notion that transgender people don’t fit into society.

“People think of computer vision as futuristic, but there are lots of people who could be left out of this so-called future,” he said.

The authors added that they would ideally like to see tech companies move away from gender classification entirely and stick to more specific labels such as 'long hair' or 'make-up' when assessing images.

“When you walk down the street, you might look at someone and presume that you know what their gender is, but that is a really quaint idea from the ‘90s and it is not what the world is like anymore,” Brubaker said. “As our vision and our cultural understanding of what gender is has evolved, the algorithms driving our technological future have not. That’s deeply problematic.”
