Not Only Does Facial Recognition Software Often Get Racial ID Wrong, a New Study Finds It Misgenders Trans and Nonbinary Folks Almost All the Time

Photo: Shutterstock

Nonbinary and transgender identities are increasingly being accepted and normalized throughout society, but not when it comes to high-tech facial recognition software developed by some of the largest tech firms in the world.


According to Forbes, a recent study by the University of Colorado, Boulder, found that facial recognition software misgenders transgender people about a third of the time and gets it wrong every time for nonbinary people, those who identify as neither male nor female.

The issue is that while society is expanding its vocabulary for identifying and labeling gender, computer programs remain binary in their design.
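
To make that concrete, here is a minimal, hypothetical sketch (not any vendor's actual code) of how a two-class gender classifier is wired. All names in it are made up; the point is only that the label set is fixed when the system is built, so no input can ever produce a nonbinary answer.

```python
# Minimal illustration of a binary design: the output space is baked in,
# so the system can only ever answer with one of two labels.

GENDER_LABELS = ["female", "male"]  # hypothetical two-class label set

def classify_gender(scores):
    """Pick the label with the highest score.

    `scores` stands in for the two numbers a trained model would output
    for a face crop; the model's internals don't matter for this point.
    """
    if len(scores) != len(GENDER_LABELS):
        raise ValueError("model was built for exactly two labels")
    best = max(range(len(scores)), key=lambda i: scores[i])
    return GENDER_LABELS[best]

# Whatever the input, the answer is always one of the two baked-in labels.
print(classify_gender([0.31, 0.69]))  # -> "male"
print(classify_gender([0.52, 0.48]))  # -> "female"
```

Adding a third category isn't a matter of tweaking this function; it would mean redefining the label set and retraining the underlying model, which is exactly the problem the researchers describe.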


“Training a system to recognize gender beyond the binary breaks the purpose of the system,” Morgan Klaus Scheuerman, lead author of the study, told the Daily Camera. “If someone wants to include nonbinary identities in their algorithm, the problem becomes that nonbinary people look like any other people, so the system won’t know how to classify anyone.”

Scheuerman noted that systems’ typical use of “contextual labels,” like whether someone was wearing a dress or had long hair, to determine gender indicated that “traditional concepts of gender are ingrained in facial recognition algorithms,” as the Daily Camera explains.

So far, according to Scheuerman, the most common response from tech companies, if any, has been to remove gender classification from their facial recognition algorithms entirely, rather than try to teach them nonbinary concepts of gender.

Said Scheuerman’s co-author Jed Brubaker in a statement, per Forbes:

“We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender. Bottom line: What we found is that the computer vision systems that run all of our facial detection and facial analysis do not handle the level of gender diversity that we live with every day.”


Of course, as the Daily Camera notes, removing all binary gender classifications from such systems is not necessarily prudent, given their usefulness in certain urgent tasks like locating a missing child.

But the slow pace at which such crucial technology is adapting to gender-nonconforming people is frustrating for the LGBTQIA+ community, Mardi Moore, executive director of Colorado’s Out Boulder County organization, told the Daily Camera.


“This kind of stereotyping flies in the face of what we know to be true about humans,” Moore said. “It doesn’t reflect the real world.”


DISCUSSION

adohatos
A Drop of Hell, A Touch of Strange

I get why they’re removing the classification. There’s no way to code this that won’t make a mistake that is offensive in some way to someone. Unlike mistakes related to race and ethnicity with this technology, it’s unlikely that a more diverse programming team or a larger data set would make a significant difference as far as accuracy goes.

The only solution I can think of is pretty invasive. Hook the damn thing up to social media and have it scour profiles for keywords, hashtags, etc. in order to make a more informed classification. Even simply asking people at the point of interface and then saving that info would create a trackable profile, even if it wasn’t directly associated with personal information.
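
For illustration only, here’s a rough sketch of that “ask at the point of interface” idea; every name in it is made up. It shows why storing a self-declared label keyed to a stable face signature creates a trackable profile even when no name or other personal information is attached.

```python
# Rough sketch of self-declaration at the point of interface (made-up names).
# Even with no name stored, a stable face-derived key lets the same person
# be re-identified every time their face shows up again.

import hashlib

profiles = {}  # face signature -> self-declared gender label

def face_signature(face_embedding):
    """Stand-in for a stable per-face key derived from a model's embedding."""
    return hashlib.sha256(repr(face_embedding).encode()).hexdigest()

def enroll(face_embedding, self_declared_label):
    profiles[face_signature(face_embedding)] = self_declared_label

def lookup(face_embedding):
    return profiles.get(face_signature(face_embedding), "unknown")

enroll([0.12, 0.85, 0.33], "nonbinary")
print(lookup([0.12, 0.85, 0.33]))  # -> "nonbinary": trackable across sightings
```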

Absent new technologies or breakthroughs in software, I don’t think facial recognition will be a good tool for use with trans or nonbinary people, at least in the short term. Unfortunately, they tend to have higher-than-average rates of disappearances and other events where it could be helpful.

Since software is a business, the companies will ignore this as long as possible: any potential solution will be time-consuming and expensive, and since the potential market is small there’s little incentive. I think the answer has to be legislation, whether it’s grants for development, implementation of inclusive standards with penalties for a high error rate, or both.