
Twitter to Investigate Why Its Picture Cropping Algorithm Is Apparently Racist

Photo: Postmodern Studio (Shutterstock)

Considering how little diversity is present in Silicon Valley, I’m never terribly surprised when some arbitrary feature on a tech platform turns out to have a racial bias. I mean, if white people aren’t thinking about how their racial bias affects people in the real world, I doubt they’re thinking about how it affects cropping a picture on Twitter.


Oh, you weren’t aware that Twitter’s picture cropping tool is, allegedly, kind of racist?

According to NBC News, Twitter will launch an investigation into the apparent racial bias of its picture cropping tool. The announcement comes after users reported that if a picture has a white and a Black face, the cropping algorithm will usually favor the white face.


The problem was first brought to light when Colin Madland, a manager at a Vancouver university, found that his Black colleague's head would disappear during video calls on Zoom. Madland took to Twitter to find a solution, only to discover that Twitter had a similar problem of removing Black faces: he noticed that the site's mobile app would preview his face over his Black colleague's.

Other users began conducting tests themselves, with a tweet containing a picture of both Sen. Mitch McConnell and President Barack Obama going viral. Dantley Davis, Twitter’s Chief Design Officer, said that in Madland’s case, removing his glasses and facial hair would solve the problem. In a response to the tweet featuring McConnell’s face, the site said that an initial look into the algorithm didn’t reveal any bias but that there was still work to be done.


Davis has taken full responsibility for the error and has pledged he will work to fix it. “I know you think it’s fun to dunk on me, but I’m as irritated about this as everyone else. However, I’m in a position to fix it and I will,” Davis tweeted.


“It’s 100 percent our fault. No one should say otherwise,” he added.

Racial bias in tech is nothing new. Earlier this year, a study found that voice recognition technology used by companies such as Apple, Google, and Amazon was less likely to recognize Black voices compared to white ones. You would think the answer to the problem would be “put Black people in the room,” but that’s too easy, y’all.


With Google cutting diversity programs so as not to appear “anti-conservative,” and Facebook being willing to lose money to allow racism on its site, we’re probably going to have to keep enduring dumb shit like this for the foreseeable future.

Jr Staff Writer @TheRoot. Watcher of wrestling, player of video games. Mr. Steal Your Disney+ Password.


DISCUSSION

Well, part of this is definitely human error (and hubris) and the other part is machines doing as they are programmed to do...so, humans are at fault again!

A great analog (pun intended!) to this problem is color film, and how originally, it rendered anyone melanated...horribly! Was it outcry for diversity that fixed color film? NOPE. It was actually the furniture industry that was upset no one could tell mahogany wood, from pine, from oak in their advertisements. Seriously. Learn all about it here:

The same types of biases that were baked into color film are baked into the AI of Zoom and other face-tracking programs. If XYZ stereotypical tech company is mostly white and Asian, their devs will invariably be lazy and train their AI on images that look like them, and call it good...until there is outcry like in this article.

I work for a cybersecurity firm, and out of about two dozen stateside employees, we only have about 2.5 white employees. Thankfully, we do not deal with images, but we do deal with a great deal of AI to detect malware, and it is very tricky business to teach it the right things. That said, the right person must have their thinking cap on and feed it the right data for equitable, comprehensive results. Really a low bar, but one that must be met.