Amazon’s Face Recognition Disproportionately Matched Congressional Members of Color With Mugshots

Members of the Congressional Black Caucus listen as President Donald J. Trump delivers the State of the Union speech before members of Congress in the House chamber of the U.S. Capitol on January 30, 2018 in Washington, DC.
Photo: Melina Mara (The Washington Post via Getty Images)

Twenty-eight members of Congress, including six members of the Congressional Black Caucus, were falsely matched with mugshots of people charged with a crime by Amazon’s facial recognition technology, according to a test conducted by the American Civil Liberties Union.


The test results of Amazon Rekognition, released Thursday, revealed that a disproportionate share of the politicians misidentified by the technology were black and Latino, including U.S. Rep. John Lewis.

“This test confirms that facial recognition is flawed, biased and dangerous,” said Jake Snow, a technology and civil liberties lawyer with the ACLU of Northern California, according to the New York Times.

Three of the misidentified lawmakers wrote a letter (pdf) to Amazon CEO Jeff Bezos addressing the technology’s ineffectiveness and asking the company to provide a list of government agencies using Rekognition, with particular emphasis on law enforcement and intelligence agencies.

Here is more on the negative impacts of Amazon’s facial recognition software, per the ACLU:

If law enforcement is using Amazon Rekognition, it’s not hard to imagine a police officer getting a “match” indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.

An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.

Here is what Amazon said in response, per The New York Times:

Nina Lindsey, an Amazon Web Services spokeswoman, said in a statement that the company’s customers had used its facial recognition technology for various beneficial purposes, including preventing human trafficking and reuniting missing children with their families. She added that the A.C.L.U. had used the company’s face-matching technology, called Amazon Rekognition, differently during its test than the company recommended for law enforcement customers.

For one thing, she said, police departments do not typically use the software to make fully autonomous decisions about people’s identities. “It is worth noting that in real-world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment,” Ms. Lindsey said in the statement.


That statement doesn’t cut it. Civil liberties groups note the system could be used against people participating in protests, for example, and people of color have little reason to trust that police will use it only as Amazon intends. Police already racially profile people of color at disproportionate rates, and Amazon’s facial recognition system would only exacerbate the problem.

Let’s hope Congress puts pressure on Bezos to pull the system, which does little more than put people of color at risk.


By the way, here is a list of members of Congress misidentified by the system:


Senate:

  • John Isakson (R-Georgia)
  • Edward Markey (D-Massachusetts)
  • Pat Roberts (R-Kansas)


House of Representatives:

  • Sanford Bishop (D-Georgia)
  • George Butterfield (D-North Carolina)
  • Lacy Clay (D-Missouri)
  • Mark DeSaulnier (D-California)
  • Adriano Espaillat (D-New York)
  • Ruben Gallego (D-Arizona)
  • Thomas Garrett (R-Virginia)
  • Greg Gianforte (R-Montana)
  • Jimmy Gomez (D-California)
  • Raúl Grijalva (D-Arizona)
  • Luis Gutiérrez (D-Illinois)
  • Steve Knight (R-California)
  • Leonard Lance (R-New Jersey)
  • John Lewis (D-Georgia)
  • Frank LoBiondo (R-New Jersey)
  • David Loebsack (D-Iowa)
  • David McKinley (R-West Virginia)
  • John Moolenaar (R-Michigan)
  • Tom Reed (R-New York)
  • Bobby Rush (D-Illinois)
  • Norma Torres (D-California)
  • Marc Veasey (D-Texas)
  • Brad Wenstrup (R-Ohio)
  • Steve Womack (R-Arkansas)
  • Lee Zeldin (R-New York)
Anyone who thinks machine learning or “artificial intelligence” has any place in law enforcement has no fucking clue how unreliable and poorly suited this technology is.

If these types of programs are unreliable at getting bootlegs off of YouTube or figuring out which animal is which, why in the fuck would you ever want one near criminal justice?

Also, the racist applications of this bullshit need no introduction. This is beyond ripe for abuse.