Photo: iStock

When you go to your local DMV office and stand in front of the camera to take as decent a photo as possible, you probably aren’t thinking about the possibility that said photo will come up in a facial-recognition search used by government agencies. If this is a concern you never had before, you should definitely have it now.

Researchers at Georgetown Law’s Center on Privacy and Technology provided The Washington Post with “thousands of facial-recognition requests, internal documents and emails over the past five years, obtained through public-records requests,” and what those records reveal is stunning.

Agents from the FBI as well as Immigration and Customs Enforcement have used state driver’s license databases to scan through “millions of Americans’ photos without their knowledge or consent.”

From the Post:

Police have long had access to fingerprints, DNA and other “biometric data” taken from criminal suspects. But the DMV records contain the photos of a vast majority of a state’s residents, most of whom have never been charged with a crime.

And that is the crux of the problem. This particular use of driver’s license photos for facial-recognition searches—which can lead to dangerous and damaging errors—is being done without the consent of the people, something that members of Congress have spoken out about, including the two highest-ranking members of the House Oversight Committee.

The Post reports that Rep. Jim Jordan (R-Ohio) has raised concerns that this is happening without any consent or buy-in from state lawmakers or from the individuals whose license photos are being used.

During a hearing last month, Jordan said, “They’ve just given access to that to the FBI. No individual signed off on that when they renewed their driver’s license, got their driver’s licenses. They didn’t sign any waiver saying, ‘Oh, it’s okay to turn my information, my photo, over to the FBI.’ No elected officials voted for that to happen.”

Rep. Elijah Cummings (D-Md.), chairman of the House Oversight Committee, expressed similar concerns about consent.

In a statement to the Post, Cummings said “Law enforcement’s access of state databases,” particularly DMV databases, is “often done in the shadows with no consent.”

But it really is about more than consent.

As reported in a July 8 New York Times article, the city of Detroit started an initiative called Project Green Light in 2016 with the aim of curbing crime in the city. Under the program, thousands of cameras monitor “gas stations, restaurants, mini-marts, apartment buildings, churches and schools” 24 hours a day and stream those images directly to police department headquarters. Detroit Mayor Mike Duggan has promised to expand the network to include several hundred traffic light cameras, which he says would allow the police to “track any shooter or carjacker across the city.”

Detroit has received pushback from the public recently because Project Green Light includes a software tool that can suggest the identity of the people captured on its cameras.

From the Times:

The facial recognition program matches the faces picked up across the city against 50 million driver’s license photographs and mug shots contained in a Michigan police database. The practice has attracted public attention recently as the department seeks approval for a formal policy governing its use from a civilian oversight board.

“Please, facial recognition software—that’s too far,” pleaded one resident at a recent meeting of the board.

The problem?

Facial-recognition software tends to be significantly less accurate when it comes to people of color.

“Facial recognition software proves to be less accurate at identifying people with darker pigmentation,” George Byers II, a black software engineer, told Detroit’s police board last month. “We live in a major black city. That’s a problem.”

From the Times:

Researchers at the Massachusetts Institute of Technology reported in January that facial recognition software marketed by Amazon misidentified darker-skinned women as men 31 percent of the time. Others have shown that algorithms used in facial recognition return false matches at a higher rate for African-Americans than white people unless explicitly recalibrated for a black population—in which case their failure rate at finding positive matches for white people climbs. That study, posted in May by computer scientists at the Florida Institute of Technology and the University of Notre Dame, suggests that a single algorithm cannot be applied to both groups with equal accuracy.

The potential for false identifications means innocent people can be misidentified and arrested—and that can create a domino effect of additional problems.

According to the Post, the Government Accountability Office said last month that the FBI has done more than 390,000 facial recognition searches since 2011, including searches through federal, local and DMV databases.

It’s all a creepy sign that we are moving closer and closer to the Big Brother-style government surveillance that Orwell warned us about. That weird future we were worried about is already here, and it’s not going anywhere any time soon.

Jake Laperruque, a senior counsel at the watchdog group Project on Government Oversight, told the Post: “It’s really a surveillance-first, ask-permission-later system. People think this is something coming way off in the future, but these [facial-recognition] searches are happening very frequently today. The FBI alone does 4,000 searches every month, and a lot of them go through state DMVs.”

From the Post:

The FBI’s facial-recognition search has access to local, state and federal databases containing more than 641 million face photos, a GAO director said last month. But the agency provides little information about when the searches are used, who is targeted and how often searches return false matches.

The FBI said its system is 86 percent accurate at finding the right person if a search is able to generate a list of 50 possible matches, according to the GAO. But the FBI has not tested its system’s accuracy under conditions that are closer to normal, such as when a facial search returns only a few possible matches.

As private citizens, we don’t have a way to control the information the government keeps on us.

But now the government has a problematic way of tracking our every move and tying us to events and situations it thinks we are involved in.

Even when it’s not actually “us.”