
Why Is Facial Recognition Important? 3 Signs It's Too Big to Ignore Any Longer

"It's the Wild Wild West out there."

Inverse 

If you’re an adult in the United States, there’s a good chance your face is in a law enforcement database, even if your criminal record is clean as a whistle.

More than half of American adults are included in a facial recognition network searchable by law enforcement, according to a 2016 report from the Georgetown Center on Privacy and Technology. And that was three years ago. It's even more likely you are in a law enforcement database today.

The FBI can search more than 640 million photos through its facial recognition system, according to a recent report by the Government Accountability Office, released this month ahead of a congressional hearing on law enforcement, facial recognition, and government transparency.

To think of it another way, 640 million is enough for nearly two images for every single person in the United States.
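
That framing checks out with a little back-of-the-envelope math; here's a minimal sketch, assuming a U.S. population of roughly 327 million, the Census Bureau's 2018 estimate:

```python
# Rough check of the "nearly two images per person" framing.
# Assumes a U.S. population of about 327 million (2018 estimate).
fbi_photos = 640_000_000
us_population = 327_000_000

print(f"Photos per U.S. resident: {fbi_photos / us_population:.2f}")  # ~1.96
```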

The FBI’s astonishing number of photos, paired with the shocked reactions of lawmakers who seemed unaware of the extent of the FBI database, is emblematic of a larger, terrifying reality: Facial recognition has become too big an issue for the government to ignore.

Multiple experts examining surveillance and civil liberties tell Inverse that between the lack of oversight, the inaccuracy of facial recognition systems, and the speed at which such programs are spreading, it’s clear that regulation has to happen, and soon.

3. There Is a Lack of Government Oversight

Tech has always moved faster than the regulation meant to oversee it. Democracy is designed to be slow while “move fast and break things” was long the mantra of Silicon Valley (though the embrace of that careless attitude is fading fast). As facial recognition becomes more powerful, people are at risk of being misidentified and wrongly arrested, detained, or fined.

The lack of oversight on facial recognition has already had systemic consequences: Without anyone to stop them, law enforcement agencies and private companies have been able to develop and use their own systems, hoovering up images from social media, your driver’s license, and other sources and using them for their own ends. Policies on deployment and use are internally developed by various law enforcement agencies or police departments, with a distinct lack of uniformity across the country.

In Orlando, Florida, for example, the police department trialed Amazon Rekognition, while in Baltimore, Maryland, police use a system developed by Cognitec. The FBI, meanwhile, uses an entirely different system, Next Generation Identification (NGI). None of these entities are playing by the same rules.

Jake Laperruque, Senior Counsel at the Constitution Project at the Project on Government Oversight, where he focuses on privacy and government surveillance, tells Inverse oversight of facial recognition borders on lawlessness.

“Right now, it really is the wild, wild West out there for this technology,” Laperruque says. “There are really no limits on how it’s used, and it’s incredibly invasive and inaccurate.”

Kimberly J. Del Greco, Deputy Assistant Director of the Criminal Justice Information Services Division of the FBI, said in testimony before the House Oversight Committee that the agency has internal best practices, built around four main points, for using facial recognition tech so it can promote public safety without interfering with fundamental values.

But those FBI best practices were developed in the absence of outside regulation, not because of it.

Experts claim that law enforcement best practices for how to use facial recognition tech are being largely ignored.

A 2019 report from the Center on Privacy and Technology at Georgetown University, titled “Garbage In, Garbage Out: Face Recognition on Flawed Data,” found that analysts using facial recognition tech sometimes submit forensic sketches to the database, hoping for a match with a real person. But that method has been shown to fail the vast majority of the time.

The “Garbage In, Garbage Out” report found that law enforcement analysts also routinely doctor low-quality photos to make them clearer, including copying and pasting facial features from someone else’s face onto a photo of an unknown suspect. Famously, when the NYPD was looking for a suspect who resembled the actor Woody Harrelson, analysts ran the actor’s photo through the system instead and ended up arresting the wrong person for petit larceny.

The Government Accountability Office made six recommendations regarding the FBI’s facial recognition system in 2016. As members of Congress learned this month at the Oversight Committee hearing, the FBI has fully addressed only one of them: a recommendation about whether the bureau was collecting images in accordance with Department of Justice policy.

2. Facial Recognition Systems Are Inaccurate

As if the lack of oversight weren’t already disconcerting, facial recognition misidentifies people at a concerning rate, like that Woody Harrelson example in New York City.

By the FBI’s own admission, its system only achieves about 86 percent accuracy, according to testimony from Del Greco. That 86 percent rate applies when the system is asked to return the correct match within a candidate list of 50 potential suspects.

The FBI said it conducted 152,500 searches using its system between fiscal year 2017 and April 2019. If the 86 percent rate held across all of them, that could translate to roughly 21,350 searches that failed to surface the right person.
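
That estimate follows directly from the numbers above; the sketch below assumes the 86 percent figure applies uniformly to every one of the 152,500 searches, which is a simplification:

```python
# Back-of-the-envelope estimate of searches that may have missed the right person.
# Assumes Del Greco's 86 percent accuracy figure applies uniformly to all searches.
searches = 152_500        # FBI searches, fiscal year 2017 through April 2019
accuracy = 0.86           # reported rate of returning the correct match

potential_misses = searches * (1 - accuracy)
print(f"Potential misidentifications: {potential_misses:,.0f}")  # ~21,350
```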

Facial Recognition Is More Likely to Misidentify Women and People of Color

When the ACLU ran Amazon Rekognition on images of every member of Congress, the system incorrectly identified 28 of them as people who had been arrested or convicted of crimes, by mismatching their images to mugshot photos of other people. Nearly 40 percent of the system’s false matches were of people of color, even though they only make up 20 percent of Congress.

"The research on facial recognition’s inability to recognize black people and misgender ordinary people is chilling

The ACLU findings reinforce previous research from 2018, which found that such systems perform well on white men, but poorly on nearly everyone else. After examining various systems, researchers found that darker-skinned females were the most misclassified group, with error rates of up to 34.7 percent.

Mutale Nkonde, a Fellow at Data & Society, describes the error rate as “chilling.”

“The research on facial recognition’s inability to recognize black people and misgender ordinary people is chilling,” Nkonde tells Inverse.

Nkonde works at the intersection of policy and facial recognition technologies, and says more testing is needed before facial recognition technology is deployed.

“Facial recognition technologies are the only form of biometric data that can be gathered without consent and so they are ripe for government action,” says Nkonde.

The stakes of misidentification are far from trivial. In 2018, 998 people were shot and killed by police, the Washington Post reported. So far in 2019, that number stands at 334. Misidentification can lead to encounters with law enforcement in which officers may be expecting to confront a violent individual, and therefore be more likely to use lethal force.

1. These Systems Are Spreading Rapidly

There’s no definitive answer on how many facial recognition systems exist in the United States, nor how many are in development. The best we can do is follow the money, which suggests that business is booming.

The market for “facial biometrics” among government buyers alone is expected to increase from $136.9 million in 2018 to $375 million by 2025, according to projections from Grand View Research, a market research firm.

Some of the companies that market facial recognition tech to law enforcement are household names, like Amazon.

Of course, a slew of smaller, less publicly known companies are building and selling systems to, well, whoever can afford them.

Axon, the largest maker of body cameras in the United States, recently filed patents for facial recognition applications. Other companies, like Google, are developing recognition systems that can identify people in live video rather than still photos.

A U.S. Park Service Police officer videotapes spectators observing the USPP as they surround, or 'kettle,' a group of protestors on President Trump's Inauguration Day, January 20, 2017.

Mobilus In Mobili / Wikimedia Commons

Facial recognition tech is booming in part because it has gotten a lot cheaper to build. It also used to be difficult to acquire the vast number of images needed to train such systems, and until recently computers simply weren't powerful enough to run them.

Today, it’s easier than ever to create massive databases of images for facial recognition systems.

"“Lawmakers must put the brakes on law enforcement use of this technology"

That’s why the ACLU and a coalition of 60 other organizations are calling for a moratorium on the use of facial recognition by federal law enforcement until Congress debates what, if any, uses should be allowed.

The idea of a moratorium has bipartisan appeal, with Congressman Jim Jordan suggesting at the hearing that “It’s time for a time out” on government use of facial recognition technology. But coming up with a long-term solution may take time, as lawmakers will have to build a bill from the ground up (the June 4 hearing was only the second in a series of hearings meant to sketch out the issues a bill might address).

On the federal level, the closest analogue was perhaps the A.I. Futures Act of 2017, which was introduced as a messaging bill intended to start the conversation about A.I. regulation. (It died in Congress.)

At the local level, bills like the one just passed in San Francisco — which bans the use of facial recognition by the local government — could be a model for other cities.

Facial recognition systems have already been adopted by law enforcement agencies, and they are coming to airports and schools as well.

Neema Singh Guliani, a Senior Legislative Counsel at the American Civil Liberties Union, tells Inverse she supports calling for a moratorium on facial recognition.

The FBI’s expansion of facial recognition technology happened without explicit legislative approval, meaningful public debate, or mechanisms to safeguard our rights, says Guliani.

“Lawmakers must put the brakes on law enforcement use of this technology until Congress decides what, if any, use cases are permissible,” Guliani says.
