Face off

Clearview AI has your photos, but a data breach shows they haven't earned your trust

The controversial company had its entire client list stolen, and there are a lot of unanswered questions.


Clearview AI, a controversial facial recognition company based in New York City that works with law enforcement, has made a surprising announcement: someone has stolen its entire client list. The company is telling its customers that its servers were not breached, but there are a lot of unanswered questions.

The Daily Beast reports that the intruder gained access to the company's "list of customers, to the number of user accounts those customers had set up, and to the number of searches its customers have conducted." Considering that the company works with state and federal law enforcement agencies, that list could reveal quite a bit about how those agencies are using the tool.

The company indicated that the database of billions of pictures of people it collected from the internet has not been breached. The New York Times revealed the existence of this database and Clearview AI's ties to law enforcement agencies last month in an article that said the company "might end privacy as we know it." Law enforcement agencies use Clearview AI to match photos of people against photos in that database.

"You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared," the Times wrote. "The system — whose backbone is a database of more than three billion images that Clearview claims to have scraped from Facebook, YouTube, Venmo and millions of other websites — goes far beyond anything ever constructed by the United States government or Silicon Valley giants."

Clearview’s lawyer, Tor Ekeland, tells Inverse that the company has fixed the flaw that allowed the intruder to gain access to its client list.

"Security is Clearview’s top priority. Unfortunately, data breaches are part of life in the 21st century," Ekeland says. "Our servers were never accessed. We patched the flaw, and continue to work to strengthen our security."

OK, data breaches do happen. But that's not a reassuring response from a company that holds an enormous amount of people's personal information and works with law enforcement.

If someone were to gain access to Clearview's database, a lot of harm could be done. Not only does it probably have your photo, it quite possibly has your name, address, social media profiles, and other personal information about you. Clearview doesn't appear to be especially worried about someone getting that information.

Sen. Ed Markey (D-Mass.) has been a fierce critic of Clearview since the Times report came out last month. Markey released a statement regarding the data breach on Wednesday.

“Clearview’s statement that security is its ‘top priority’ would be laughable if the company’s failure to safeguard its information wasn’t so disturbing and threatening to the public’s privacy,” Markey said. “This is a company whose entire business model relies on collecting incredibly sensitive and personal information, and this breach is yet another sign that the potential benefits of Clearview’s technology do not outweigh the grave privacy risks it poses.”

Another truly concerning use of facial recognition is that companies have developed A.I. systems that claim to infer how someone is feeling from their face. Because A.I. systems are often biased, especially against people of color and women, there is reason to worry that such systems could misread a person's emotions, and that judgments made on the basis of those misreadings could harm that person.

As we've previously reported, the facial recognition industry is raking in money even as cities around the country move to ban the technology. Experts estimate the industry will be worth around $12 billion by 2025. It appears our privacy has a price tag.
