Clearview AI faces lawsuit over gathering people’s images without consent


Clearview AI is being sued for collecting people’s photos without their consent for its facial recognition database.


Clearview AI has spent months amassing a facial recognition database of more than 3 billion pictures of people, gathered from across the internet. Now it faces a lawsuit in Illinois for taking those photos without people's consent. 

The American Civil Liberties Union announced on Thursday that it is suing the controversial facial recognition company. Co-plaintiffs include the Illinois State Public Interest Research Group and the Chicago Alliance Against Sexual Exploitation. 

Clearview AI was forced out of the shadows by a New York Times profile in January, which detailed how the company planned to use facial recognition to identify people in real time. It can do that thanks to its massive database of people's photos, gathered without their consent from platforms like Instagram, YouTube and LinkedIn. 



The lawsuit is being brought in Illinois because it's the only US state with a biometric privacy law that lets people take companies to court over violations: the Biometric Information Privacy Act. The law requires companies to get "informed written consent" before collecting and using a person's biometrics, including facial recognition data. In January, Facebook agreed to pay $550 million to settle a lawsuit tied to the law. 

“Clearview’s practices are exactly the threat to privacy that the legislature intended to address, and demonstrate why states across the country should adopt legal protections like the ones in Illinois,” the ACLU said in a statement. 

Clearview AI didn’t respond to a request for comment. 

The ACLU said it is suing Clearview AI on behalf of organizations that represent survivors of sexual assault and domestic violence, as well as undocumented immigrants. It said the surveillance technology Clearview AI offers could let abusive partners and government agencies track and target vulnerable communities. 

Documents obtained by BuzzFeed News showed that police were using Clearview AI to identify sex workers. The documents also showed that Clearview AI had been offering its facial recognition tools to Immigration and Customs Enforcement, as well as to private companies like Walmart.


The ACLU is working with the law firm Edelson PC, which also had a hand in the Facebook facial recognition lawsuit settled in January. The lawsuit is seeking a court order in Illinois to force Clearview AI to delete photos of Illinois residents gathered without consent, and to stop gathering new photos until it complies with the state’s law. 

If the lawsuit succeeds, this protection would apply only to Illinois residents, since no other state offers the same biometric safeguards. Clearview AI has a "privacy request form" with a dedicated opt-out section for residents of Illinois and of California, which has its own state privacy law. 

But Illinois' law specifically requires companies to get consent up front rather than putting the burden on people to request exclusion. The opt-out process also requires people to upload a photo of themselves before their images can be deleted from the database.

Companies including Facebook, Microsoft and Google have sent cease-and-desist letters to Clearview AI demanding that it stop harvesting images from their platforms. The company's CEO, Hoan Ton-That, has argued that Clearview has a First Amendment right to gather those images. 

“If allowed, Clearview will destroy our rights to anonymity and privacy — and the safety and security that both bring,” the ACLU said. “People can change their names and addresses to shield their whereabouts and identities from individuals who seek to harm them, but they can’t change their faces.”
