A Blog by Jonathan Low

 

May 10, 2022

Clearview AI Settles Facial Recognition Lawsuit, Agrees To Limit Data Access

Clearview can still sell its facial recognition data to government agencies and law enforcement but has agreed not to sell it to private companies. 

The significance is that Clearview effectively lost, further establishing privacy rights. But as a practical matter, the data it has already provided will likely be sufficient for companies to create their own deepfakes. JL 

Ryan Mac and Kashmir Hill report in the New York Times:

Clearview AI, the facial recognition software maker, settled a lawsuit and agreed to limit its face database in the US primarily to government agencies and not allow most American companies to have access to it. (It) built its facial recognition software by scraping photos from the web and popular sites, such as Facebook, LinkedIn and Instagram. Clearview then sold its software to police and government agencies, including the F.B.I. and Immigration and Customs Enforcement. Its technology has been deemed illegal in Canada, Australia and parts of Europe for violating privacy laws.

Clearview AI, the facial recognition software maker, on Monday settled a lawsuit brought by the American Civil Liberties Union and agreed to limit its face database in the United States primarily to government agencies and not allow most American companies to have access to it.

Under the settlement, which was filed with an Illinois state court, Clearview will not sell its database of what it said were more than 20 billion facial photos to most private individuals and businesses in the country. But the company can largely still sell that database to federal and state agencies.

The agreement is the latest blow to the New York-based start-up, which built its facial recognition software by scraping photos from the web and popular sites, such as Facebook, LinkedIn and Instagram. Clearview then sold its software to local police departments and government agencies, including the F.B.I. and Immigration and Customs Enforcement.

But its technology has been deemed illegal in Canada, Australia and parts of Europe for violating privacy laws. Clearview also faces a provisional $22.6 million fine in Britain, as well as a 20 million-euro fine from Italy’s data protection agency.

“Clearview can no longer treat people’s unique biometric identifiers as an unrestricted source of profits,” Nathan Freed Wessler, a deputy director with the A.C.L.U.’s Speech, Privacy and Technology Project, said in a statement about the settlement. “Other companies would be wise to take note, and other states should follow Illinois’s lead in enacting strong biometric privacy laws.”

Floyd Abrams, a First Amendment expert hired by Clearview to defend the company’s right to gather publicly available information and make it searchable, said the company was “pleased to put this litigation behind it.”

“To avoid a protracted, costly and distracting legal dispute with the A.C.L.U. and others, Clearview AI has agreed to continue to not provide its services to law enforcement agencies in Illinois for a period of time,” he said.

The A.C.L.U. filed its lawsuit in May 2020 on behalf of groups representing victims of domestic violence, undocumented immigrants and sex workers. The group accused Clearview of violating Illinois’s Biometric Information Privacy Act, a state law that prohibits private entities from using citizens’ bodily identifiers, including algorithmic maps of their faces, without consent.

“This is a huge win for the most vulnerable people in Illinois,” said Linda Xóchitl Tortolero, a plaintiff in the case and the head of Mujeres Latinas en Acción, an advocacy group for survivors of sexual assault and domestic violence. “For a lot of Latinas, many who are undocumented and have low levels of IT or social media literacy, not understanding how technology can be used against you is a huge challenge.”

One of Clearview’s sales methods was to offer free trials to potential customers, including private businesses, government employees and police officers. Under the settlement, the company will have a more formal process around trial accounts, ensuring that individual police officers have permission from their employers to use the facial recognition app.

Clearview is also prohibited from selling to any Illinois-based entity, private or public, for five years as part of the agreement. After that, it can resume doing business with local or state law enforcement agencies in the state, Mr. Wessler said.

In a key exception, Clearview will still be able to provide its database to U.S. banks and financial institutions under a carve-out in the Illinois law. Hoan Ton-That, chief executive of Clearview AI, said the company did “not have plans” to provide the database “to entities besides government agencies at this time.”

The settlement does not mean that Clearview cannot sell any product to corporations. It will still be able to sell its facial recognition algorithm, without the database of 20 billion images, to companies. Its algorithm helps match people’s faces to any database that a customer provides.
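To give a rough sense of what selling the algorithm without the database means, the core of a generic face-matching product is usually a nearest-neighbor search over face-embedding vectors supplied by the customer. The sketch below is a minimal, hypothetical illustration using random placeholder vectors and plain cosine similarity; it is not Clearview's code, and every name in it (best_match, the 128-dimensional embeddings, the 0.6 threshold) is an assumption for demonstration only:

    import numpy as np

    def best_match(query_embedding, database, threshold=0.6):
        """Return the identity in `database` whose embedding is most similar
        to `query_embedding`, or None if nothing clears the threshold.
        Embeddings here are placeholder vectors; a real system would produce
        them with a face-recognition model."""
        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        best_id, best_score = None, -1.0
        for identity, embedding in database.items():
            score = cosine(query_embedding, embedding)
            if score > best_score:
                best_id, best_score = identity, score
        return best_id if best_score >= threshold else None

    # Toy customer-provided "database": two made-up identities.
    rng = np.random.default_rng(0)
    db = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
    query = db["person_a"] + 0.05 * rng.normal(size=128)  # a noisy look-alike
    print(best_match(query, db))  # -> "person_a"

The settlement's distinction, in other words, is between that matching capability, which Clearview can still sell to companies, and the 20-billion-image reference database, which it largely cannot.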

“There are a number of other consent-based uses for Clearview’s technology that the company has the ability to market more broadly,” Mr. Ton-That said.

As part of the settlement, Clearview did not admit any liability and agreed to pay $250,000 in attorneys’ fees to the plaintiffs. The settlement is subject to approval by an Illinois state judge.
