A Blog by Jonathan Low


Mar 29, 2022

People Can Now Sign Away Rights To Their Biometric Data

Signing away does not mean selling. 

This is being framed by Getty Images as an efficient way to get people in audiences or at photo shoots to sign a simple release form. But legal experts warn it is actually a very ambitious grab for data rights that individuals might want to preserve and may not realize they are giving away. JL

Janus Rose reports in Motherboard, image Microsoft:

Getty Images, one of the largest stock image sites on the internet, is offering a way to sign away rights to biometrics like facial data, with a release form that allows the information to be used by third parties. Legal experts worry Getty’s enhanced model agreement is overly broad, and could lead to photographers, filmmakers, and agencies using biometric data for unintended purposes. “Between that broad assignment language, and the equally broad waiver on biometric rights, this is a huge rights grab snuck into what should be a simple release.”

Biometric data is everywhere thanks to facial recognition technology and an endless supply of selfies on social media. Now, one of the largest stock image sites on the internet is offering a way to sign away rights to biometrics like facial data, with a release form that allows the information to be used by third parties.

Last week, Getty Images announced an “Enhanced Model Release” form which allows biometric information sharing as part of its media licensing platform, which is used heavily by media publications (including VICE). Release forms are a standard practice for media photographers, showing that a subject has consented to their image being used and sold. In addition to granting normal rights for photos, Getty’s new agreement prompts subjects captured in images to consent to the licensing or use of their biometric data “for any purpose (except pornographic or defamatory purposes) including marketing or promotion of any product or service.”


“Biometric data is especially valuable because it can be used to recognise and map facial features extracted from visual content,” the company wrote in a press release. “Recently, there have been a spate of lawsuits around the use of biometric information without the explicit consent of people featured in visual imagery. While the law in this area is still evolving, developers should always start with collecting data from legitimate sources and obtaining authorization for its intended use.”

But some legal experts worry Getty’s enhanced model agreement is overly broad, and could lead to photographers, filmmakers, and agencies using biometric data for all kinds of unintended purposes.

“It's way beyond what someone would need for including someone in a photoshoot or anything like that,” Frederic Jennings, a Brooklyn-based attorney who specializes in privacy and digital rights, told Motherboard. “Between that broad assignment language, and the equally broad waiver on biometric rights and prohibitions, this is a pretty huge rights grab snuck into what should be a simple release.”

Facial recognition and other biometric data are frequently used to train machine learning algorithms, often without the knowledge or consent of their subjects. Ten US states currently have laws protecting against the sale and use of biometric data, but that hasn’t stopped many companies from amassing giant troves of facial recognition templates. The notorious facial recognition company Clearview AI claims to have over 3 billion face images, and was targeted by cease-and-desist actions after it was found scraping people’s faces from Twitter, YouTube, and other social platforms.

Getty is hailing the new release form as an “industry first” that it hopes will become a standard practice for licensing images. But on a major platform, the form could also cause models to unnecessarily grant legal rights to their biometric data—including in states where the law would normally protect it by default.

On the other hand, Jennings argues that it could give people the chance to opt out in states without strong biometric privacy protections.

“In places that do have biometric protections written into law (or places where courts have read those into other laws), people would retain those rights by default, and would be just as well protected by that language being absent & undefined here,” said Jennings. “But I think it being called out is useful where those rights don't exist—either by not signing, or signing a version with those sections crossed out, it would be a clear way to show that permission isn't being granted.”
