A Blog by Jonathan Low

 

Mar 19, 2019

US Using Photos Of Visa Applicants And Dead People To Test Facial Recognition Tech

Hey, it's very efficient: you don't have to waste time trying to get them to smile. JL


Cat Zakrzewski reports in the Washington Post:

The U.S. government has been using photos from government agencies, collected during law enforcement and immigration processes, to test facial recognition software. Developers can download some of the photo data sets, such as those from people who were arrested but are now deceased. The revelations are raising new questions about whether the government is abusing its power over citizens who can't opt out of these photos. “The government is using any data it can get. If we’re not in there now, soon we will be.”
Researchers this week announced their discovery that the U.S. government has been using massive sets of photos from government agencies -- collected during typical law enforcement and immigration processes -- to test facial recognition software developed by companies and universities. Developers can even download some of the photo data sets -- such as those from people who were arrested but are now deceased. 
“The government is using any data it can get ahold of,” said Nikki Stevens, a software engineer and PhD student at Arizona State University. “If we’re not in there now, soon we will be.”
The revelations are raising new questions about whether the government is abusing its power over citizens who often can't opt out of having these photos taken.
The researchers, from the University of Washington, Arizona State University and Dartmouth, say that people's images should not be used without their consent -- especially to improve technology that could potentially be used against them in the future for purposes like surveillance. It's time for new laws that mandate how the government uses facial recognition technology, they say.
“If the government wants to use this technology, then the government should make sure that it is fair, equitable, working in a way that is consensual, has appeal mechanisms and that is not built on thousands of dead arrestees,” said Os Keyes, a PhD student at the University of Washington.
Facial recognition technology is booming -- and putting in place privacy safeguards for government testing of the technology could set the tone industry-wide, researchers say. 
After all, the government is one of the earliest and largest buyers of facial recognition software, which Keyes says powers everything from routine tasks at the Department of Motor Vehicles to entrances at high-security military bases.
“If you put conditions on what facial recognition systems can do and how they have to work for the government to buy them, you are effectively putting conditions on facial recognition systems,” said Keyes. “A company designing facial recognition systems that major governments won’t buy is a company that will be subject to an involuntary buyout pretty soon.”
The findings were publicized in a Slate op-ed published Sunday, amid a broader debate brewing about whether facial recognition researchers should obtain the consent of ordinary people before using their images to refine and test facial recognition technology.
Facial recognition systems can't be built or refined without massive sets of data — and the public is just learning about the ways researchers amass them. Last week, NBC News reported that IBM, in its efforts to create a large, diverse set of images for training facial recognition systems, scraped millions of photos from the photo-sharing website Flickr. NBC contacted some of the photographers whose images were in the set, who were surprised and disconcerted to learn how their photos were being used by the tech giant. The photographers also said none of the people pictured in their photos knew their likenesses would be used in that way.
“This is the dirty little secret of AI training sets. Researchers often just grab whatever images are available in the wild,” NYU School of Law professor Jason Schultz told NBC.
Stevens and Keyes are particularly worried about how racial minorities, who were overrepresented in the government data sets they found, could be affected by the use of their photos to train facial recognition technology. They say that because of racial inequities in the United States, these groups are more vulnerable to the ways facial recognition could be abused to surveil them or limit their civil liberties.
The researchers say they uncovered the use of the data sets while working on a paper about the National Institute of Standards and Technology. This government agency allows companies to submit their facial recognition software for tests that have become a standard way for the industry to benchmark accuracy. Though the researchers filed some Freedom of Information Act requests, they say much of the data they cite in their op-ed is publicly available online once you know where to look.
“You don't have to look hard to find what we found,” Keyes said. 
The researchers also say their findings raise serious questions about whether the agency can be trusted to help shape standards governing the federal government’s use of artificial intelligence. President Trump recently signed an executive order that says NIST is responsible for planning how the federal government will develop technical standards for artificial intelligence.
NIST, however, is defending its work. Jennifer Huergo, an agency spokeswoman, says all of the images the agency uses from other agencies comply with Human Subject Protection review and other applicable federal regulations that preserve people's privacy rights in research.
Huergo did contest some of the claims the researchers made in the article, pushing back in particular on the claim that some testing programs depend on images of children who have been exploited for pornography. Huergo said that the government did test algorithms against images of exploited children in an effort to help the Department of Homeland Security determine whether facial recognition could be used to combat child abuse. However, she said NIST never took possession of that data -- it remained housed within DHS, and NIST employees “never look at the images.”
The op-ed also said that images are drawn from documentation of people boarding aircraft, but Huergo said those images were taken from a simulation that DHS ran of people boarding an aircraft in a warehouse.
Huergo says the agency’s work is focused on making facial recognition better and fairer for everyone.
“We’re here to help these technologies to reduce errors and bias and provide technical underpinnings to make sure people can make decisions on their use,” she said. “You want these things to be working properly.”
