A Blog by Jonathan Low

 

Jan 26, 2019

What Are the Rules For Using Facial Recognition To Convict In Court?

At the moment, in most cases, prosecutors do not even have to reveal that facial recognition technology is being used, let alone permit questions about the quality of the algorithms behind it, the fact that computers, not humans, are making determinations about alleged matches between suspects and photos, or the technology's record of accuracy.

As the public becomes more aware of the technology's use, its frailties, and its mistakes, this is almost certain to change. JL


Aaron Mak reports in Slate:

Photos of other FACES matches aren’t the only potentially exculpatory evidence. Algorithm quality, confidence thresholds, and the format for returning matches can all affect the accuracy of the technology. Given those known issues, police should be required to disclose the very use of facial recognition software. Yet Willie Allen Lynch, who was convicted in 2016 for selling crack, had no right to view photos of other suspects identified by the facial recognition search that led to his arrest.
A Florida state appellate court ruled last week that Willie Allen Lynch, who was convicted in 2016 for selling crack cocaine, had no right to view photos of other suspects identified by the facial recognition search that led to his arrest.
In 2015, undercover agents working with the Jacksonville Sheriff’s Office photographed a man selling $50 of cocaine. Detectives were unable to identify him, so they decided to turn to the Face Analysis Comparison Examination System, known as FACES, which draws from a database consisting of more than 33 million driver’s license and law enforcement photos. The software, which is designed to return multiple potential matches for a given image, named Lynch and four other suspects. Upon further investigation, detectives arrested Lynch for the crime. He was eventually sentenced to eight years in prison.
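For readers curious how a system like this can "return multiple potential matches for a given image," the sketch below is a minimal, hypothetical top-k search over precomputed face embeddings. It illustrates only the general technique, not the actual FACES software; the function names, the embedding size, and the toy data are all assumptions made for the example.

```python
# Hypothetical sketch of a top-k face search -- NOT the actual FACES implementation.
# Assumes each photo has already been reduced to a fixed-length embedding vector
# by some face-recognition model (an assumption made for illustration).
import numpy as np

def top_k_matches(probe: np.ndarray, gallery: np.ndarray, ids: list, k: int = 5):
    """Return the k gallery identities most similar to the probe embedding."""
    # Normalize so the dot product equals cosine similarity.
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    scores = gallery @ probe                      # one similarity score per gallery photo
    order = np.argsort(scores)[::-1][:k]          # indices of the k highest-scoring photos
    return [(ids[i], float(scores[i])) for i in order]

# Toy example: a five-candidate search over random stand-in "embeddings".
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))            # stand-in for millions of license photos
ids = [f"person_{i}" for i in range(1000)]
probe = rng.normal(size=128)                      # stand-in for the surveillance photo
print(top_k_matches(probe, gallery, ids, k=5))
```

Note that a search structured this way always returns its k nearest candidates, whether or not any of them is actually the person in the probe photo.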
Florida law enforcement began implementing its current facial recognition system in 2001, before most other states. Authorities in Florida now conduct roughly 8,000 searches on FACES per month, almost twice that of the FBI face recognition unit’s search average. Georgetown’s Center on Privacy & Technology revealed in a 2016 report that the software had not been audited for error or misuse, and the Jacksonville Sheriff’s Office has also stated that it has no formal policies around FACES.
“Florida has the most advanced face recognition system of any state in the country. It’s the longest-running and most robust,” says Jennifer Lynch (no relation to the defendant), who serves as the surveillance-litigation director for the Electronic Frontier Foundation. Florida law enforcement in fact advised the FBI when the bureau was developing its own facial recognition system. Lynch added, “It’s not surprising that Florida would be the first state where a case like this would get up to the appellate level. It’s very likely that we will see more cases like this going forward.” Despite the wide proliferation of the technology in the state, local public defenders have told Georgetown researchers that police had never disclosed information about specific uses of the system during criminal cases.
Indeed, authorities did not write in the police report for Willie Allen Lynch’s arrest that they had consulted FACES. Instead, he only learned of the technology months into the case, after he personally sought to depose the detectives and the crime analyst involved. During a pretrial deposition, the crime analyst who ran the dealer’s photograph through FACES also testified that the software rates the quality of a match using a star system. She noted that Lynch’s photo had received one star while the other potential matches had none. She did not know the maximum number of stars possible.
During his trial, Lynch claimed that he had been misidentified. But the court denied his request to obtain the photos of other people that FACES produced as possible matches because detectives on the case hadn’t seen them either. One of the central arguments in Lynch’s appeal was that the state had violated the legal precedent set by the Supreme Court case Brady v. Maryland, which dictates that prosecutors must hand over potentially exculpatory evidence to the defense. “If any of the photographs of the other potential matches from the facial recognition program resembles the drug seller or Appellant then clearly there was a Brady/discovery violation and Appellant should be granted a new trial,” Lynch’s public defender wrote in the motion for rehearing. (The Jacksonville Sheriff’s Office has publicly stated that detectives only use FACES in conjunction with other investigatory tools. Detectives in this case also relied on an eyewitness account—which the defense has disputed—along with Lynch’s criminal record to pin him as the culprit.)
Jake Laperruque, who serves as senior counsel at the Constitution Project and does work with facial recognition and privacy, points out that the photos of other FACES matches aren’t the only piece of potentially exculpatory evidence in this scenario. Factors such as algorithm quality, confidence thresholds, and the format for returning matches can all affect the accuracy of the technology. Given those known issues, many advocates argue that police should be required to disclose the very use of facial recognition software under Brady.
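The brief sketch below illustrates, in the same hypothetical terms as the earlier example, why a confidence threshold matters: identical candidate scores can produce several "matches" or just one depending on where the cutoff is set. None of this reflects the real FACES thresholds or its star-rating scale, which are not public; the names and scores are invented.

```python
# Hypothetical illustration (not the actual FACES software or its star scale):
# the same candidate scores yield different "match" lists depending on the
# confidence threshold chosen by whoever configures the system.
def filter_matches(scored_candidates, threshold):
    """Keep only candidates whose similarity score clears the threshold."""
    return [(name, score) for name, score in scored_candidates if score >= threshold]

# Invented scores for five hypothetical candidates returned by a search.
candidates = [("candidate_A", 0.61), ("candidate_B", 0.42),
              ("candidate_C", 0.40), ("candidate_D", 0.33), ("candidate_E", 0.29)]

print(filter_matches(candidates, threshold=0.30))   # four candidates look like "matches"
print(filter_matches(candidates, threshold=0.60))   # only the top candidate remains
```

A defendant who never learns the threshold, the score scale, or the other candidates has no way to probe how close the alternative matches came.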
“Without knowing that facial recognition was used and the details, it’s impossible for defendants to know if its use in advancing an investigation was proper,” says Laperruque. “It’s the equivalent of police basing their investigation on an eyewitness account, but then not letting the defendant know the witness was used, or if what they saw was from 5 or 500 feet away.”
The fact that Lynch is black also raises questions about the accuracy of FACES in this case, as facial recognition technology notoriously struggles to identify people of color. Researchers at MIT published a study last year testing three of the most advanced facial recognition systems available. They found that error rates were about 1 percent for light-skinned males, 12 percent for darker-skinned males, and 35 percent for darker-skinned females.
This month, however, the 1st District Court of Appeals upheld Lynch’s conviction on the basis that he could not prove that the other photos in the database resembled his own—even though neither Lynch nor his appeals attorney, Victor Holder, has been able to access the photos. Without them, they couldn’t argue that the outcome of the trial might have been different. The court further noted that the jury had a chance to compare photos of Lynch with those of the dealer.
Holder told the Florida Times-Union that he is looking into potential avenues for further appeal. “Florida law enforcement agencies currently use facial recognition technology with little to no public awareness, no uniform standards governing its use, and no public oversight by the Florida Legislature,” he said.
Greater awareness and transparency around facial recognition may lead to more interrogation of the technology in courtrooms. “In general, if you’re using facial recognition as the basis for investigative activity, it’s going to raise questions in a trial if that was not a reliable system,” says Laperruque. “When police use fingerprints in an investigation, even if it’s not their smoking-gun piece of evidence, if it’s what led them to declare a suspect or search someone’s house, then you would want defense counsel to question the fingerprint expert about their methods and expertise.”
For Sarah St. Vincent, a researcher and advocate for Human Rights Watch, Lynch’s case also raises questions about the seemingly indiscriminate use of facial recognition technologies by law enforcement. “There are human rights concerns about the government’s decision to use a powerful form of surveillance software, facial recognition, in a case reportedly involving $50 worth of drugs,” she says. “If the government’s even going to think about using surveillance that can intrude on fundamental rights as powerfully as facial recognition can, it should only be considering that in cases that are similarly serious.” (There have been more urgent occasions to use the technology: Police consulted a facial recognition system last year to identify a suspect in a mass shooting at the Maryland Capital Gazette’s newsroom.)
If Lynch had not taken it upon himself to seek depositions and file handwritten motions in this case, he might never have learned of the role that FACES played in his arrest. Law enforcement’s failure to disclose such uses may be hindering our legal system from considering these questions as well. “We need to know at least when the technology is being used so that courts, defense attorneys, and prosecutors can collectively have those discussions,” says St. Vincent. “I’m not confident right now that the technology is usually being revealed.”