A Blog by Jonathan Low

 

Jan 31, 2022

Startup Is Developing Facial Recognition Using AI Analysis of Subject's DNA

Putting aside, for the moment, the fact that the technology to accomplish this feat does not yet exist, the concern is that perceived demand for facial recognition - and the profits it will generate - supersedes all legal, ethical and moral issues that attend such an outcome. 

These concerns are magnified by the impending Olympic Games in China, where authorities are reportedly using mandatory health-based Covid-monitoring software to deny services and permissions to critics or people they suspect of disloyalty. JL

Tate Ryan-Mosley reports in MIT Technology Review:

A facial recognition subsidiary of an AI company purports to use DNA to create a model of a face that can then be run through a facial recognition system. The tool “constructs a physical profile by analyzing genetic material collected in a DNA sample.” The science needed to support such a system doesn’t yet exist. The product would exacerbate the ethical, privacy, and bias problems facial recognition technology already causes. It’s a signal of the industry’s ambitions for the future, where face detection becomes one facet of a broader effort to identify people by any available means—even inaccurate ones.

A police officer is at the scene of a murder. No witnesses. No camera footage. No obvious suspects or motives. Just a bit of hair on the sleeve of the victim’s jacket. DNA from the cells of one strand is copied and compared against a database. No match comes back, and the case goes cold. 

Corsight AI, a facial recognition subsidiary of the Israeli AI company Cortica, purports to be devising a solution for that sort of situation by using DNA to create a model of a face that can then be run through a facial recognition system. It is a task that experts in the field regard as scientifically untenable. 

Corsight unveiled its “DNA to Face” product in a presentation by chief executive officer Robert Watts and executive vice president Ofer Ronen intended to court financiers at the Imperial Capital Investors Conference in New York City on December 15. It was part of the company’s overall product road map, which also included movement and voice recognition. The tool “constructs a physical profile by analyzing genetic material collected in a DNA sample,” according to a company slide deck viewed by surveillance research group IPVM and shared with MIT Technology Review. 

[Image: Corsight's investor presentation slide showing a product road map that features "voice to face," "DNA to face," and "movement" as expansions of its face recognition capabilities.]

Corsight declined a request to answer questions about the presentation and its product road map. “We are not engaging with the press at the moment as the details of what we are doing are company confidential,” Watts wrote in an email. 

But marketing materials show that the company is focused on government and law enforcement applications for its technology. Its advisory board consists only of James Woolsey, a former director of the CIA, and Oliver Revell, a former assistant director of the FBI.

 

The science that would be needed to support such a system doesn’t yet exist, however, and experts say the product would exacerbate the ethical, privacy, and bias problems facial recognition technology already causes. More worryingly, it’s a signal of the industry’s ambitions for the future, where face detection becomes one facet of a broader effort to identify people by any available means—even inaccurate ones.

This story was jointly reported with Donald Maye of IPVM, who said that "prior to this presentation, IPVM was unaware of a company attempting to commercialize a face recognition product associated with a DNA sample."

A checkered past

Corsight’s idea is not entirely new. Human Longevity, a “genomics-based, health intelligence” company founded by Silicon Valley celebrities Craig Venter and Peter Diamandis, claimed in 2017 to have used DNA to predict faces. MIT Technology Review reported at the time that experts were doubtful: a former employee of Human Longevity said the company couldn’t pick a person out of a crowd using a genome, and Yaniv Erlich, chief science officer of the genealogy platform MyHeritage, published a response laying out major flaws in the research. 

A small DNA informatics company, Parabon NanoLabs, provides law enforcement agencies with physical depictions of people derived from DNA samples through a product line called Snapshot, which includes genetic genealogy as well as 3D renderings of a face. (Parabon publishes some cases on its website with comparisons between photos of people the authorities are interested in finding and renderings the company has produced.) 

Parabon’s computer-generated composites also come with a set of phenotypic characteristics, like eye and skin color, that are given a confidence score. For example, a composite might say that there’s an 80% chance the person being sought has blue eyes. Forensic artists also amend the composites to create finalized face models that incorporate descriptions of nongenetic factors, like weight and age, whenever possible. 

Parabon’s website claims its software is helping solve an average of one case per week, and Ellen McRae Greytak, the company’s director of bioinformatics, says it has solved over 600 cases in the past seven years, though most are solved with genetic genealogy rather than composite analysis. Greytak says the company has come under criticism for not publishing its proprietary methods and data; she attributes that to a “business decision.” 

Parabon does not package face recognition AI with its phenotyping service, and it stipulates that its law enforcement clients should not use the images it generates from DNA samples as an input into face recognition systems. 

Parabon’s technology “doesn’t tell you the exact number of millimeters between the eyes or the ratio between the eyes, nose, and mouth,” Greytak says. Without that sort of precision, facial recognition algorithms cannot deliver accurate results—but deriving such precise measurements from DNA would require fundamentally new scientific discoveries, she says, and “the papers that have tried to do prediction at that level have not had a lot of success.” Greytak says Parabon only predicts the general shape of someone’s face (though the scientific feasibility of such prediction has also been questioned). 

Police have been known to run forensic sketches based on witness descriptions through facial recognition systems. A 2019 study from Georgetown Law’s Center on Privacy and Technology found that at least half a dozen police agencies in the US “permit, if not encourage” using forensic sketches, either hand drawn or computer generated, as input photos for face recognition systems. AI experts have warned that such a process likely leads to lower levels of accuracy.

Corsight has also been criticized in the past for exaggerating the capabilities and accuracy of its face recognition system, which it calls the “most ethical facial recognition system for highly challenging conditions,” according to a slide deck presentation available online. In a technology demo for IPVM last November, Corsight CEO Watts said that Corsight’s face recognition system can “identify someone with a face mask—not just with a face mask, but with a ski mask.” IPVM reported that running Corsight’s AI on a masked face produced a 65% confidence score (Corsight’s own measure of how likely it is that the captured face will be matched in its database), and noted that the mask in question is more accurately described as a balaclava or neck gaiter than as a ski mask with only mouth and eye cutouts. 

Broader issues with face recognition technology’s accuracy have been well-documented (including by MIT Technology Review). They are more pronounced when photographs are poorly lit or taken at extreme angles, and when the subjects have darker skin, are women, or are very old or very young. Privacy advocates and the public have also criticized facial recognition technology, particularly systems like Clearview AI that scrape social media as part of their matching engine. 

Law enforcement use of the technology is particularly fraught—Boston, Minneapolis, and San Francisco are among the many cities that have banned it. Amazon and Microsoft have stopped selling facial recognition products to police groups, and IBM has taken its face recognition software off the market. 

“Pseudoscience”

“The idea that you’re going to be able to create something with the level of granularity and fidelity that’s necessary to run a face match search—to me, that’s preposterous,” says Albert Fox Cahn, a civil rights lawyer and executive director of the Surveillance Technology Oversight Project, who works extensively on issues related to face recognition systems. “That is pseudoscience.”

Dzemila Sero, a researcher in the Computational Imaging Group of Centrum Wiskunde & Informatica, the national research institute for mathematics and computer science in the Netherlands, says the science to support such a system is not yet sufficiently developed, at least not publicly. Sero says the catalog of genes required to produce accurate depictions of faces from DNA samples is currently incomplete, citing Human Longevity’s 2017 study.

In addition, factors like the environment and aging have substantial effects on faces that can’t be captured through DNA phenotyping, and research has shown that individual genes don’t affect the appearance of someone’s face as much as gender and ancestry do. “Premature attempts to implement this technique would likely undermine trust and support for genomic research and garner no societal benefit,” she told MIT Technology Review in an email.

Sero has studied the reverse of Corsight’s concept—“face to DNA” rather than “DNA to face”—by matching a set of 3D photographs with a DNA sample. In a paper in Nature, Sero and her team reported accuracy rates between 80% and 83%. Sero says her work should not be used by prosecutors as incriminating evidence, however, and that “these methods also raise undeniable risks of further racial disparities in criminal justice that warrant caution against premature application of the techniques until proper safeguards are in place.”

Law enforcement depends on DNA databases, predominantly the free ancestry website GEDmatch, which was instrumental in the search for the notorious “Golden State Killer.” But even DNA sampling, once considered by the US National Research Council to be the only form of scientifically rigorous forensic evidence, has recently come under criticism for problems with accuracy.

Fox Cahn, who is currently suing the New York Police Department to obtain records related to bias in its use of facial recognition technology, says the impact of Corsight’s hypothetical system would be disastrous. “Gaming out the impact this is going to have, it augments every failure case for facial recognition,” says Fox Cahn. “It’s easy to imagine how this could be used in truly frightening and Orwellian ways.”

The future of face recognition tech

Despite such concerns, the market for face recognition technology is growing, and companies are jockeying for customers. Corsight is just one of many offering photo-matching services with flashy new features, regardless of whether they’ve been shown to work. 

Many of these new products look to integrate face recognition with another form of recognition. The Russia-based facial recognition company NtechLab, for example, offers systems that identify people based on their license plates as well as facial features, and founder Artem Kuharenko told MIT Technology Review last year that its algorithms try to “extract as much information from the video stream as possible.” In these systems, facial recognition becomes just one part of an apparatus that can identify people by a range of techniques, fusing personal information across connected databases into a sort of data panopticon.

Corsight’s DNA to face system appears to be the company’s foray into building a futuristic, comprehensive surveillance package it can offer to potential buyers. But even as the market for such technologies expands, Corsight and others are at increased risk of commercializing surveillance technologies plagued by bias and inaccuracy.
