A Blog by Jonathan Low

 

Apr 4, 2019

Why AI Experts From Google, Facebook, Microsoft Urge Amazon To Stop Selling Its Facial Recognition To Police

There is concern not just about misidentification, but also about the potential for government agencies or companies to abuse a system known to be flawed. JL

Cade Metz and Natasha Singer report in the New York Times:

25 prominent artificial-intelligence researchers, including experts at Google, Facebook, Microsoft and a recent winner of the prestigious Turing Award, have signed a letter calling on Amazon to stop selling its facial-recognition technology to law enforcement agencies because it is biased against women and people of color. It mistook women for men 19% of the time, the study showed, and misidentified darker-skinned women for men 31% of the time.
At least 25 prominent artificial-intelligence researchers, including experts at Google, Facebook, Microsoft and a recent winner of the prestigious Turing Award, have signed a letter calling on Amazon to stop selling its facial-recognition technology to law enforcement agencies because it is biased against women and people of color.
The letter, which was publicly released Wednesday, reflects growing concern in academia and the tech industry that bias in facial-recognition technology is a systemic problem. Some researchers — and even some companies — are arguing the technology cannot be properly controlled without government regulation.
Amazon sells a product called Rekognition through its cloud-computing division, Amazon Web Services. The company said last year that early customers included the Orlando Police Department in Florida and the Washington County Sheriff’s Office in Oregon.
In January, two researchers at the Massachusetts Institute of Technology published a peer-reviewed study showing that Amazon Rekognition had more trouble identifying the gender of female and darker-skinned faces in photos than similar services from IBM and Microsoft. It mistook women for men 19 percent of the time, the study showed, and misidentified darker-skinned women for men 31 percent of the time.

Before publishing their findings on Amazon Rekognition, the M.I.T. researchers released a similar study examining services from IBM, Microsoft and Megvii, an artificial-intelligence company in China. All three updated their services to address concerns raised by the researchers.
In separate blog posts from the Amazon executives Matthew Wood and Michael Punke, the company disputed the study and a Jan. 24 article in The New York Times describing it.
“The answer to anxieties over new technology is not to run ‘tests’ inconsistent with how the service is designed to be used, and to amplify the test’s false and misleading conclusions through the news media,” Dr. Wood wrote. Amazon did not directly engage with the M.I.T. researchers.
The letter released on Wednesday was signed by the Google researchers Margaret Mitchell, Andrea Frome and Timnit Gebru; the Facebook researcher Georgia Gkioxari; William Isaac, a researcher at DeepMind, the London lab owned by Google’s parent company, Alphabet; and Yoshua Bengio, one of the world’s most important A.I. researchers.
Last week, Dr. Bengio was one of three people to receive the Turing Award — often called “the Nobel Prize of computing” — for his work with neural networks, the technology that underpins modern facial recognition services.
“There are no laws or required standards to ensure that Rekognition is used in a manner that does not infringe on civil liberties,” the A.I. researchers wrote. “We call on Amazon to stop selling Rekognition to law enforcement.”
The researchers added that Dr. Wood and Mr. Punke had “misrepresented the technical details” of the M.I.T. study and modern facial-recognition technology. Amazon declined to comment on the letter on Wednesday.
Microsoft, by contrast, improved the accuracy of its facial recognition last year after an earlier M.I.T. study reported that its system was better at identifying the gender of lighter-skinned men in a photo database than darker-skinned women.
During a February talk at the Cornell Tech graduate school in New York, Brad Smith, Microsoft’s president and chief legal officer, said the company had “participated in the market for law enforcement in the United States,” but had also turned down sales when there was concern it could unreasonably infringe on people’s rights.
In February, Microsoft backed a bill in Washington State that would require notices to be posted in public places that use facial-recognition technology and would ensure that government agencies obtain a court order when looking for specific people. The bill is still pending. But the company did not back other legislation that would provide much stronger protections.
Mr. Punke wrote in his February blog post that Amazon also supported regulation of facial-recognition technology and called for law enforcement agencies to “be transparent in how they use facial-recognition technology.” But Amazon has declined to disclose how police or intelligence agencies are using its Rekognition system and whether the company puts any restrictions on its use.
Amazon has said that it has not received any reports of Rekognition misuse by law enforcement, and that the company’s acceptable use policy prohibits customers from using its services in ways that violate laws.
