A Blog by Jonathan Low


Sep 16, 2018

IBM Used NY Police Surveillance Videos To Train AI - And A Lot of People Are Unhappy

Like Google, Facebook and Amazon, IBM is now facing protests over its use of AI technology to profile potential 'troublemakers.' The added issue in this case is that IBM and the NYPD collaborated on the development of the technology, which studies suggest reflects unacknowledged biases against people with darker skin tones.

The larger question is whether any use of such profiling will ever be acceptable in a society increasingly suspicious of both its quality and its intent. JL


Sean Captain reports in Fast Company:

IBM used police footage of New Yorkers to develop its software. The NYPD had access to the beta software’s skin tone feature. (A) petition asks IBM to refrain from developing and offering it for clients. Such AI has been used by casinos, merchants, and police to identify individuals captured on high-quality surveillance footage in seconds, resulting in cloud-connected databases full of such profiles. Face databases used by local law enforcement now contain over half of the U.S.’s adults.

It’s not just Amazon facing backlash over video-analysis technology. Today, progressive organization Care2 launched an online petition calling on IBM to cease providing computer vision technology that, according to a September 6 report in The Intercept, registers the skin tone of people in video footage.


Per The Intercept, this technology had been provided to the New York Police Department, which has a troubled history of racial profiling through its “stop-and-frisk” practices. In 2013, a federal judge ruled that the NYPD’s practice had violated the Fourth Amendment (unreasonable search and seizure) and Fourteenth Amendment (equal protection under the law) rights of blacks and Latinos by disproportionately targeting them with arbitrary searches for contraband on the streets.

The NYPD has acknowledged that the technology exists and that IBM used police footage of New Yorkers to develop its full suite of software. As early as 2012, the NYPD had access to the beta software’s skin tone feature, but it claims the tech was never used. While the NYPD told The Intercept that it no longer uses IBM’s video surveillance technology at all, the Care2 petition asks IBM to refrain from developing and offering it for any other clients. Such AI has been used by casinos, merchants, and police to identify individuals captured on high-quality surveillance footage in seconds, resulting in cloud-connected databases full of such profiles. Face databases used by local law enforcement now contain over half of the U.S.’s adults, reported Fast Company in July.

We’ve reached out to other civil rights organizations to inquire if they are planning their own actions. We will update this article with anything we learn.

UPDATE: The New York Civil Liberties Union informs us that it is pursuing the issue through continued support of a (languishing) proposed law for public oversight of NYPD use of surveillance technology. A statement to Fast Company from NYCLU lead policy counsel Michael Sisitzky reads, in part, “A simple step for transparency would be to pass city council legislation, known as the POST Act, to ensure that the NYPD police department discloses what surveillance technology is being used on our streets.”

UPDATE 2: IBM has responded, saying that it does not participate in racial profiling. An email statement reads, in part, that, “we have numerous programs underway to better identify and address bias in technology, including making publicly available to other companies a set of annotations for more than a million publicly available images to help solve one of the biggest issues in facial analysis—the lack of diverse data to train AI systems.”
