A Blog by Jonathan Low

 

Apr 20, 2021

NYPD Uses Facial Recognition To Catch Two Officers Drinking On the Job

Given police departments' growing embrace of facial recognition, however questionable the technology or its application, it's nice to know they're willing to abuse the rights of even their own. JL

Tim Cushing reports in Techdirt:

The NYPD expressly forbade the use of Clearview and other "outside databases" in March 2020. Prior to that ban, it appears NYPD investigators were using the software. How often they used it to run searches on suspected criminals is unknown. But emails obtained by the Legal Aid Society show investigators used it at least once to identify a couple of unknown subjects… who also happened to be NYPD officers. The NYPD thought it was too unreliable to use even before Internal Affairs decided it might be able to identify the officers caught drinking on duty.

The NYPD has an uneasy relationship with Clearview. The facial recognition startup -- one that has compiled a database of billions of images by scraping photos from social media platforms and other websites -- claimed in an emailed pitch that the nation's largest police force had used its software to identify a suspected terrorist.

That's not what actually happened, said the NYPD. It didn't use Clearview (even though it had experimented with it). Instead, the NYPD used its own facial recognition tech to identify the suspect by searching against a pool of images derived from its mugshot database.

But Clearview persists. The NYPD expressly forbade the use of Clearview and other "outside databases" in March 2020. Prior to that ban, it appears NYPD investigators were still using the software. How often they used it to run searches on suspected criminals is unknown. But emails obtained by the Legal Aid Society show investigators used it at least once to identify a couple of unknown subjects… who also happened to be NYPD officers.

One day after the hero cop’s funeral, Deputy Commissioner of Internal Affairs Joseph Reznick ordered Deputy Inspector Michael King of the Joint Terrorism Task Force to use the “Clearview AI” app to identify two cops in a photograph taken on the train, according to an email made public by the Legal Aid Society.

“As per commissioner Reznick can you please identify the Members of service in the photo,” then-IAB Detective Alfredo Torres wrote to King.

An image attached to the email shows two men seated in a train car across the aisle from the camera. The photo appears to have been shot surreptitiously with a cellphone.

So what were these officers up to? It appears they were drinking on the job, heading to the officer's funeral and tipping a few back on the train, when they were spotted by someone (likely a fellow officer) who passed the photo on to Internal Affairs. While that is definitely misconduct, it seems like the sort of thing that might not require the use of particularly sketchy facial recognition tech.

But if you're going to use particularly sketchy facial recognition tech, you may as well use it on your own. No sense pointing it outward at the public while assisting Clearview with its very public beta test. Using it for internal investigations at least seems far less likely to cause serious harm, like wrongful imprisonment. Instead, it will merely assist in the search for wrists in need of gentle slapping for temporarily embarrassing New York's Finest.

In reality, the best thing to do is to not use Clearview at all. The NYPD thought it was too unreliable and too questionable to use even before Internal Affairs decided it might be able to identify the officers caught drinking on duty. There's a level of due process expected in misconduct investigations, and using unproven tech to ID people -- whether they're citizens or cops -- is eventually going to end up hurting the wrong people.
