A Blog by Jonathan Low


Nov 30, 2019

Civil Libertarians Are Raising Questions About Robot Police Dogs

The primary question: who oversees the programmers and the testing of such systems? JL

Brian Heater reports in TechCrunch:

A video demonstrated how a robot could be used to help get human officers out of harm’s way during a terrorist or hostage situation. Combining people’s distrust of scary robots with (arguably deserved) distrust of law enforcement, it’s pretty easy to go down a dystopian rabbit hole. "Deployment of these technologies happens faster than our social, political, or legal systems react. We need more transparency from government agencies, who should be upfront with the public about their plans to test and deploy new technologies. We also need regulations to protect civil liberties, civil rights, and racial justice in the age of artificial intelligence."
Back in April at our robotics event at UC Berkeley, Boston Dynamics head Marc Raibert showed off video of the company’s Spot robot in a number of different real world scenarios. Some, like construction and first responders, were familiar to anyone who has been following the company — and automation in general. 
Another scenario, which found the robot opening doors during a training exercise for the Massachusetts State Police, was something different entirely. It was a brief video that demonstrated how the robot could potentially be used to help get human officers out of harm’s way during a terrorist or hostage situation.
All these months later, the video has raised some questions among civil liberties groups, most notably the Massachusetts wing of the ACLU. A public records request filed by the organization in response to a Facebook post by the department describing the July event “seeks to learn more about how your agency uses or has contemplated using robotics.”
ACLU Massachusetts’ Technology for Liberty Program Director Kade Crockford expanded on the request in a statement provided to TechCrunch:
There is a lot we do not know about how and where these robotics systems are currently deployed in Massachusetts. All too often, the deployment of these technologies happens faster than our social, political, or legal systems react. We urgently need more transparency from government agencies, who should be upfront with the public about their plans to test and deploy new technologies. We also need statewide regulations to protect civil liberties, civil rights, and racial justice in the age of artificial intelligence. Massachusetts must do more to ensure safeguards keep pace with technological innovation, and the ACLU is happy to partner with officials at the local and state levels to find and implement solutions to ensure our law keeps pace with technology.
As with any new technology, it’s right to ask many of these questions. Of course, this particular video has the added bonus of combining people’s distrust of big, scary robots with their (arguably deserved) distrust of law enforcement. It’s pretty easy to watch a video like that and go immediately down a dystopian rabbit hole.
Boston Dynamics told TechCrunch that it’s not at liberty to discuss the specifics of how the Massachusetts State Police deployed the robot, but the company’s Vice President of Business Development Michael Perry explained that it has put in place guidelines for how the loaner units can be used.
“Right now we’re at a scale where we can pick and choose the partners we engage with and make sure that they have a similar deployment and a vision for how robots are used,” Perry said. “For example, not using robots in a way that would physically harm or intimidate people. But also have a realistic expectation for what a robot can and cannot do.”
Perry explained that Boston Dynamics’ vision has the robots taking on a first responder role, rather than one of law enforcement. The latter seems to be the source of much of the concern here. It’s not so much the idea of the robots being deployed in bomb-disposal or hazardous-material scenarios as their potential to take a role in policing.
Notably, the ACLU’s request covers “Documents, including emails, discussing, referencing, or pertaining to the weaponization of any robotics.”
Perry explains that the organization’s concerns are valid, but believes that Spot doesn’t represent a significant departure from existing technologies employed by first responders.
“It’s certainly the case that when a new technology is employed, multiple stakeholders need to come to the table,” he says. “I think the issues that the ACLU has raised specifically are applicable not just to our robots but to any new technology that is deployed. I’m not sure that what we bring to the table is significantly differentiated from anything that is already out there.”
