A Blog by Jonathan Low

 

Jul 6, 2019

London Police's Facial Recognition System Wrong On 80 Percent Of Identifications

How can a system be considered to add to citizens' safety if citizens are being incorrectly targeted? JL

Rowland Manthorpe and Alexander Martin report in Sky News:

Researchers found that the system is 81% inaccurate - meaning that, in the vast majority of cases, it flagged up faces to police when they were not on a wanted list. Citing a range of technical, operational and legal issues, the report concludes that it is "highly possible" the Met's usage of the system would be found unlawful if challenged in court. The data used to create watch lists was not current, so people were stopped even though their case had already been addressed. In other cases, there were no clear reasons why people were put on watch lists, leaving "significant ambiguity" about the intended purpose of facial recognition.
Four out of five people identified by the Metropolitan Police's facial recognition technology as possible suspects are innocent, according to an independent report.
Researchers found that the controversial system is 81% inaccurate - meaning that, in the vast majority of cases, it flagged up faces to police when they were not on a wanted list.
The force maintains its technology only makes a mistake in one in 1,000 cases - but it uses a different measurement to arrive at this conclusion.
The report, exclusively revealed by Sky News and The Guardian, raises "significant concerns" about Scotland Yard's use of the technology, and calls for the facial recognition programme to be halted.
Citing a range of technical, operational and legal issues, the report concludes that it is "highly possible" the Met's usage of the system would be found unlawful if challenged in court.


Image: An independent report calls for Scotland Yard to halt its use of facial recognition technology
The Met has been monitoring crowds with live facial recognition (LFR) since August 2016, when it used the technology at Notting Hill Carnival.
Since then, it has conducted 10 trials at locations including Leicester Square, Westfield Stratford, and Whitehall during the 2017 Remembrance Sunday commemorations.
The first independent evaluation of the scheme was commissioned by Scotland Yard and conducted by academics from the University of Essex.

Professor Pete Fussey and Dr Daragh Murray evaluated the technology's accuracy at six of the 10 police trials. They found that, of 42 matches, only eight were verified as correct - an error rate of 81%. Four of the 42 were people who were never found because they were absorbed into the crowd, so a match could not be verified.
The Met prefers to measure accuracy by comparing successful and unsuccessful matches with the total number of faces processed by the facial recognition system. According to this metric, the error rate was just 0.1%.

Duncan Ball, the Met's deputy assistant commissioner, said: "We are extremely disappointed with the negative and unbalanced tone of this report... We have a legal basis for this pilot period and have taken legal advice throughout.

"We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer."
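To make the competing metrics concrete, here is a minimal Python sketch of both calculations, using the 42 alerts and eight verified matches reported above. The total number of faces scanned is not stated in the article, so the figure used below is a hypothetical value chosen only to illustrate how a large denominator produces the Met's roughly 0.1% figure.

# Two ways of measuring the same trial results, as described above.
matches_flagged = 42       # alerts raised across the six evaluated trials
verified_correct = 8       # alerts confirmed as genuine watch-list matches
incorrect = matches_flagged - verified_correct

# Researchers' metric: what share of alerts were wrong?
researcher_error_rate = incorrect / matches_flagged
print(f"Researchers' error rate: {researcher_error_rate:.0%}")   # ~81%

# Met's metric: wrong alerts as a share of every face processed.
# The article does not give this denominator; 34,000 is hypothetical.
faces_processed = 34_000
met_error_rate = incorrect / faces_processed
print(f"Met's error rate: {met_error_rate:.2%}")                 # ~0.10%

Both figures describe the same deployments; the disagreement is purely about which denominator is meaningful - the people the system flagged, or everyone whose face it scanned.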
Professor Fussey and Dr Murray claimed the Met's use of facial recognition during these trials lacked "an explicit legal basis" and failed to take into account how this technology infringed fundamental human rights.
"Our report conducted a detailed, academic, legal analysis of the documentation the Met Police used as a basis for the face recognition trials," Professor Fussey told Sky News. "There are some shortcomings and if [the Met] was taken to court there is a good chance that would be successfully challenged."
The co-authors also found "significant" operational problems - with obtaining the consent of those affected a particular issue.

Video: Are police illegally using facial recognition technology?

When live facial recognition is used in public places, everyone who comes within range of the cameras is considered to be under overt surveillance.
The Met did make an effort to notify passers-by about its trials by putting out signs and tweeting about each event.
But the researchers observed "significant shortcomings" in this process - and said this created difficulty in gaining meaningful consent.
A recent BBC documentary captured an incident where a man was fined after refusing to take part in a facial recognition trial.
Professor Fussey and Dr Murray wrote: "Treating LFR camera avoidance as suspicious behaviour undermines the premise of informed consent.
"The arrest of LFR camera-avoiding individuals for more minor offences than those used to justify the test deployments raise clear issues regarding the extension of police powers and of 'surveillance creep'."
Their report also criticised the Met's use of "watch lists" - the registers of "wanted" people that facial recognition is supposed to help locate.
According to the report, the data used to create watch lists was not current, so people were stopped even though their case had already been addressed. In other cases, there were no clear reasons why people were put on watch lists, leaving "significant ambiguity" about the intended purpose of facial recognition.


Image: Facial recognition systems can see through some disguises
Earlier this year, the Met's senior technologist Johanna Morley told Sky News that huge investment would be needed to upgrade police IT systems in order to ensure that the people on these watch lists were there legally.
The Met's use of facial recognition is being challenged in court by Big Brother Watch, an anti-surveillance campaign group.
The group's director, Silkie Carlo, told Sky News: "(This report) is absolutely definitive and I think there is really no recovery from this point.
"The only question now is when is the Met finally going to commit to stop using facial recognition."
The Home Office defended the Met, telling Sky News: "We support the police as they trial new technologies to protect the public, including facial recognition, which can help them identify criminals.
"The government believes that there is a legal framework for the use of live facial recognition technology, although that is being challenged in the courts and we would not want to pre-empt the outcome of this case."


Image: Silkie Carlo told Sky News that there was 'no recovery' for the police's programme
The first court case against police use of facial recognition began in May in Cardiff.
Human rights group Liberty is bringing a judicial challenge against South Wales Police, which is undertaking a Home Office-funded programme of live facial recognition.
According to Liberty, the force will use facial recognition at the Wales National Airshow this Saturday in Swansea. In 2016, an estimated 200,000 people attended the event.
