
London ethics panel outlines five steps for facial recognition in policing

30/05/19

Mark Say, Managing Editor


London’s deputy mayor for policing and crime has said the Metropolitan Police will take seriously the recommendations of an independent panel on the use of facial recognition software in policing.

Sophie Linden made the remarks after the London Policing Ethics Panel published its review of the technology, along with five conditions it said should be met.

It concluded that, while there are important ethical issues, these should not prevent the use of facial recognition if it is approached in the right way.

Linden said: “I welcome this extensive report into the potential implications of facial recognition software, and the recommendation that this technology should only be deployed by the police after five conditions are met – including strict new guidelines.

“We will continue to work closely with the Met and ensure the panel’s recommendations are addressed before further deployment.”

The panel said that facial recognition software should only be deployed by police if the five conditions can be met:

  • The overall benefits to public safety must be great enough to outweigh any potential public distrust in the technology.

  • It can be evidenced that using the technology will not generate gender or racial bias in policing operations.

  • Each deployment must be assessed and authorised to ensure that it is both necessary and proportionate for a specific policing purpose.

  • Operators are trained to understand the risks associated with use of the software and understand they are accountable.

  • Both the Met and the Mayor’s Office for Policing and Crime develop strict guidelines to ensure that deployments balance the benefits of this technology with the potential intrusion on the public.

London’s police force has carried out 10 trials of the technology, but some have prompted criticism over alleged misuse. Civil liberties group Liberty focused on trials at the Notting Hill Carnival and questioned whether the algorithm had been tested for bias.

Also, the Police Foundation think tank recently warned that clumsy use of algorithms such as those applied in facial recognition could undermine public trust in the police.

Independent evaluations

In addition to the panel’s research, the Met Police is carrying out two independent technical evaluations into its use of the software. The panel has urged it not to conduct any further trials until the results of these have been reviewed.

It also highlighted the results of a survey of more than 1,000 Londoners on their attitudes to facial recognition, which showed that 57% felt its use by the police is acceptable, a figure that rose to 87% when it is used to search for serious offenders.

There is a wider interest in the possibilities of the technology: last August the Home Office began a procurement of a database to support the use of facial biometrics.

The London ethics panel also set out a framework for the police when trialling new technology, consisting of 14 questions around engagement, diversity and inclusivity that have to be considered before any trial begins.

Image by Sheila Scarborough, CC BY 2.0 via Flickr
