The Equality and Human Rights Commission (EHRC) has called for the suspension of the use of automated facial recognition (AFR) and predictive algorithms in policing in England and Wales.
It wants the technology put on hold until its impact has been independently scrutinised and laws updated to provide sufficient protection for the public.
The call comes amid continuing controversy over the use of AFR, which has won the approval of the Home Office but come in for criticism from civil liberties groups and watchdogs.
The EHRC has highlighted its concerns about AFR in evidence submitted to the UN on a range of civil and political rights issues. It emphasised concerns about how the use of AFR is regulated, and suggested it may not comply with the UK’s obligation to respect privacy rights under the International Covenant on Civil and Political Rights (ICCPR).
The report also raises questions about the technology’s accuracy and points to evidence that many AFR algorithms disproportionately misidentify Black people and women, and therefore could be discriminatory.
Rebecca Hilsenrath, chief executive of the EHRC, said: “With new technology comes new opportunities and new, more effective ways of policing crimes and protecting UK citizens. We welcome this opportunity and recognise the priority that everyone is kept safe. But these also bring new challenges and new risks which we need to meet in order to use any such technology effectively for the good of the community as a whole.
“It is essential that our laws keep pace with our evolving digital world so that new techniques to protect us don’t infringe on our rights in the process, and damaging patterns of discrimination, that we already know exist, are not reinforced.
“The law is clearly on the back foot with invasive AFR and predictive policing technologies. It is essential that their use is suspended until robust, independent impact assessments and consultations can be carried out, so that we know exactly how this technology is being used and are reassured that our rights are being respected.”
AFR in policing is proving to be a highly controversial issue. In July of last year the Home Office signalled its approval of police forces running trials of the technology, and the Metropolitan Police has announced its intention to press ahead with a limited deployment. Supporters of the technology were boosted by a court ruling last September in favour of its use by South Wales Police.
But it has provoked protests from civil rights campaigners; a Scottish Parliament committee recently came out strongly against its use; and the Information Commissioner’s Office has highlighted shortcomings and called for a statutory code.
The EHRC has also expressed concerns over the use of predictive policing programmes, which use algorithms to analyse data and identify patterns, suggesting that such programmes could replicate and magnify discrimination in policing.