ICO wants statutory code for police use of facial recognition

01/11/19

Mark Say, Managing Editor


Information Commissioner Elizabeth Denham has called for the creation of a statutory code on police forces’ use of live facial recognition (LFR) technology.

The call has come with the publication of her office’s report on the issue, prompted by rising concerns about the civil liberties implications of how the technology is used.

Denham said the current combination of laws, codes and practices relating to LFR will not be enough to support the ethical and legal approach necessary to manage the risks, and that this is likely to undermine public confidence in its use.

“As a result, the key recommendation arising from the ICO’s investigation is to call for government to introduce a statutory and binding code of practice on the deployment of LFR,” she said. “This is necessary in order to give the police and the public enough knowledge as to when and how the police can use LFR systems in public spaces.

“We will therefore be liaising with Home Office, the Investigatory Powers Commissioner, the Biometrics Commissioner, the Surveillance Camera Commissioner and policing bodies on how to progress our recommendation for a statutory code of practice.”

She also recommended that more work be done by a range of agencies, including developers of LFR, to eliminate bias in its algorithms, particularly bias associated with ethnicity.

Practical guidance

Denham also published her opinion on the issue, saying there are well defined data protection rules that police forces need to follow when deploying the technology. The opinion sets out practical steps, including recognition of a strict necessity threshold and training for police officers using LFR on the implications of processing biometric data and the principles of data protection.

The Information Commissioner’s Office (ICO) report on the issue was prompted largely by protests against LFR’s use in trials by the Metropolitan Police and South Wales Police, the latter of which led to a court case that the force won.

Among the report’s key findings is that, while there is evidence that both forces applied good practice in processing the data, there were areas for improvement and missed opportunities to achieve higher standards and improve public confidence.

There were also inconsistencies in the two forces’ approaches that could be repeated as other police forces adopt the technology, in turn increasing the risk of failing to comply with data protection regulations. The bar for processing biometric data has been set high, and the ICO believes that the more generic the objectives and watchlist used in an operation, the less likely it is that the bar will be met.

Reduce bias

In addition, more needs to be done to reduce the technology bias that has led to false matches, effectively flagging innocent people as being on a watchlist of suspects, and data protection officers have been kept on the periphery of the pilots. The latter has added to concerns about police forces’ accountability in using LFR.

The ICO is also investigating the use of facial recognition in the private sector, including where it is used in partnership with law enforcement. This follows complaints over its use in the area around King’s Cross in London.
