
Civil rights group urges ban on predictive policing IT

05/02/19

Michael Cross, Correspondent


Civil rights pressure group Liberty has become the latest body to call for curbs on artificial intelligence systems that predict where crimes are likely to happen and how likely individuals are to reoffend.

It has published a report, Policing by Machine, that lists 14 UK police forces it says are deploying, testing or investigating 'predictive policing' systems based on machine learning algorithms.

These include:

  • Durham Constabulary's Harm Assessment Risk Tool (HART), a system that helps custody officers identify offenders eligible for a scheme diverting them from prosecution;

  • Merseyside Police's use of predictive techniques;

  • Norfolk Police's trials of an algorithm to assist in deciding whether burglaries should be investigated;

  • West Midlands Police's analysis of hotspots identified on a mapping system and the same force's data-driven insights project.

The Liberty report echoes previous concerns that algorithm-driven systems based on big data can entrench existing bias and are not open to scrutiny. 
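The feedback loop behind that concern can be made concrete with a small simulation. The sketch below is purely illustrative and does not correspond to any force's actual system: it assumes five areas with identical underlying crime rates, a 'predictive' rule that simply sends extra patrols to the area with the most recorded arrests, and a higher chance that crime is recorded where patrols are concentrated.

```python
# Purely illustrative sketch of a predictive mapping feedback loop; it does
# not model any real police system. All rates and area counts are assumptions.
import random

random.seed(0)

AREAS = 5
TRUE_CRIME_RATE = 0.2            # identical underlying crime rate everywhere
arrests = [10, 10, 10, 10, 20]   # area 4 starts with more *recorded* arrests

for week in range(52):
    # "Model": rank areas by historical arrest counts and patrol the top one.
    hotspot = max(range(AREAS), key=lambda a: arrests[a])
    for area in range(AREAS):
        # Extra patrols mean more of the hotspot's crime gets recorded.
        detection = 0.9 if area == hotspot else 0.3
        if random.random() < TRUE_CRIME_RATE * detection:
            arrests[area] += 1

print(arrests)  # the head start compounds even though true crime is equal
```

Because the model only ever sees recorded arrests, not underlying offending, the initial disparity in the data grows over time; this is the entrenchment of bias the report describes.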

Although police forces generally require human oversight of such programs, in practice officers are likely to defer to the algorithm, the report warns.

Incredibly difficult

“A police officer may be hesitant to overrule an algorithm which indicates that someone is high risk, just in case that person goes on to commit a crime and responsibility for this falls to them – they simply fear getting it wrong,” it says. “It is incredibly difficult to design a process of human reasoning that can meaningfully run alongside a deeply complex mathematical process.”

Liberty recommends that police forces end their use of predictive mapping programs, which “rely on problematic historical arrest data and encourage the over-policing of marginalised communities”, as well as individual risk assessment programs that “encourage discriminatory profiling”.

At the very least, the report calls on police forces to disclose information about the use of predictive policing technologies, including to people most likely to be affected by the systems. 

Meanwhile, investment in digital policing should focus on the development of programs and algorithms that actively reduce biased approaches to policing, the report states. 

It concludes: “While it may seem a laudable aim to prevent crime before it ever occurs, this is best achieved by addressing underlying social issues through education, housing, employment and social care. The solution does not lie in policing by machine.” 

Earlier warnings

Liberty is far from the first to raise such concerns. In September last year a study by the Royal United Services Institute and the University of Winchester concluded with a call for the Home Office to set out rules governing how police forces should conduct trials of predictive policing tools.

The Law Society is also holding a public policy commission on algorithms in the justice system. It will conduct two public hearings this month in Cardiff and London.

Image by Andy Thornley, CC BY 2.0
