Call for procurement controls of police AI

04/06/19

Michael Cross, Correspondent

Law Society commission identifies ‘window of opportunity’ for algorithm-based justice

The widespread deployment of algorithm-based machine learning in criminal justice urgently needs to be brought under the rule of law, a year-long investigation reports today. The report of the Public Policy Commission into Algorithms in the Justice System calls for new measures to regulate what it describes as ad hoc procurements, including a statutory register and a requirement for public bodies to own software rather than rent it from commercial suppliers.

The commission was established by the Law Society for England and Wales, the representative body of solicitors. Its report is based on more than 80 submissions from academics, civil society bodies, police forces and others, both in writing and at public hearings. It echoes concerns raised by other recent studies in the US and the UK about the risk that artificial intelligence will reinforce rather than eliminate human prejudices. It warns that policy decisions could be ‘baked into algorithmic systems made invisibly and unaccountably by contractors and vendors’.

However, the report concludes that, within the right legal framework, algorithm-based machine learning has a role to play in the justice system. ‘The United Kingdom has a window of opportunity to become a beacon for a justice system trusted to use technology well, with a social licence to operate and in line with the values and human rights underpinning criminal justice. It must take proactive steps to seize that window now.’

Key recommendations of the report cover: 

  • Oversight: A legal framework for the use of complex algorithms in the justice system. The lawful basis for the use of any algorithmic systems must be clear and explicitly declared
  • Transparency: A national register of algorithmic systems used by public bodies
  • Equality: The public sector equality duty should be applied to the use of algorithms in the justice system
  • Human rights: Public bodies must be able to explain what human rights are affected by any complex algorithm they use
  • Human judgement: There must always be human management of complex algorithmic systems
  • Accountability: Public bodies must be able to explain how specific algorithms reach specific decisions
  • Ownership: Public bodies should own software rather than renting it from tech companies and should manage all political design decisions

Christina Blacklaws, Law Society president, said that at present “there is a worrying lack of oversight or framework to mitigate some hefty risks – of unlawful deployment, of discrimination or bias that may be unwittingly built in by an operator.

“These dangers are exacerbated by the absence of transparency, centralised coordination or systematic knowledge-sharing between public bodies. Although some forces are open about their use of algorithms, this is by no means uniform.”

Report: Algorithms in the Criminal Justice System
