
Commission hears call for public register of justice AI systems

30/07/18

Michael Cross, Correspondent


The Government should set up an independent public register of artificial intelligence systems to ensure that automated decision-making by the police, courts and other justice agencies is open to public scrutiny, a new public inquiry heard last week.

The Technology and Law Policy Commission on algorithms in the justice system, set up by solicitors’ body the Law Society, is investigating questions arising from the use of big data and algorithms and the risk of so-called ‘Schrödinger’s justice’.

Its first public evidence session last week discussed concerns about the possibility of bias and error in AI. But it also heard a strong defence from a pioneering chief constable of his force’s testing of an algorithm-based assessment tool for managing offenders.

On the topic of regulation, Roger Bickerstaff, partner at international law firm Bird & Bird, said that any regime should be ‘light touch’ to avoid stultifying development. Key features of such a framework could include a central public register of AI/algorithmic solutions used in the justice system, held by an independent body.

“This would allow for public scrutiny of the usage of these solutions. Public scrutiny would put pressure on developers to give greater consideration to the impact and consequences of their products. The public entities operating the systems would also not be shielded by confidentiality.”

Bickerstaff also suggested that the extent to which the new Data Protection Act makes subject access rights available in law enforcement “needs careful review”.

Proposal for review

In addition, he proposed a review to determine if changes to the law are needed in order to give citizens effective legal remedies if problems arise through the use of AI/algorithmic solutions in the justice system.

“For example, what rights would an individual have who is arrested or subject to some other form of police action as a result of the output of an AI/algorithmic solution?” he asked.

He stressed that this light touch regulatory framework should not mean that systems require approval before deployment in the justice system: “Such prior approval would stultify development. The time and costs would be considerable and the skillsets of any approval body are likely to run well behind those of commercial developers.”

Also at the hearing, Michael Barton, chief constable of Durham Constabulary, robustly defended his force’s testing of an algorithm-based system for dealing with offenders.

Durham Constabulary has come under fire since revealing last year that it was testing a Harm Assessment Risk Tool (HART) to help custody officers identify offenders eligible for Checkpoint, a deferred prosecution scheme designed to steer offenders away from crime. The tool employs machine learning to predict the likelihood that an individual will reoffend within the next two years.

Lack of nuance

Barton said that HART was intended as a decision support tool and would never be capable of the kind of nuanced decisions made daily by custody officers. The main reason for its use is to ensure that people accepted for the Checkpoint scheme do not go on to commit serious crimes.

“We are halfway through the pilot of finding out whether custody officers do better than the algorithm,” he said, promising that results will be peer reviewed and published.

So far, fewer than 5% of people in Checkpoint had returned to crime, he said. Barton joked that he was pleased that “one or two people had taken issue” with the HART pilot, “because that means you trust the cops”.

The event in Chancery Lane was the first of four public hearings scheduled for the commission, set up by Law Society president Christina Blacklaws. The next hearing will be on 12 November. 

 
