Thinktank report says AI, big data and other innovations could transform the justice system - but will need new governance frameworks
Images of members of the public are being collected for analysis by police automatic facial recognition systems “outside of any regulation at all”, according to an influential justice thinktank.
In a report published this month examining the potential of technologies to transform Britain’s policing, legal advice systems and courts, the Centre for Justice Innovation says that emerging technologies have huge potential - but that public support may be fragile.
“Without the necessary vigilance, we could drift towards an automated, industrialised justice system where important decisions are taken by hidden algorithms, where automated systems isolate citizens from human and professional contact, and where technology allows the state (and others) to intrude even further into the private lives of citizens,” warn the authors of Just Technology: emergent technologies and the justice system - and what the public thinks about it.
The report is significant because the Centre for Justice Innovation and its authors, former senior civil servant Phil Bowen and former ministerial adviser Blair Gibbs, enjoy more influence at the Home Office and Ministry of Justice than most civil liberties groups. It also examines public attitudes to new technologies in criminal justice, finding that while these are often positive, they may also be fragile.
Big data is likely to have seismic effects on policing by enabling forces to predict crimes and catch offenders through the analysis of data from other systems, the report says. However, it could also profoundly alter a policing model defined by consent, with police resources “increasingly pro-actively directed by data, rather than reactive to calls for service”.
The centre adds its voice to calls for “a clear decision-making framework at the national level” to ensure the ethical use of big data technology.
On facial recognition - the use of which is already being challenged by civil liberties groups such as Big Brother Watch - the report notes, “as new facial recognition software comes to market every year, the law has simply been unable to keep up”.
At present, the sourcing of images for matching with automatic facial recognition software “seems to be outside of any regulation at all”. The report says the Government “should actively consider whether there is a need for primary legislation to… address the existing gaps in the legal framework around the sourcing and retention of images”.
It warns that public support for the technology, although high at present, could be jeopardised if it is rolled out without standards. Younger people, it notes, “appear far more cautious about using images from any source than any other age group, perhaps reflecting their level of exposure to such a policy given their greater internet footprint than other age groups or maybe their more sophisticated understanding of the role of technology in society”.
Concern stems from the clear difference between a manual process to probe a single image against a database - as at present - and “the future potential of routine, or automated mass matching of images against an ever expanding database”.
A core principle for “just technology” is that new technology should support, not supplant, the role of humans in the justice system, and that its introduction should not fetter the right of individuals to interact with human decision-makers should they so choose. For example, while the public appears to support the Government’s plans for online courts for minor criminal offences, the physical route should always remain available.
Overall, so long as new and disruptive technologies are shown to work, can maintain public consent and do not undermine the legitimacy of the justice system, they are “a risk worth taking”, the report concludes.