A new report from the Centre for Data Ethics and Innovation has urged the Government to impose a transparency obligation on the public sector for its use of algorithms.
It has also called for specific measures to promote fairness in the use of algorithms in local government and policing.
The review into bias in algorithmic decision making was commissioned by the Government two years ago amid rising concerns over the scope for unfairness.
It makes clear that human bias often creeps into decision making in both the public and private sectors, warns that this bias risks being embedded in the algorithms used in the process, and says measures are needed to mitigate that risk.
It looks at specifics in four sectors – local government, policing, recruitment and financial services – with recommendations for each and others to be applied more generally.
The Centre for Data Ethics and Innovation highlights its recommendation that the Government should place a mandatory transparency obligation on all public sector organisations using algorithms that have an impact on significant decisions affecting individuals.
This would require a project to scope the obligation more precisely and a pilot of an implementation approach. It would also require the proactive publication of information on how the decision to use an algorithm was made, how the algorithm is used in the overall decision-making process, and the steps taken to ensure individuals are treated fairly.
Local government factors
In the case of local government, the review points to an increasing use of data technologies to support decision making, and says evidence shows some people are more likely to be over-represented in the data, which can lead to biases in predictions and interventions. High error rates affecting minority groups can be a particular problem.
In response, it says the Government should develop national guidance to support local authorities in legally and ethically procuring or developing tools for processes in which significant decisions are made about individuals, and consider how compliance with this should be monitored.
For policing, the review finds few tools currently in operation in the UK, but says police forces have access to more digital material than ever before and that the picture of governance arrangements is fragmented. Again, there are dangers of bias in decision making and a need to strike the right balance between automation and human judgement.
This prompts the recommendation that the Home Office should define clear roles and responsibilities for national policing bodies in regard to data analytics, with the power to set guidance and standards. As a first step, it should ensure support for work that the National Police Chiefs’ Council has begun in the field.
Other broad recommendations highlighted by the centre are that: organisations should actively use data to identify and mitigate any bias; and that the Government should issue guidance that clarifies the application of the Equality Act to algorithmic decision making.
Need to work together
Adrian Weller, board member for the Centre for Data Ethics and Innovation, said: “It is vital that we work hard now to get this right as adoption of algorithmic decision making increases. Government, regulators and industry need to work together with interdisciplinary experts, stakeholders and the public to ensure that algorithms are used to promote fairness, not undermine it.
“The Centre for Data Ethics and Innovation has today set out a range of measures to help the UK to achieve this, with a focus on enhancing transparency and accountability in decision making processes that have a significant impact on individuals. Not only does the report propose a roadmap to tackle the risks, but it highlights the opportunity that good use of data presents to address historical unfairness and avoid new biases in key areas of life.”
Image by Pamela Carls, CC BY 2.0