
BCS calls for strict standards for use of algorithms in public policy


Mark Say, Managing Editor



BCS, The Chartered Institute for IT, has called for standards to assure the ethics and competence of the use of algorithms and data science in public policy.

It has outlined the need in a report compiled in response to the controversy over examinations regulator Ofqual’s use of an algorithm in revising school students’ A level grades from those awarded by teachers in place of exams during the pandemic lockdown.

After widespread protests over the number of cases in which grades were reduced, the Government reversed its position, effectively awarding grades based on teachers' initial estimates.

The report, named The Exam Question: how do we make algorithms do the right thing?, says that information systems relying on algorithms are often a force for good, but that it is hard to make them work as intended in high stakes situations.

There are issues around the public service standards of openness, accountability and objectivity, along with a big challenge in embedding these at the design and development stages and in ensuring the right governance is in place.

Professionalisation plan

In response, BCS recommends that the Government endorse the professionalisation of data science, in line with a plan it has already developed with the Royal Statistical Society, the Royal Society and others.

It would mean that algorithms whose creators did not meet a strict code of practice for ethics, professionalism and competence could not be used to decide issues such as exam grades, or to provide the Government with estimates about the outcomes of pandemics.

In addition, all algorithms and data with high stakes consequences should be put through an impact assessment against widely recognised ethical standards, and be open to public scrutiny, before going live.

Dr Bill Mitchell, director of policy at BCS, said: “The exam crisis has given algorithms an undeserved reputation as ‘prejudice engines’ when in fact ethically designed algorithms fed on high quality data can result in massive benefit to our everyday lives.

“Lack of public confidence in data analysis will be very damaging in the long term. Information systems that rely on algorithms can be a force for good but, as students found out to huge cost, we have been using them to make high stakes judgements about individuals based on data that is subjective, uncertain and partial.

“We need true oversight of the way algorithms are used, including identifying unintended consequences, and the capability at a system level to remedy harm that might be caused to an individual when something goes wrong.

“That means, first of all, professionalising data science so that the UK has the most trusted, ethical and sought after data science teams in the world.”

Exams case failings

The report highlights a number of failings in the exam grades controversy, including that stakeholders were not consulted in advance on important issues such as the best method for using data to estimate grades and how to maintain standards. There was a lack of robust governance for the Ofqual board, the Department for Education and Parliament to oversee the development of the exam grade estimation system; and there were weaknesses in the attempt to ensure objectivity in the process.

It draws three main conclusions:

  • Openness means being open about what data will be used, its provenance, how it will be used and what criteria will be applied to ensure the resultant information is fit for purpose.
  • It is essential to develop mechanisms for joint governance of the design and development of information systems at the start.
  • There is a need to be clear about what is intended to be achieved at an individual level for all those affected by an information system, how this will be measured and what it will mean for people to have trust in the system.
