Interview: Rob McCargow, programme leader for PwC, talks about the balance of opportunity and risk in the deployment of artificial intelligence
The debate over artificial intelligence (AI) in public services took a new turn recently with the publication of a report by PwC that forecasts three waves of implementation and predicts that the technology could affect a third of jobs in the sector by the mid-2030s.
Its title, Will robots really steal our jobs?, is bound to prompt some trepidation, but the report takes a measured look at the impact across the economies of 29 countries, identifying opportunities and risks in the evolution of the technology.
Rob McCargow, AI programme leader for the consultancy, comes across as a cautious optimist, generally welcoming the development of AI, but highlighting the risks and making clear that it has to be applied carefully.
“On one hand there is the increased maturity of technology offering solutions to intractable problems; but on the other there are new risks,” he says.
Perhaps the largest is one that has been identified from several quarters: that the algorithms used in AI to support decision making could reflect the biases of the people who programmed them. If the technology is increasingly used by officials, doctors, law enforcement and legal officers to make sensitive decisions, it could lay the ground for some serious mistakes and injustices.
“The overall consideration for public sector bodies is that, as we start moving to use cases for this technology, there will be significant consequences if they don’t quite work,” McCargow says.
“For example, something that handles basic queries on where to find information on council websites; it’s annoying to the customer but not necessarily causing harm. But as we are starting to see the advance of the technology into uses of greater consequence, such as in medical diagnosis and criminal justice, the possibility of consequential harm increases significantly.”
Need for diversity
It could reinforce a familiar criticism of the way public officials make decisions: that too often they reflect the perspectives of an affluent and well-connected minority. McCargow sees the solution partly in ensuring that decisions on the use of AI take in a range of perspectives, and that people from diverse backgrounds are encouraged to become involved in its development.
“It’s vital not just to have the right technology approach, but to think about how you bring challenge from across your organisation. One factor that is so important is the lack of diversity in the AI workforce.”
This requires a significant long-term effort to overcome, but he points to PwC’s creation of a programme of computer science apprenticeships – which involves encouraging more women into the field and supporting social mobility – and says that public authorities could do a lot through their hiring and training programmes to further the cause.
He also emphasises the importance of public sector leaders acquiring some knowledge not just of the potential of the technology, but of how it uses data, its long-term impact on their organisations, what sort of governance is needed for its deployment, and how they should talk about it with their staff, other stakeholders and the public.
Again he refers to practice inside PwC, saying its AI projects follow an inter-disciplinary approach, including data scientists, software engineers, product and programme managers, and people who can look at the issues from a human resources or regulatory compliance perspective.
“We always try to apply as much challenge as possible to ensure we’ve thought through every consequence before we go live on anything,” he says.
As for the course of deployment over the next 15-20 years, he reiterates points made in the report. He sees it coming in three waves: first the ‘algorithm’ wave, focused on the automation of simple computational tasks and analysis of structured data; then the ‘augmentation’ wave, automating repeatable tasks and statistical analysis of unstructured data; and finally the ‘autonomy’ wave, automating physical labour and problem solving in real world situations.
The report highlights the potential in healthcare and education, and McCargow talks about the latter involving tasks in which AI can assist and augment the work of teachers and provide scope for more personalisation in how students are taught.
But he sees the early steps in picking “lower hanging fruit” such as customer services, and believes that in many organisations the more basic forms of AI, described as “intelligent automation”, could be used alongside humans.
The thinking is that many end-to-end processes consist of some parts that are ideal for robotic software, such as invoice reconciliation, while others could be enhanced with a conversational agent or chatbot, and others are too complex for automation or augmentation and would continue to rely on humans. This points towards a hybrid that could allay the fears of machines taking over public administrations.
He also warns that the technology should not be seen as a solution for everything.
No silver bullet
“There’s a risk of jumping the gun by assuming that AI is the silver bullet for a number of challenges, but that is fundamentally not always the case,” he says. “You need raw ingredients for a successful deployment, the principal one being the volume and quality of data.
“You also need the right technology and right computer processing power in the cloud, and the right talent within the organisation.”
It is another sign that AI needs to be part of a wider mix if it is to do the most good for society.
Overall, he is optimistic about the prospects, and believes that public sector bodies can take a lead in setting new standards for transparency and trust in its application. But he also sees a need for the public and private sectors to find the right balance between readiness to exploit the opportunities in AI and acknowledgment of its limitations and risks.
“It needs organisations to adopt the technology in a way that allows them to be at the bleeding edge of innovation, but also that really supports trust and confidence in the organisation and its stakeholders.”