A police chief pioneering new ways of dealing with offenders yesterday vigorously defended his force's pilot of a controversial algorithm-based system for picking suitable candidates. Michael Barton, chief constable of Durham Constabulary, was appearing at the first public evidence-gathering hearing of the Law Society's Technology and Law Policy Commission on algorithms in the justice system. 

Durham Constabulary has come under fire after revealing last year that it was testing whether an algorithmic 'Harm Assessment Risk Tool' (HART) could help custody officers identify offenders eligible for Checkpoint, a deferred prosecution scheme designed to steer offenders away from criminality. The tool uses machine learning to predict the likelihood that an individual will reoffend in the next two years.
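
HART's internal workings are not described here, but the kind of system the article reports, a model trained on historical custody records that outputs a two-year reoffending risk score for decision support, can be sketched roughly as below. This is a minimal illustration under assumptions: the scikit-learn random forest, the file name and every feature column are hypothetical, not details of the actual tool.

```python
# Illustrative sketch only: HART's real model, features and data are not
# described in this article. All names below are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical historical custody records: features known at the custody
# decision, plus whether the person reoffended within two years.
records = pd.read_csv("custody_history.csv")  # hypothetical file
features = records[["age", "prior_offences", "offence_severity"]]
labels = records["reoffended_within_2yrs"]

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)

# The output is a risk score to support, not replace, a custody officer's
# decision: an estimated probability of reoffending within two years.
risk_scores = model.predict_proba(X_test)[:, 1]
```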

In a robust defence of the pilot, Barton said that HART was intended as a decision support tool and would never take the kind of nuanced decisions made by custody officers. Its main purpose, he said, is to ensure that people identified as suitable for the Checkpoint scheme do not go on to commit serious crimes. 'We are halfway through the pilot of finding out whether custody officers do better than the algorithm,' he said, promising that the results would be peer-reviewed and published.

So far, fewer than 5% of people in the Checkpoint programme have failed the four-month contract. Barton joked that he was pleased 'one or two people had taken issue' with the HART pilot, 'because that means you trust the cops'.

The event in Chancery Lane was the first of four public hearings scheduled for the commission, set up by Law Society president Christina Blacklaws. 

Earlier, the session heard from leading academics in artificial intelligence, data science and human rights about the issues likely to arise when algorithms - the basis of machine-learning artificial intelligence - take key decisions about people's lives. While such systems have the potential to eliminate human bias from criminal justice, they can also embed prejudice through the choice of training data.
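
To make that concern concrete, the sketch below audits a deliberately biased toy model by comparing false-positive rates across two groups, one way embedded prejudice shows up in practice. Everything here is synthetic and hypothetical; it is not drawn from HART or any real system.

```python
# Illustrative audit sketch: bias learned from skewed training data often
# surfaces as unequal error rates across groups. All data is synthetic.
import numpy as np

def false_positive_rate(y_true, y_pred):
    """Share of people who did not reoffend but were flagged high risk."""
    negatives = y_true == 0
    return (y_pred[negatives] == 1).mean()

rng = np.random.default_rng(0)
group = rng.choice(["A", "B"], size=1000)      # hypothetical protected attribute
y_true = rng.binomial(1, 0.3, size=1000)       # who actually reoffended
flag_prob = np.where(group == "B", 0.5, 0.3)   # a biased model flags B more often
y_pred = rng.binomial(1, flag_prob)            # the model's high-risk flags

for g in ("A", "B"):
    mask = group == g
    print(g, round(false_positive_rate(y_true[mask], y_pred[mask]), 2))
```

A higher false-positive rate for one group means people in that group who would not have reoffended are flagged as high risk more often, precisely the kind of disparity such audits are meant to surface.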

One question likely to figure in the commission's report is the need for regulation. 'The more that people understand big data analytics, the more support there is for regulation of the use of data and algorithms,' said Lorna McGregor, director of the Human Rights Centre at Essex University. Lilian Edwards, professor of internet law at the University of Strathclyde, noted that criminal justice is exempt from one current regulatory protection: the 'right to information' about automated decision-making under the Data Protection Act 2018.

Meanwhile, a past sceptic about the need for regulation, Roger Bickerstaff, technology partner at international firm Bird & Bird, told the commission that he had changed his views following the Cambridge Analytica affair. 'Some form of legal framework associated with the management of these systems is probably appropriate,' he said, suggesting that such oversight should ensure the transparency of automated decision-making.