US-style computer systems which score offenders for their risk of future offending have no place in UK courtrooms, specialists in information law heard today.

Professor Allan Brimicombe, head of the centre for geo-information studies at the University of East London, told the annual Trust, Risk, Information and the Law conference at the University of Winchester that challenges to such systems could 'bring courts to a halt'.

Systems based on computer algorithms have been used in the US for more than a decade to assess offenders' risk of reoffending if granted bail. However, a study last year of decisions made about 7,000 people arrested in Florida found evidence of racial bias in the predictions. Black people were almost twice as likely as white people to be falsely labelled at risk of future offending, while white people were more likely to be mislabelled as 'low risk' when they went on to commit crimes.
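
For readers unfamiliar with the statistics behind such findings, here is a minimal sketch, in Python, of how the two error rates the study compared can be computed for each group: the false positive rate (falsely labelled at risk) and the false negative rate (mislabelled as low risk). The data, group names and figures below are invented for illustration and are not taken from the study.

    # Illustrative only: computing group-wise error rates of the kind the
    # Florida study compared. All records here are made up.

    def error_rates(records):
        """Return (false_positive_rate, false_negative_rate) for a list of
        (predicted_high_risk, reoffended) boolean pairs."""
        fp = sum(1 for pred, actual in records if pred and not actual)
        fn = sum(1 for pred, actual in records if not pred and actual)
        did_not_reoffend = sum(1 for _, actual in records if not actual)
        did_reoffend = sum(1 for _, actual in records if actual)
        return fp / did_not_reoffend, fn / did_reoffend

    # Hypothetical predictions by group: (labelled high risk?, reoffended?)
    by_group = {
        'group_a': [(True, False), (True, False), (True, True),
                    (False, True), (False, False)],
        'group_b': [(True, False), (False, False), (False, True),
                    (False, True), (True, True)],
    }

    for group, records in by_group.items():
        fpr, fnr = error_rates(records)
        print(f'{group}: falsely labelled high risk {fpr:.0%}, '
              f'mislabelled low risk {fnr:.0%}')

A persistent gap between groups in the first rate, with the pattern reversed in the second, is the shape of disparity the Florida study reported.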

Brimicombe said that although the system does not explicitly score for race, 65-70% of data items about offenders 'would strongly correlate with race in America'.

However, because the system is based on a proprietary algorithm, it is in effect a 'black box' whose decisions cannot be challenged in court, Brimicombe said. On the other hand, opening the algorithm to the public would be a 'double-edged sword', leading to its decisions being 'challenged to the point where the courts no longer function'.

'Bail is a matter of judgment and should not be for algorithmic risk assessment,' he said.

The Ministry of Justice and several UK police forces are considering the use of artificial intelligence to improve decision-making in the justice system and to introduce 'predictive policing' to pre-empt crimes. Durham Constabulary is currently testing a 'harm assessment risk tool' (HART) to see if it agrees with custody sergeants' assessments of which offenders are at medium risk of re-offending within two years. The idea is to identify offenders, such as long-term drug users, who might be eligible for the force's Checkpoint programme, which offers a good behaviour contract as an alternative to prosecution.

However, Sheena Urwin, head of criminal justice at Durham Constabulary, told the conference that the system will not replace human custody sergeants. 'This is a decision support tool, not a decision maker,' Urwin said. 'It provides consistent and transparent decision support.'

Meanwhile, one of the UK's leading researchers in artificial intelligence in the law revealed that the first fruits of a collaboration with a new model of law firm will be announced next month. A paper on 'Context based information extraction from commercial law documents', co-authored by Karl Chapman, chief executive of the fixed-price legal services business Riverview Law, is on the programme of the International Conference on Artificial Intelligence and Law, to be held in London from 12-16 June, said Professor Katie Atkinson, head of computer science at the University of Liverpool.

Riverview announced its collaboration with Atkinson's artificial intelligence research group in 2015.