US attempts to cut the prison population by means of computer predictions of offender behaviour are based on a flawed understanding and could entrench existing bias, artificial intelligence experts have concluded. As a result, algorithm-based tools ‘should not be used alone to make decisions to detain or to continue detention’, according to the Partnership on AI.

The partnership’s report will fuel concerns about UK police forces’ interest in computerised risk assessment tools based on AI.

In the US, several jurisdictions have begun to mandate such tools, based on simple forms of machine-learning algorithms, to try to remove bias from assessments such as estimating the likelihood that a suspect will fail to turn up in court.

Such efforts are ‘laudable and important’, says the partnership, an association of more than 80 groups, AI developers and academics. However, it questions whether algorithms are the right tool for the job. ‘It is a serious misunderstanding to view tools as objective or neutral simply because they are based on data,’ the report states. ‘While formulas and statistical models provide some degree of consistency and replicability, they still share or amplify many weaknesses of human decision-making.’

A basic problem is the ‘ecological fallacy’, which occurs when statistical data collected about a group is applied to train a system to predict an individual’s behaviour. Such data, for example arrest records, can amount to a proxy for race, the report states.
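To see how this plays out in practice, consider a minimal sketch of a naive risk scorer. All the data, names and figures below are invented for illustration and do not come from the report; the point is only that a score built from group-level statistics is driven by group membership, not by the individual’s own record.

```python
# Illustrative sketch of the 'ecological fallacy': a group-level statistic
# (here, an invented arrest rate by neighbourhood) is applied to score
# individuals. All values are hypothetical, for demonstration only.

group_arrest_rate = {
    "neighbourhood_a": 0.30,  # hypothetical historical arrest rate
    "neighbourhood_b": 0.10,
}

def naive_risk_score(person):
    """Score an individual using only the group they belong to."""
    return group_arrest_rate[person["neighbourhood"]]

# Two individuals with identical personal records...
alice = {"neighbourhood": "neighbourhood_a", "prior_offences": 0}
bob = {"neighbourhood": "neighbourhood_b", "prior_offences": 0}

# ...receive different scores, because the group statistic, not the
# person's own behaviour, drives the prediction. If neighbourhood
# correlates with race, the score becomes a proxy for race.
print(naive_risk_score(alice))  # 0.3
print(naive_risk_score(bob))    # 0.1
```

Nothing about either individual's behaviour differs here; the divergent scores come entirely from the group-level input.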

It sets out 10 requirements which jurisdictions should ‘weigh heavily and assess’ before adopting computerised risk assessment. These include applying methods to measure and mitigate bias. Systems must also be designed to reveal how they reached their decisions, to enable ‘meaningful contestation and challenges’.

However, even if systems meet these requirements a fundamental question remains: is it acceptable to make decisions about an individual’s liberty based on data about others? 

  • The report of the Law Society’s public policy commission on algorithms in the criminal justice system is due to be published on 4 June.