A year-long investigation by the Law Society has concluded that algorithm-based machine-learning has the potential to improve the criminal justice system - but warns of 'a worrying lack of oversight' over current experiments. In a report published today, Chancery Lane's technology and law policy commission on algorithms in the criminal justice system says that ad hoc deployments by police forces of systems to predict crime and monitor individuals' behaviour must be given legal certainty.

The commission also recommends measures to improve transparency and accountability - and for the operators of the new technology to comply with the public sector equality duty. Echoing the findings of recent studies in the US and the UK, the report raises the spectre of artificial intelligence reinforcing rather than eliminating human prejudices. It warns that policy decisions could be 'baked into algorithmic systems made invisibly and unaccountably by contractors and vendors'. 

However, the report is notably more positive than other recent studies about the potential for algorithm-based machine-learning - currently the most promising form of 'artificial intelligence' - so long as it is deployed in the right way. It concludes: 'The United Kingdom has a window of opportunity to become a beacon for a justice system trusted to use technology well, with a social licence to operate and in line with the values and human rights underpinning criminal justice. It must take proactive steps to seize that window now.'

The commission's 80-page report is based on submissions from lawyers, academics, police forces, civil society groups and others, received in writing and at public hearings. Chair Christina Blacklaws, who has made technology a central theme of her year-long presidency of the Law Society, commented: 'Within the right framework, algorithmic systems – whether facial recognition technology, predictive policing or individual risk assessment tools – can deliver a range of benefits in the justice system, from efficiency and efficacy to accountability and consistency.'

At the moment, however, 'there is a worrying lack of oversight or framework to mitigate some hefty risks – of unlawful deployment, of discrimination or bias that may be unwittingly built in by an operator'.

These dangers are exacerbated by the absence of transparency, centralised coordination and systematic knowledge-sharing between public bodies, Blacklaws said.

The report is published today at a London conference on artificial intelligence in the justice system attended by the lord chancellor, David Gauke MP. 'We need to build a consensus rooted in the rule of law, which preserves human rights and equality, to deliver a trusted and reliable justice system now and for the future,' Blacklaws said.