Police forces in the UK should abandon their tests of computer programs to predict where crimes are likely to happen and whether individuals are likely to re-offend, human rights pressure group Liberty says today. According to the group, at least 14 forces in the UK are testing or in the process of developing ‘predictive policing’ systems based on machine-learning algorithms.
A highly critical report, Policing by Machine, says that such systems can entrench bias, by making decisions based on historical 'big data' about crimes. The predictions may be based on 'black box' algorithms, which are impossible to scrutinise. Although police forces generally require human oversight over such programs, in practice officers are likely to defer to the algorithm, the report warns.
'A police officer may be hesitant to overrule an algorithm which indicates that someone is high risk, just in case that person goes on to commit a crime and responsibility for this falls to them – they simply fear getting it wrong. It is incredibly difficult to design a process of human reasoning that can meaningfully run alongside a deeply complex mathematical process,' the report states.
Liberty recommends that police forces should end their use of predictive mapping programs, which 'rely on problematic historical arrest data and encourage the over-policing of marginalised communities', as well as individual risk assessment programs, which 'encourage discriminatory profiling'.
At the very least, the report calls on police forces to disclose information about the use of predictive policing technologies, including to those most likely to be affected by the systems.
Meanwhile, investment in digital policing should focus on the development of programs and algorithms that actively reduce biased approaches to policing, the report states.
It concludes: 'While it may seem a laudable aim to prevent crime before it ever occurs, this is best achieved by addressing underlying social issues through education, housing, employment and social care. The solution does not lie in policing by machine.'
Liberty is far from the first to raise concerns about the prospect of so-called 'Schrödinger's justice'. In September last year a study by the Royal United Services Institute and the University of Winchester concluded with a call for the Home Office to set out rules governing how police forces should conduct trials of predictive policing tools.
- The Law Society’s policy commission on algorithms in the justice system will hold final public evidence sessions this month, in Cardiff (7 February) and London (14 February).