Computer algorithms used in law enforcement need legislative safeguards to prevent them being abused, peers have been told in an inquiry into the use of technology in the criminal justice system.

The House of Lords Justice and Home Affairs Committee was yesterday warned about the use of algorithmic tools – such as facial recognition technology, predictive policing tools, and digital categorisation of prisoners – to detect, deter, rehabilitate, and punish people who break the law in England and Wales.

Professor Karen Yeung, an interdisciplinary professorial fellow in law, ethics and informatics at Birmingham Law School, told peers that these tools are not objective, and can result in ‘systemic, sustained abuses going on in a highly opaque way’.

‘I would introduce legally mandated, systematic transparency for all uses of algorithmic tools in the public sector, particularly in the criminal justice system,’ she told the committee.

‘I think our constitutional principles are up to the task. The challenge is translating them into safeguards in relation to specific tools. And the problem is these tools are sophisticated and hard for the average person to understand. That’s why I think we need specific statutory regulations that translate and operationalise our constitutional principles into concrete, systematic safeguards.’

[Image: facial recognition camera – ‘Facial recognition tech needs to be more transparent, witnesses tell Lords committee’. Source: iStock]

‘This would make them much safer to use and inspire public trust in a way which currently isn’t there,’ Professor Yeung added.

Silkie Carlo, director of privacy campaign group Big Brother Watch, told the Lords committee that there should be an immediate legislative ban on the use of real-time facial recognition.

‘This is one of the most disproportionate and extreme surveillance technologies we have ever seen in Britain,’ she said. ‘It’s already the case that, without any parliamentary involvement and in the near-total absence of a regulatory regime, tens of millions of Britons have been subjected to facial recognition scans and won’t even know about it. And there are lots of people who have been wrongly stopped by the police and – in some cases – had some quite traumatic incidents as a result.’

Professor Yeung added that the surveillance technique marks a ‘serious reversal of the presumption that one is entitled to go about one’s business in a lawful manner without being disturbed by the state’ and that facial recognition ‘should be subject to very, very stringent regulations, if not an outright ban’.

The hearing is part of an inquiry into new technologies in law enforcement. The committee plans to examine the legal framework around these tools, ethical issues raised by their use, and the experiences of users and citizens interacting with them.