Police forces are deploying predictive policing and other artificial intelligence tools with a 'Wild West' disregard for oversight and safeguards, according to an influential committee of peers. 'We were taken aback by the proliferation of artificial intelligence tools potentially being used without proper oversight,' the House of Lords Justice and Home Affairs Committee reports.

While facial recognition is the best known of the new technologies, 'many more are already in use, with more being developed all the time'. The market in such systems is 'worryingly opaque', creating a 'serious risk' that an individual's right to a fair trial could be undermined by 'algorithmically manipulated evidence'. 

Currently, police are not required to be trained in the use of such technologies, nor in the legislative context, the possibility of bias, or the need for cautious interpretation of their outputs.

Among other safeguards, the report calls for legislation to create a 'kitemark' system to ensure quality, along with a register of algorithms in use. Police forces should have a 'duty of candour' about the technologies they deploy and their impact, especially on marginalised communities.

The committee acknowledges the 'many benefits' that new technologies can bring to law enforcement. However, 'AI technologies have serious implications for a person’s human rights and civil liberties'. For example, the report asks, 'At what point could someone be imprisoned on the basis of technology that cannot be explained?'

Informed scrutiny is essential to ensure that any new tools deployed in this sphere are safe, necessary, proportionate and effective, the committee says, but this scrutiny is not happening. 'Instead, we uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with. Public bodies and all 43 police forces are free to individually commission whatever tools they like or buy them from companies eager to get in on the burgeoning AI market. And the market itself is worryingly opaque.'

Public bodies often do not know much about the systems they are implementing, due to suppliers' insistence on commercial confidentiality, the report states. 'This is particularly concerning in light of evidence we heard of dubious selling practices and claims made by vendors as to their products’ effectiveness which are often untested and unproven.'

Meanwhile 'a culture of deference towards new technologies' means the benefits are being maximised and the risks minimised.

Committee chair Lady Hamwee (former solicitor Sally Hamwee) said: 'Without proper safeguards, advanced technologies may affect human rights, undermine the fairness of trials, worsen inequalities and weaken the rule of law. The tools available must be fit for purpose, and not be used unchecked.

'Government must take control. Legislation to establish clear principles would provide a basis for more detailed regulation.'