Experts have long warned of the dangers of the ‘computer is always right’ presumption in law. The Post Office Inquiry has only added to calls for a new approach to the handling of IT-derived evidence

Twenty-seven years ago, the Law Commission decided that computers were grown-up enough to be trusted. In a review of the rules around hearsay evidence, the commission concluded that s69 of the Police and Criminal Evidence Act 1984, which required anyone seeking to adduce evidence generated by a computer to show the system was working properly, ‘serves no useful purpose’.

Instead, it was satisfied that a ‘presumption of proper functioning’ could be applied. ‘We believe, as did the vast majority of our respondents, that such a regime would work fairly,’ the commission reported. The government agreed and s69 was repealed through the Youth Justice and Criminal Evidence Act 1999.

A growing consensus among specialists in IT law holds that the commission’s analysis was wrong. ‘I think the Law Commission when making its recommendation did not really understand the nature of software, hardware and IT systems,’ said Dr Sam De Silva, partner at international firm CMS and a former chair of the Law Society’s Technology and Law Committee. ‘In their report they appear to have misinterpreted various experts’ views and used their [Law Commission’s] somewhat misguided interpretation to support the repeal of s69.’

A tipping point, of course, was the 2019 judgment in Bates v Post Office Ltd, which found that the Horizon computer system contained ‘bugs, errors and defects’. Presumptions about the system’s soundness had, by then, helped put innocent people in prison. But legal experts such as barrister Stephen Mason have been warning for decades* about the dangers of the ‘computer is always right’ presumption.

In November 2020, Mason was one of nine experts, led by Paul Marshall of Cornerstone Barristers, to warn the government of the danger of the presumption. Following a request by Alex Chalk MP, then a junior MoJ minister, the group drew up a report** in the wake of the Post Office scandal and other wrongful convictions. Their analysis: while a quest for convenience in evidence handling was understandable, ‘a presumption that the computer “works correctly” in itself is unsafe and, for anyone with expertise in the area, will appear wholly unreal, because it suggests a binary question of whether the computer is working or not’.

Where the Law Commission got it wrong, the paper said, was the assumption that computer errors are either caused by incorrect data being entered or are immediately detectable because they cause the machine to crash.

‘The reality is more complex. All computers have a propensity to fail, possibly seriously. That is to say, they have a latent propensity to function incorrectly.’
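To see why, consider a short illustration (ours, not the report’s). The Python sketch below sums 10,000 one-penny transactions using binary floating-point arithmetic, a classic latent defect in accounting software: the program does not crash, logs no error and produces a plausible-looking balance, yet the figure it reports is not £100.

```python
# A hypothetical illustration of a latent software defect: the program runs to
# completion, raises no error and prints a plausible-looking balance, yet the
# floating-point total is silently wrong.

def balance_with_defect(transactions):
    """Sum pound amounts using binary floating point (the latent defect)."""
    total = 0.0
    for amount in transactions:
        total += amount  # 0.01 has no exact binary representation
    return total

def balance_exact(transactions):
    """Sum the same amounts exactly, by working in integer pence."""
    return sum(round(amount * 100) for amount in transactions) / 100

txns = [0.01] * 10_000  # 10,000 transactions of one penny each

print(balance_with_defect(txns))  # approximately 100.00000000001, not 100
print(balance_exact(txns))        # 100.0
```

Nothing in the first total signals a fault; only comparison against an independent, exact calculation exposes the discrepancy, which is precisely why the authors argue that whether a computer ‘is working’ is the wrong question.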

The report’s recommendations included strengthened obligations for disclosure, covering, for example, all known error logs as well as evidence of reliably managed records of system changes. The disclosure exercise should, where possible, be ‘collaborative and co-operative between the parties’. When disclosure reveals shortcomings, the party seeking to rely upon the computer evidence would have to prove that none of the bugs or errors might affect the reliability of the material being relied upon. ‘The courts should consider what degree of doubt remains in the context of all the other available evidence,’ the authors suggested.

The Marshall report is understood to have been referred to the attorney general and the chair of the Criminal Procedure Rule Committee, but no action appears to have been taken.

But even ahead of the findings of the Post Office Inquiry, it is clear that a new approach to the handling of computer-derived evidence may be necessary. Since 2020 the problem has become more acute with the spread of ‘large language model’ artificial intelligence software, notorious for its ability to ‘hallucinate’ plausible but false output.

One item on the agenda is whether it is time to subject IT designers to the standards of ethical conduct required of other professions. ‘It seems an obvious step,’ said Emily Taylor, a solicitor and chief executive of Oxford Information Labs. Meanwhile, Harold Thimbleby, professor emeritus of computer science at Swansea University, called for legislation to require the registration of programmers of ‘serious systems’, along the lines of the medical profession. It is telling that one of the bodies calling attention to potential flaws in computer evidence is BCS, the Chartered Institute for IT.

Back in 2018, Sir Geoffrey Vos, then chancellor of the High Court, suggested in a lecture at the Law Society that in the near future the ubiquity of computer-generated evidence would almost eliminate court disputes over primary fact. At the very least, that prediction now looks premature.

*Stephen Mason and Daniel Seng, editors, Electronic Evidence and Electronic Signatures (5th edition, Institute of Advanced Legal Studies for the SAS Humanities Digital Library, School of Advanced Study, University of London, 2021) https://uolpress.co.uk/book/electronic-evidence-and-electronic-signatures/ 

**Paul Marshall and others, Recommendations for the probity of computer evidence (2020)
