First use of AI-based technology in an Old Bailey murder trial is rightly regarded as a watershed moment, but after many false starts, veterans of past ‘hype cycles’ are tempering their expectations
After decades of false starts, 2023 could be the ‘breakout year’ for artificial intelligence in law. Or more precisely, a breakout year for the tandem technologies of natural language processing and machine learning. Breakthrough developments include the first use of AI-based technology in an Old Bailey trial and the granting of the first licences for the bulk downloading of court judgments.
Meanwhile, excitement continues to build about the potential of the ChatGPT family of ‘chatbots’, which draw on vast databases to formulate apparently intelligent text in response to new challenges – including answering bar exam questions (see LegalTech, p20).
The Gazette can reveal that the first Old Bailey case supported by commercial legal machine-learning technology was last year’s trial of James Watson for the 1994 murder of six-year-old Rikki Neave in Peterborough. The reopening of the case after more than two decades brought massive evidence-handling challenges, according to barrister Sally Hobson, a serious crime expert at The 36 Group, who defended Watson.
‘There were hundreds of witnesses and statements,’ she said. ‘It was pre-mobile phones so there was no original electronic evidence. In some ways it was a very old-fashioned trial.’
To handle more than 10,000 files the defence team looked for a document-handling platform. It picked a system from a UK AI startup, Luminance, whose machine-learning system is mainly used in commercial law for e-discovery. The developers claim the software is capable of ‘unsupervised’ learning: it can identify patterns in vast datasets without being told what to look for. This enabled it to collect data from multiple sources about key pieces of evidence, such as a pair of black shoes recovered from a bin and handwritten scene-of-crime officers’ notes.
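The vendor does not disclose how its system works, but the underlying idea of ‘unsupervised’ learning can be illustrated with a toy sketch: grouping witness statements by shared vocabulary, with no labels telling the software what to look for. The documents, similarity threshold and clustering approach below are entirely hypothetical, chosen for illustration only.

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words vector: word -> count, case-folded."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def cluster(docs, threshold=0.3):
    """Greedy clustering: attach each document to the first cluster
    whose seed document it resembles, else start a new cluster.
    Returns lists of document indices -- no labels required."""
    clusters = []
    for i, doc in enumerate(docs):
        vec = bow(doc)
        for c in clusters:
            if cosine(vec, bow(docs[c[0]])) >= threshold:
                c.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Hypothetical evidence snippets: the first two mention the same item.
docs = [
    "witness saw a pair of black shoes in the bin",
    "scene of crime officer noted black shoes recovered from a bin",
    "statement describes the route taken to school",
]
print(cluster(docs))  # the two shoe-related statements group together
```

Commercial systems use far richer representations than raw word counts, but the principle is the same: statistical regularities, not human-supplied rules, decide which documents belong together.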
At the outset, Hobson said she was ‘quite sceptical’ about the technology because ‘I don’t know how it works’. However, she estimated that it saved the defence team four weeks’ work. In the event, Watson was convicted and sentenced to life imprisonment. An appeal has been lodged.
While the Neave murder trial had unique features, there seems no reason why machine learning cannot be deployed to speed up the handling of more criminal cases – as it is already doing for processes such as contracting in the commercial world. In a new book on the subject*, Professor Martin Ebers of the University of Tartu, Estonia, says that self-learning AI systems are already making contracts on behalf of businesses. To critics of algorithmic ‘black boxes’ (see Practice Points, p22), Ebers argues that the predictions and decisions made by AI systems ‘are generally more accurate and less biased than those made by humans’.
Ebers acknowledges that the rise of AI in contracting processes ‘can exacerbate existing power asymmetries, as parties to a contract can leverage them to draft lopsided contracts and gain a better bargaining power’. This poses the question of whether contract law needs to provide new corrective mechanisms to address such imbalances.
No doubt recalling past ‘hype cycles’, legal AI experts are wary of predicting an imminent ‘breakout’. For a start, cautions Daniel Hoadley, head of data science and analytics at London firm Mishcon de Reya, the vast majority of commercial legal AI systems today deal with transactional data, which is structured and thus suited to training machine-learning systems. The unavailability of real-world unstructured legal data remains a problem for anyone chasing the dream of creating a robot lawyer. One step forward, however, could be the release of the National Archives’ new case law database for machine analysis. The National Archives told the Gazette that seven such transactional licences have been granted since they became available last year.
But can ChatGPT-type systems, drawing on the entire internet, fill the gap? Hoadley is excited by the potential but says practical adoption poses problems. ‘You can only leverage this type of AI if you’re willing to send your client data over the wire,’ he notes. He is also sceptical of how useful a system trained, presumably, on mostly North American data would be in specialist areas of England and Wales law. ‘As interesting as the GPT family are, and they are very interesting, you’re not going to get many firms deploying them on anything relying on client data.’
* Contracting and Contract Law in the Age of Artificial Intelligence, edited by Martin Ebers, Cristina Poncibo and Mimi Zou, Bloomsbury