Cast your mind back to the stuffy lecture theatre of your student days. Do you remember the fascinating case of Varghese v China Southern Airlines, about injury to an airline passenger? No? Perhaps you were snoozing at the back of Thursday morning tort law. How about Shaboon v Egypt Air? Still drawing a blank? The reason these cases are not ringing any bells is that they do not exist. They are straight from the overactive imagination of an invisible robot known as ChatGPT. It tries its best to be helpful, but tends to get a little carried away – and it will not let mere facts stand in the way of a nice answer. 

Rachel Rothwell

These fictitious cases were cited by a hapless US lawyer who had used the artificial intelligence tool to find cases in support of his submissions. Mortified by what had happened, the lawyer explained that he had learned about the technology from his college-age children. He had not understood that it was not acting like a search engine and that it could make up results. He and a colleague were fined by the court – which was no doubt keen to set an example – for committing ‘acts of conscious avoidance and false and misleading statements to the court’.

In this instance, the bogus cases came to light because they were spotted by the trial judge. But let’s face it, plenty of fake cases have probably slipped through the net by now in case submissions on both sides of the Atlantic – and in plenty of other jurisdictions besides.


If even qualified lawyers are turning to AI, then it is inevitable that litigants in person will be using it. Indeed, the Gazette reported in May that a LiP was found to have submitted AI-invented cases, or fake passages from real cases, in support of their claim.

Clearly, lawyers have coped perfectly well until now without needing an artificial brain to draft submissions or research case authorities for them. But this is a competitive world, and if one firm is gaining an edge by using AI to carry out tasks more quickly and cheaply, others will soon be compelled to do so. In the litigation context, the courts are already getting wise to this. As the Gazette has also reported, a new practice direction in a Canadian province now requires lawyers to tell the court if they have used AI, and how. Other jurisdictions will surely follow suit.

Bogus cases highlight the overactive imagination of AI, and the danger of relying on it too much. But this is one of the few areas where it should be relatively easy to stamp out the problem, at least in the context of legal proceedings – because it is usually straightforward for lawyers to check from an official source whether the original judgment actually exists. Personally, I worry more about the danger of AI creating misinformation more broadly, both in the legal sphere and beyond.

My concern is that many AI hallucinations will not remain one-off anomalies, but will ultimately find their way on to the internet – for example in cheap, AI-generated marketing articles that are posted online without proper vetting by knowledgeable humans. Once we reach the stage where AI has unlimited access to the live internet to gather its information, these made-up facts, carelessly posted online, will become source material for the next AI task. Before you know it, the fabrication – in the legal context, a fake legal principle or fake case perhaps – has been referred to in multiple sources and looks deceptively real. Forget the ‘common law wife’ – consumers could be given a whole host of new legal myths to believe in.

On the face of it, the development of AI should empower consumers to find out more about the law for themselves. But until the tech geniuses can find a way to stop their AI prodigy from telling porky pies, it seems to me that free legal information – along with free information in many other spheres – will need to be treated with a whole lot more circumspection. If consumers want to find out about the law from a reliable source, there will be one place to go: the qualified lawyer.


Rachel Rothwell is editor of Gazette sister magazine Litigation Funding, the essential guide to finance and costs.

For subscription details, tel: 020 8049 3890.
