A litigant in person tried to present fictitious submissions in court based on answers provided by the ChatGPT chatbot, the Gazette has learned. 

The civil case, heard in Manchester, involved one represented party and one unrepresented. Proceedings ended for the day with the barrister for the represented side arguing that there was no precedent for the case being advanced.

The Gazette understands that the following day, the LiP returned to court with four case citations, each backing the point they were trying to make.

On closer inspection by the barrister, it transpired that one case name had simply been fabricated, while the other three were real cases whose cited passages bore no resemblance to the actual judgments. In all four citations, the paragraphs quoted were entirely fictitious, though they appeared legitimate.

It is understood that the judge quizzed the litigant in person, who admitted they had asked the AI tool ChatGPT to find cases that could prove their argument.

The chatbot then appears to have drawn on a bank of case names and generated excerpts, purportedly from those cases, that answered the question put to it. The judge accepted the misleading submissions were inadvertent and did not penalise the litigant.


The case highlights the potential influence of AI in court proceedings, particularly when one or both parties are unrepresented. There have been reports from Colombia and India this year of judges using ChatGPT to help make rulings.

Websites dedicated to artificial intelligence in law boast that they can create legal documents online and generate legal text that is both accurate and natural-sounding. Many firms are already using 'large language models' such as ChatGPT to create legal marketing content and draft legal documents.

The Judicial Office says it provides training resources, including a handbook for litigants in person, to equip judges and LiPs with the necessary information to ensure parties understand court proceedings and what is expected of them. Any use of false documentation is taken ‘extremely seriously’ and action is taken where necessary.

A spokesperson said: ‘The Judicial College regularly reviews training content and guidelines in line with modern developments in legal practice and legislation. Appropriate updates are made to ensure we provide the best support to judges, including guidance on how to respond to developments in technology.’

 
