Calls are growing for judges to stop reproducing fake case citations in full when ruling on how those citations have been misused.
Lawyers and commentators warn that when judges cite the AI-generated cases in full, they risk inadvertently embedding such bogus ‘authorities’ as precedents, creating self-propagating misinformation.
The warnings follow an increasing number of examples, both at home and abroad, of litigants and lawyers presenting case law based on AI searches. These have sometimes produced entirely fake, or hallucinatory, case citations, which some judges have quoted in full.
Matthew Lee, a barrister at Doughty Street Chambers and founder of the Natural and Artificial Intelligence in Law blog, warned last year that there was a tension between transparency and protection, where judges instinctively want to set out fabricated authorities in full but risk unintended consequences.
Lee told Counsel magazine: ‘Well-intentioned judges often cite hallucinated cases and their erroneous legal principles in full within official judgments to show the extent of the problem to those reading. However, judges may be inadvertently exacerbating the issue because those AI-generated inaccuracies are being integrated into the established legal canon indirectly.’

Writing on LinkedIn this week, Jim Sturman KC of 2 Bedford Row echoed Lee’s concerns, suggesting that little has changed since. ‘The time has come for English judges to stop reciting the names of bogus cases in judgments. Doing so risks embedding bogus “authorities” in the corpus of law reports,’ said Sturman. ‘It would be safer, and easy, to write “fake case 1” etc.’
The question of whether quoting false case citations in judgments helps to entrench them has yet to be addressed in the UK courts, but there have been developments elsewhere.
Ruling last year in JML Rose Pty Ltd v Jorgensen, in the Federal Court of Australia, Justice Wheatley said it had become apparent during proceedings that one of the parties was using a form of generative AI to help with written and oral submissions. Purported quoted passages did not exist and were likely the product of ‘hallucinations’, the judge said.
She added: ‘There has been an approach, which I will adopt, of redacting false case citations so that such information is not further propagated by AI systems.’