Two immigration solicitors will be investigated by the Solicitors Regulation Authority after a tribunal found that they presented false case citations.

Fiona Lindsley, judge of the Upper Tribunal, said in UK v Secretary of State for the Home Department (AI hallucinations; supervision; Hamid) that judges’ time was being wasted by searching for authorities generated by AI and not checked by authorised staff.

The problem is now so common that the claim form for judicial review has had to be amended to require lawyers to sign a statement of truth saying that cited authorities actually exist.

In a judgment concerning solicitors on different cases, handed down in November but published this week, Lindsley said it was no excuse if work had been carried out by junior staff. ‘A solicitor or other legal professional who delegates their work to another fee-earner remains responsible for the supervision of their work and for ensuring its accuracy,’ she said. ‘Such supervisors must ensure that fee-earners under their supervision are aware of the dangers of using non-specialist AI for legal research and drafting. Failures to do so, or to undertake appropriate checks on the drafting of fee-earners is likely to result in a referral to the Solicitors Regulation Authority or other regulatory body.’

Lindsley also highlighted the dangers of uploading confidential documents into publicly accessible AI tools such as ChatGPT.

‘[Doing so] is to place this information on the internet in the public domain, and thus to breach client confidentiality and waive legal privilege, and any such conduct might itself warrant referral to the regulatory body and should, in any event, be referred to the Information Commissioner’s Office.’

In one of the highlighted cases, Tahir Mohammed of TMF Immigration Lawyers was responsible for drafting an application for permission to appeal. The application cited cases that were either non-existent or irrelevant.

Mohammed, who reported himself to the SRA, admitted he had put emails explaining Home Office decisions into ChatGPT to try to improve them.

In the second case, Zubair Rasheed of City Law Practice Solicitors and Advocates had signed a claim form which was placed before Upper Tribunal Judge Blundell. Several of the authorities cited were false or irrelevant: one misrepresented a case overseen by Blundell himself. Rasheed said the grounds for judicial review had been drafted by a part-time trainee who did not verify the references.

Lindsley said the issue was not just about citations and what she called the ‘naïve’ use of generative AI, but the lack of proper checks on junior lawyers’ work.

Rasheed asked the tribunal not to report him to the SRA, but Lindsley said the inclusion of false citations and his failure to supervise the work undertaken by others necessitated a referral.