A recorder has ruled that a barrister who presented the court with incorrect information produced by AI searches should be named. Recorder Howard, sitting at Bournemouth Family Court, said that Layla Parsons had held herself out as a lawyer offering paid legal work to the public, so it was in the public interest to name her.

It was agreed by all the advocates in A, B, C, D, Re (Extension of assessment; Use of AI: hallucinations) that Parsons had submitted a skeleton argument citing four cases or propositions that were not real. She accepted that she had used a widely known AI tool to help prepare the skeleton and apologised for inadvertently misleading the court. But she urged the court not to publish either the judgment or her name, saying this would amount to ‘character assassination’ which would place her at risk of physical and psychological harm.

The recorder ruled, however, that the judgment should be published, as it was another example of AI hallucinations misleading the court after a person presenting a case relied on an AI tool without regard to their duty to check the citations.

He added: ‘There is a real and not fanciful possibility that Ms Parsons will in the future offer legal services to members of the public. I consider that this factor, and the need for any person engaging the services of Ms Parsons in legal proceedings, to know that she has misled the court (albeit unintentionally) and does not in my judgement properly understand what she has done wrong is a strong and overwhelming factor in favour of naming Ms Parsons.

‘When I balance that factor against the risks Ms Parsons asserts, I consider it strongly outweighs the risks to her, and that naming her is a necessary and proportionate interference with her right to family life.’

The court heard that Parsons had acted as a lay advocate for her friend, a mother of four who was involved in Children Act proceedings. In the proceedings, Parsons had sought to support the mother and had put herself forward to care for all the children if they could not remain in the mother’s care.

Parsons works as a therapist but also held herself out as a lawyer. She is an unregistered barrister who had done paid legal work as recently as last November. She had made applications in the case to be joined as a party, for a special guardianship assessment, and then to be appointed as an interim kinship carer.

Following a hearing in January, at which it was discovered that the false cases had been cited, the recorder invited written submissions on whether the judgment should be published and Parsons named. In her submissions, Parsons offered immediate apologies for having misled the court by relying on the output of the AI tool, saying she was now fully aware that the tool may give incorrect information.

The recorder absolved her of any intention to mislead the court, but said he remained concerned that Parsons ‘minimises the seriousness’ of what had happened. He also rejected her suggestion that criticism of AI risks setting a harmful precedent for disabled litigants in person and would discourage access to justice.