There is nothing inherently improper about augmenting legal services with AI tools, but they must be properly understood by individual practitioners and used responsibly. That is the gist of the latest guidance to the legal profession on the use of so-called large language model (LLM) software, from the bar's representative body. It follows the previous publication of advice by the Law Society and HM Judiciary.

Like its predecessors, the Bar Council's guidance 'Considerations when using ChatGPT and generative artificial intelligence software based on large language models' warns against taking the output of generative AI systems at face value. 

It stresses that such systems do not analyse the content of data but rather act as 'a very sophisticated version of the sort of predictive text systems that people are familiar with from email and chat apps on smartphones, in which the algorithm predicts what the next word is likely to be'.

The guidance identifies three key risks with the technology:  

  • Anthropomorphism: systems are designed and marketed to give the impression that the user is interacting with a human, when this is not the case. 
  • Hallucinations: outputs which may sound plausible but are either factually incorrect or unrelated to the given context.
  • Information disorder: the ability of ChatGPT to generate misinformation 'is a serious issue of which to be aware'.

Barristers are warned to be 'extremely vigilant not to share with a generative LLM system any legally privileged or confidential information (including trade secrets), or any personal data, as the input information provided is likely to be used to generate future outputs and could therefore be publicly shared with other users.'

Irresponsible use of LLMs can lead to 'harsh and embarrassing consequences', including claims for professional negligence, breach of contract, breach of confidence, defamation, data protection infringements, infringement of IP rights and damage to reputation; as well as breaches of professional rules and duties, the guidance states. 

Bar chair Sam Townend KC said: 'The growth of AI tools in the legal sector is inevitable and, as the guidance explains, the best-placed barristers will be those who make the efforts to understand these systems so that they can be used with control and integrity.'
