Generative AI is transforming the way lawyers work. But it poses multiple challenges for firms, from procurement to data security. Above all, how much human interaction should AI replace?

Joanna Goodman

The low down

It is becoming increasingly difficult to find a solicitor who has not tried out the ‘generative AI’ available to them. The publicly available iteration of ChatGPT has now completed tasks, both serious and tongue in cheek, for sole practitioners and magic circle partners alike. They are unsettled by its processing power and by how closely its output resembles work completed by real lawyers. But they also take comfort in AI’s hallucinatory errors. The technology, though, is developing fast. And while generative AI’s large language models have grabbed the legal headlines, the use and impact of AI in law will be much wider. Adoption is happening now.

Law firms have been using AI for e-discovery, deal due diligence, document analysis and automation for nearly a decade. But until ChatGPT was launched in November 2022, the legal sector was a follower rather than a leader in the AI space. Now law firms are, unusually, in the vanguard of change.

Last year, a Goldman Sachs report identified law as one of the sectors most likely to be disrupted by generative AI – and predicted that 44% of legal tasks could be replaced by AI applications. At around the same time, OpenAI’s large language model GPT-4 passed the US bar exam. Last year was significant in that all the major stakeholders – law firms and other legal services providers, legal tech vendors, law tech start-ups, legal academia, the judiciary, institutions and regulators – started exploring generative AI.

Early legal AI applications automated routine and time-consuming tasks that people in law firms did not want to do. They trawled through huge volumes of data, identifying relevant information for e-discovery, checking real estate details against Land Registry records, and identifying discrepancies in large datasets. The expansion of the lawtech start-up community saw contract analysis and compliance take centre stage. A multiplicity of ‘AI-powered’ offerings promised to revolutionise contracting and facilitate compliance.

While these applications did not fundamentally change legal services delivery, they were instrumental in making AI business as usual for law firms.

In just a few months, generative AI has had a massive impact on the legal sector because it is accessible and intuitive: ChatGPT is publicly available, free or nearly free, and requires no training. The technology immediately worked well with text; its capabilities include searching, summarising, and rewriting bullet points as emails and vice versa. Such accessibility sparked interest from lawyers who had become more engaged with technology since the pandemic. While 95% of respondents to a LexisNexis survey of 1,000 UK legal professionals think that generative AI will have a noticeable impact on the law, only 36% say they have used it in a personal or professional capacity.

Two-thirds (65%) of respondents believe that generative AI will increase efficiency, by supporting lawyers’ work. The most popular use cases are legal research (66%), briefing documents (59%) and document analysis (47%). Lawyers in larger firms highlighted due diligence (46%) and business development (40%).

Lawyers are aware, however, of potential issues around the accuracy and quality of large language models’ outputs and their tendency to hallucinate (produce responses containing fictitious information). This can be addressed by retrieval-augmented generation (RAG), whereby a large language model consults an authoritative knowledge base outside its training data, such as verified content from a legal publisher or from the user’s own organisation, before answering a query.
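The retrieval step described above can be sketched in a few lines. This is a minimal, illustrative sketch only: the tiny snippet store, the keyword-overlap scoring and the prompt wording are stand-ins for the verified content and production-grade retrieval a legal publisher or firm would actually use.

```python
# Minimal RAG sketch: retrieve verified snippets relevant to a query,
# then prepend them to the prompt so the model answers from trusted
# material rather than from memory alone. Illustrative only.

def retrieve(query, knowledge_base, top_k=1):
    """Rank verified snippets by simple keyword overlap with the query."""
    query_terms = set(query.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda snippet: len(query_terms & set(snippet.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query, knowledge_base):
    """Prepend retrieved, verified context so the model answers from it."""
    context = "\n".join(retrieve(query, knowledge_base))
    return (
        "Answer using ONLY the verified context below. "
        "If the context does not cover the question, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

knowledge_base = [
    "Limitation Act 1980: claims in contract must be brought within six years.",
    "CPR Part 36: rules on settlement offers and costs consequences.",
]
prompt = build_prompt(
    "What is the limitation period for contract claims?", knowledge_base
)
```

The instruction to answer only from the supplied context is what grounds the model: a response can then be checked against the cited source rather than trusted on its own authority.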

It is perhaps unsurprising, then, that 67% of survey participants have mixed feelings about the impact of generative AI on the practice of law, and that 90% report ethical concerns, which may relate to the technology being unregulated.

Even so, law firms are developing bespoke generative AI tools on OpenAI GPT-3.5 and GPT-4 and other large language models. Legal tech suppliers, meanwhile, from start-ups to major players, are developing new tools and applications and adding them to their suite of products. And lawyers are advising cutting-edge AI businesses, as well as being involved in (proposed) AI regulation and the growing number of legal disputes concerning data being used to train large language models. The most recent of these disputes concerned the case brought by the New York Times against OpenAI.

While AI may replace some legal support roles, it is already creating demand for new skills in data management and analysis, AI technology, prompt engineering and so on. On the downside, lawyers have hit the headlines in the UK and in the US for getting generative AI badly wrong.

AI adoption goes mainstream

As AI adoption took off across the legal sector, many firms introduced senior AI appointments and steering committees to develop policies and guidance on applying AI tools to firm and client data. Firms with strong technical capabilities developed their own generative AI tools. Travers Smith has produced several open source applications, while Dentons’ fleetAI legal research chatbots operate behind the firm’s firewalls, and include prompt engineering and output verification. For most firms, though, the path to adoption has been to buy rather than build generative AI capability.

Procurement challenges reflect the pace at which software is evolving and rapid market expansion. As budget is an important consideration, magic circle and international firms with well-resourced tech and innovation functions have been in the forefront of AI adoption. Allen & Overy was first off the mark, adopting Harvey AI, which assists with contract analysis, due diligence, litigation and regulatory compliance. It generates insights, recommendations and predictions based on data.

We have to educate and train people, as we would with any technology, but because generative AI is moving so fast, we also have to find a balance

Michael Kennedy, Addleshaw Goddard

AI adoption pyramid

The abundance of choice also necessitates a procurement and adoption strategy. Christopher Tart-Roberts, head of lawtech, and chief knowledge and innovation officer at Macfarlanes, explains: ‘It’s important to have a strategy in place given the myriad options available, and it’s also important to retain flexibility. Short-term priorities should be set, but longer-term plans (over two years) need to be flexible.

‘We’re pursuing a blended strategy across three core areas: new third-party applications, existing systems, and self/bespoke development. It’s not a “one technology fits all” approach; the right solution will depend on the use case. As we establish clear use cases, we’re assembling a suite of tools to give our teams the appropriate support in a way that delivers value.’

A recent panel focused on generative AI in law firm knowledge management identified the same three-tiered adoption pyramid: general applications such as Microsoft Copilot which incorporate generative AI into Microsoft 365 applications; legal-specific applications including legal information management, document management, e-discovery; and client-specific applications, which involve some bespoke development work.

Addleshaw Goddard started at the middle layer of the pyramid with a lengthy trial of legal-specific applications. Michael Kennedy, senior manager, innovation and legal technology, outlines the process: ‘Last year, we got a working group of 150 people representing all teams from across the firm. We conducted pilots of numerous tools and gathered user feedback, as well as running training sessions and engaging with the wider firm. We did a lot of testing ourselves, talked to vendors and we also did some internal build. We learned that while we can do a chunk of this ourselves, as we are not a software house, it is a slow process. So we purchased CoCounsel, from Casetext [a Thomson Reuters product], which handles document reviews and solved a few problems immediately. We also purchased Spellbook, a GPT-4 contract drafting tool which operates within Microsoft Word.’

Kennedy’s team is sharply focused on prompt engineering, ‘crafting a good input to get a better output, building templates and finding out what people use and what works well’. Kennedy is aware that he will need to reassess the position regularly to keep up with new developments. ‘As head of R&D, it is my job to be on top of the evolution [of generative AI] to stay ahead of the game.’
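The template-building Kennedy describes can be pictured as a small library of reusable prompts with slots, so lawyers supply only the variable parts and the proven wording is kept constant. The sketch below is hypothetical: the template name, fields and wording are invented for illustration, not taken from Addleshaw Goddard’s actual templates.

```python
# Hypothetical prompt-template library: named templates with {slots},
# filled in at the point of use. Keeping the fixed wording in one place
# lets a team refine "what works well" once and reuse it everywhere.

TEMPLATES = {
    "clause_summary": (
        "You are assisting a lawyer. Summarise the following {doc_type} "
        "clause in plain English, in no more than {max_sentences} sentences, "
        "and flag any unusual terms:\n\n{clause_text}"
    ),
}

def render(template_name, **fields):
    """Fill a named template; a missing field raises a KeyError up front
    rather than silently producing a malformed prompt."""
    return TEMPLATES[template_name].format(**fields)

prompt = render(
    "clause_summary",
    doc_type="lease",
    max_sentences=3,
    clause_text="The tenant shall keep the premises in good repair...",
)
```

Centralising templates this way also makes it easy to log which prompts are used most and which produce outputs that survive review, the feedback loop Kennedy’s team is after.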

Guidance, but no regulation (yet)

Matt Hervey

The Law Society guidance note on generative AI published in November 2023 concluded: ‘At present, there are no statutory obligations on generative AI technology companies to audit their output to ensure that they are factually accurate. Consequently, the use of these tools by legal professionals could result in the provision of incorrect or incomplete advice or information to clients...

‘As there is currently no AI- or generative AI-specific regulation in the UK, it is important that you understand the capacities of the generative AI tool that you plan to use…

‘Currently, the SRA does not have specific guidance on generative AI related to use or disclosure of use for client care. It is advisable that you and your clients decide on whether and how generative AI tools might be used in the provision of your legal advice and support.

‘While it is not a legal requirement to do so, clear communication on whether such tools are used prevents misunderstandings as to how information is produced and how decisions are made.’

The Bar Council recently issued guidance to barristers and the City of London Law Society (CLLS) appointed an AI committee. Matt Hervey (pictured), partner at Gowling WLG, who sits on the CLLS committee, explains that although there is no AI-specific regulation: ‘AI-based legal technology has to comply with all the generally applicable laws and regulations – such as copyright, the GDPR and, for European activities, the upcoming AI Act. It also needs to comply with regulations for legal services [from] the Solicitors Regulation Authority and the Bar Standards Board. Even well-resourced, international firms are only beginning to tackle the implications for IP, privacy, privilege and liability. An ongoing dialogue is needed between firms, the regulators and insurers. The explosion of potential use cases of generative AI requires firms to have effective and flexible generative AI policies in place to enable responsible experimentation and drive firm-wide AI literacy.’

Leveraging supplier relationships

Other key considerations include data security and systems integration. A quick way of addressing both is to use trusted vendors. ‘We are discussing gen AI with all our existing suppliers – it’s an important roadmap item for them,’ adds Tart-Roberts. ‘We need to understand how the AI functionalities across these systems will sit together and how they’ll co-exist with new and self-developed applications. Over time we expect to see a patchwork of AI at work and it will be increasingly important to be vigilant about overlaps so as not to confuse users or duplicate costs.’

Major legal tech vendors are adding generative AI features into their products. For legal publishers Thomson Reuters and LexisNexis, which already have large volumes of structured legal data, generative AI is an obvious enhancement. They have both invested heavily in developing new tools and applications.

Document management systems are a major tech investment that underpins firms’ knowledge management capability. iManage recently introduced generative AI applications. These include adding metadata to documents for classification, identification and extraction, and enabling firms to build their own models, using prompt engineering, to extract the information they need. Search results link back to source material, reducing the risk of hallucination. There is a strong emphasis on data security, with a choice of nine datacentres in different global locations. Knowledge lifecycle management is another focus, to ensure the quality of the data being interrogated.

Knowledge management teams are also leading adopters of cloud DMS NetDocuments’ OpenAI application PatternBuilder MAX. This helps build workflows by constructing and applying effective prompts to classify, extract and store information. It is hosted on Microsoft Azure, applies a zero-day retention policy, and the data it processes cannot be accessed by Microsoft or used as training data, explains chief product officer Dan Hauck. ‘We wanted to create a framework that allows firms to tailor PatternBuilder MAX to their needs, and feel confident pulling in content from the document management system, including sensitive information.’ Out-of-the-box apps include summarisation, revising contracts, and extracting key terms and using or modifying them, as well as custom apps for specific clients and cases.

These solutions enable risk-averse firms to adopt generative AI in a familiar environment and in a way that solves several key challenges. Their data is held securely by a trusted supplier. As the application works within their existing document management system, interrogating only the data they already have, there are no issues with data security or systems integration. Output can be checked against known content, avoiding hallucinations. A firm’s knowledge management capability is also enhanced, provided that the data is curated to ensure that it is up to date and relevant.

There is general agreement that making generative AI work effectively in the knowledge management function depends on effective data curation. Generative AI tools require an up-to-date, well-managed dataset to produce accurate, high-quality results. Litera’s latest Project Dragon, which supports deal negotiations by suggesting, collating and verifying relevant documents (for example from previous similar deals), automatically inputs new deal information. Other applications of generative AI for negotiation include Luminance’s Autopilot, which conducts contract negotiations automatically. However, this requires both parties to invest in the same software, again suggesting that a more standardised approach supports wholesale adoption.

Copilot takes off

While building bespoke generative AI applications on top of OpenAI’s GPT series means recruiting skilled developers, nearly all law firms are Microsoft houses. A major factor in generative AI is that Microsoft is a longstanding investor in OpenAI. On 15 January, Microsoft dropped the requirement that customers must buy a minimum of 300 Copilot licences. This rule had existed since Copilot for Microsoft 365 first became available in November 2023. The change opens up the application to small businesses and individuals.  

John Craske is director of innovation at CMS, one of 600 organisations participating in the Microsoft 365 Copilot Early Access Program. The firm is also using other legal-specific generative AI applications, including tools from lawtech start-ups Harvey AI and Robin AI, as well as OpenAI. ‘It seems too soon to bet on just one product,’ he observes.

Craske is encouraged by the change in lawyers’ attitude to AI. Previously they would treat AI pilots almost as competition, missing the point that if they work with AI, it can produce better results faster. His philosophy of ‘human + machine’, whereby he advocates applying AI to speed up tasks you already know how to do, addresses the hallucination risk: an expert would quickly spot a response that did not look right. CMS’s 300 Copilot licences are in such demand that there is a waiting list, and they are used by lawyers from trainee to partner.

‘We keep as close as we can to the market and when we see something interesting we try it,’ says Craske. ‘For a few years now, we have had product managers for legal tech, who are effectively internal customer success people. Their job is to own the product, look after the supplier relationship and drive adoption.’

Proceed with caution

Ultimately, although generative AI is transforming many aspects of legal work, the human nature of the profession requires an element of caution. ‘As the tech team in a law firm we have to get this right,’ observes Kennedy. ‘We have to educate and train people, as we would with any technology, but because generative AI is moving so fast, we also have to find a balance; we can’t stand still but we also have to go carefully because of the nature of our business. Even as the less risk-averse people in a law firm, because we are in tech and innovation, we are still aware that the stakes are high, and hopefully we are all sensibly pulling in the same direction.’
