Digital literacy is an integral part of education for the lawyers of tomorrow. Law firms must be attuned to the impact of AI on juniors

There have been recent news stories about the use of GenAI tools in universities. While their rules and policies on GenAI use vary, there is general agreement about the need to improve digital literacy among students and academic staff. The Higher Education Policy Institute (HEPI) survey 2025 found that student use of AI has surged, with 92% of students using AI, up from 66% in 2024, and 88% using GenAI for assessments, up from 53% in 2024. 

Joanna Goodman

AI and data privacy journalist and author Kashmir Hill wrote in the New York Times that students who discovered their lecturers and supervisors were using GenAI to produce coursework and to mark assignments – assignments the students themselves had to complete without GenAI – argued that their course fees should be [partially] reimbursed to compensate for the undisclosed use of GenAI. Another article highlighted students’ anxiety about AI detection software misidentifying their original work as AI-generated.

A converse argument is that if university lecturers and supervisors do not use GenAI, are they really qualified to teach students about using it effectively? GenAI is a sensitive topic for legal education, which is predicated on applying legal knowledge to different scenarios and on the veracity and accuracy of legal citations.

Recent judgments have shown that judges understand the value and risk associated with GenAI. While judges may use it to assist in producing judgments, barristers are regularly caught presenting AI-generated false citations (see box, opposite). This indicates a need for better digital literacy across the profession – and including it in legal education is one way of addressing the problem.

Danon Pritchard, director of digital literacies and a senior lecturer at The City Law School, specialises in AI and technology. Her role highlights the importance that the university places on digital literacies. CLS has introduced new modules at graduate and undergraduate level to support students’ understanding of tech in the legal workplace and the evolving implications of GenAI in the practice of law. ‘We are equipping students to use [GenAI] tools responsibly and ethically, and to critically evaluate their output. Critical evaluation is a cornerstone of legal practice and recent controversies illustrate why digital literacy combined with an understanding of the broader ethical and regulatory context and the ability to cite real authority is so important.’

A large part of the message to students is not to over-rely on GenAI tools. ‘Law students still need to know what the law says and be able to apply it critically, citing robust references, even when they are using GenAI tools,’ adds Pritchard. CLS has a significant programme of staff training in GenAI as well as a sharp focus on students’ digital wellbeing. ‘Students are developed to be digitally agile so that they can deal with change, and also focus on their digital wellbeing.’

What are students doing?

How do law students really use GenAI in their legal studies? I asked a student at the University of Bristol Law School who was attending the WE R LegalTech Festival.

The university’s policy on academic integrity prohibits GenAI, beyond ‘generating the occasional short phrase within a sentence and checking basic grammar and spelling’.

While there is scope for flexibility – ‘some assessments may allow more comprehensive use of these tools, but this will be detailed in your assessment instructions’ – failure to comply with the policy constitutes academic misconduct.

While law students are not using GenAI to write their essays and assessments, it is widely used as a learning tool. This is for basic research – in the knowledge that its output definitely needs to be fact-checked – and for exploring generalist questions of law.

More controversially, students use GenAI as an additional tutor, helping to boost their grades. You can upload an assignment to ChatGPT or Claude and ask it for critical feedback and suggested improvements, with prompts like, ‘you are a law professor at a leading UK law school. What’s the weakest part of this essay and how can I improve it?’

Students generally understand prompting because they turn to GenAI models for everyday queries, for organising their schedules, creating and editing videos and podcasts. ‘Vibing’ between multiple GenAI models comes naturally to them. And as GenAI is conversational and you can add personas, like law professor, proofreader or friendly adviser, students are increasingly using it as a therapist or for emotional advice.

Winners and losers

Legal AI hit national headlines this month, with the launch of the ‘first AI law firm’ authorised by the Solicitors Regulatory Authority. While this landmark decision may open the floodgates to more virtual robot lawyers – and having the SRA on board is a big step towards normalising legal AI – Garfield AI is not autonomous. The documents it produces are checked by its co-founder, who is also on hand to assist clients who require further advice (so it is also a business development tool).

 

In the same week, two High Court cases were derailed by fake citations. Barrister Sarah Forey cited five fake cases that Mr Justice Ritchie said would potentially constitute negligence if she had ‘obtained the text from AI and failed to check it’. He ordered his judgment to be sent to the Bar Standards Board and the SRA. In a second case, a former solicitor presented 25 fake cases to support his appeal against strike-off.

Respecting boundaries

Patrick Grant is associate professor of legal technology at the University of Law, which is significantly less restrictive than Bristol University in its policy towards students using GenAI. ‘We actively encourage students to use GenAI to assist with research, organisation and editing but not to complete assessed work,’ he says. ‘Importantly, we actively encourage students to critically review the output and any references, just in case the GenAI has hallucinated them. A great example is when students learn to draft clauses in LPC/SQE classes. Often they will spend an hour writing a clause, editing and adapting it and then will be let loose onto GenAI to see what it comes back with. Nearly 100% of the time students will agree that the output from say, ChatGPT, would serve as a first draft but nothing more.’

Grant explains further: ‘Effectively, this is all about drawing a clear boundary between using GenAI as an assistant and using GenAI to “outsource work”. For law students with a view to joining the professions this is especially important as all the knowledge and understanding they build at university will come up again in the SQE or Bar exams. Furthermore, at ULaw, our students are being trained not just in knowledge, but in professional judgement and ethical reasoning – neither of which can be outsourced to an app.’

Digital literacy as a differentiator

'If we look back just five years, technology was about availability, support and infrastructure management. Now, all our infrastructure is in the cloud, so it’s not a differentiator'

Eddie Twemlow, Burges Salmon

Law firms are also experiencing a generation gap when it comes to digital literacy and GenAI adoption. Eddie Twemlow, head of technology and operations at Burges Salmon, identified this while implementing a central part of its digital enablement strategy – the firm-wide roll-out of Microsoft 365. ‘We started with a group of digital champions, and we approached the roll-out in a traditional way – we gave people the product and some training and then we left them to get on with it.’

But while that may be sufficient for tech champions, Twemlow realised that, across the business, some people were less comfortable than others with new concepts like virtual assistants. They required ongoing training, support and communication from the tech and innovation functions, including the tech champions in each department. ‘If we look back just five years, technology was about availability, support and infrastructure management. Now, all our infrastructure is in the cloud, so it’s not a differentiator.’ Now, it seems, adoption is the differentiator, and this requires digital literacy. 


As part of the Microsoft 365 implementation, Twemlow is tracking usage by department and role. ‘I track the number of prompts per day and share the data as it helps people understand where we need to improve, and how successful a particular intervention is. It was noticeable that our trainees adopted the new technology faster than the rest of the firm.’

The GenAI generation gap is partly explained by the fact that the trainees are digital natives, but adoption is also role-specific. Furthermore, trainees use GenAI technology in a different way from more experienced colleagues. ‘They are in a space where they are constantly learning, whereas a senior lawyer already knows how to do the things they are turning to GenAI to help with. They use GenAI as a learning tool, rather than a process tool,’ says Twemlow.

He acknowledges that GenAI is, to some extent, replacing supervisory learning: senior lawyers are using AI tools for tasks that were previously delegated to junior lawyers as part of their development, so it is replacing elements of both roles.

This pattern raises new challenges. Until now, clients have been paying for trainees to learn by carrying out lower-level tasks. Now that those tasks are automated, law firms need to find ways of training junior lawyers without passing the cost on to clients.

Another challenge is that GenAI tools are developing so fast that adoption is no longer about learning new systems and applications. It is about building resilience, in the form of digital literacy, so that lawyers can constantly adapt to ever more sophisticated technology tools.
