More clarity is needed in the law governing liability for harms caused by artificial intelligence, Chancery Lane has said, amid the growing use of AI systems by professionals and as operators of driverless taxis prepare to offer services. Law Society chief executive Ian Jeffery welcomed an initiative by the UK Jurisdiction Taskforce, chaired by the master of the rolls, to tackle current uncertainties in the area.
The taskforce, part of the LawtechUK initiative, is due to publish a Legal Statement on Liability for AI Harms shortly following consultation on a draft statement released in January. It will cover liability for physical as well as economic harm by autonomous systems, defined as those where outputs are not programmed in advance.

Noting the statement's confirmation that AI has no legal personality in English law, Jeffery said: 'It needs to be easier to legally prove what caused AI harm and who should pay for it. Clearer guidance is needed on how professional standards evolve when AI tools are used by practitioners and how to manage systemic risks when AI is used in the justice system.'
Jeffery also welcomed the taskforce's publication of a guide on the control of digital assets - objects such as crypto tokens and smart contracts, recognised as a new category of legal 'thing' under the Property (Digital Assets etc.) Act 2025. The report, published last week, is designed to provide 'a practical explanatory guide, aimed at assisting court users and the judiciary when considering how legal principles ... can be applied as a matter of fact to various and evolving technologies.'
Jeffery said the clarification would help ensure that the UK remains a jurisdiction of choice for international digital disputes. 'We look forward to working with the taskforce to explore how providing further clarity on these two important policies creates a safer environment for people and businesses that can lead to more growth and jobs across the country.'