August has delivered a cascade of important tech developments affecting lawyers and law firms, despite it being the holiday season in Europe and the USA.

Jonathan Goldsmith

The most surprising was the joint statement published last week by 12 national data protection authorities from around the world (including the UK’s Information Commissioner’s Office) on data scraping and data protection.

At last, I thought! I hoped that the problem I highlighted recently would be dealt with: AI companies scraping data from lawyer and law firm websites (among many others) to train up their generative AI models, like ChatGPT, so that those models can do our work in the future - taking from us to harm us.

But it appears that the joint statement deals with a narrower, though still important, problem: the way that the personal information on social media sites is being scraped for future nefarious purposes (for cyberattacks, identity fraud, profiling and spying on individuals, political or intelligence gathering, and spam).

The joint statement calls on social media sites to protect their customers’ data better, and gives advice about how we can protect ourselves. We can’t do much individually, apart from reading the terms of service, thinking about what information we want to share, and understanding and managing our privacy settings.

However, the statement wants social media sites to do a better job of protection, for instance by using CAPTCHAs, blocking IP addresses that scrape, sending ‘cease and desist’ letters, and so on.

The statement is silent on the data scraping of websites to feed the insatiable training maw of generative AI models, presumably because the problem is so new. It is also notable that, although the UK signed the statement along with 11 others, neither the US nor any EU body signed it.

Speaking of the US, the American Bar Association (ABA) passed a resolution earlier this month urging lawyers ‘to keep informed about new and emerging technologies and protect digital products, systems, and data from unauthorized access, use, and modification’.

Aha! I thought again: a bar is taking action against data scraping for AI machine learning. But no, once more, the document, though very useful, has a narrower scope.

The background report to the ABA resolution provides very useful resources about the dangers law firms face from cyber-attacks, and mostly gives advice on how to deal with them.

The ABA approach to technology is different from ours (better, I think). The ABA is not a regulatory body, but establishes model rules for state bars to adopt, or adapt.

Some time ago, it amended a model ethical rule (Rule 1.1 of the Model Rules), which now says: ‘To maintain the requisite knowledge and skill, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology …’

Given the way that technology now affects our practices – and the ABA stresses the importance of protecting client data and confidentiality against bad actors – this seems very sensible wording. To date, 40 US states have affirmatively adopted a duty of technological competence, and the others have said that the requirement is implicit in the current rules (closer to the SRA stance).

Finally, in August’s cornucopia of IT developments affecting lawyers, the EU’s Digital Services Act became operational on 25 August for 19 very large online platforms and very large online search engines operating within the EU (‘very large’ means more than 45 million monthly active users in the EU).

Requirements under the act include a ban on targeting users with ads based on sensitive data (with an outright prohibition on targeted ads aimed at children), transparency requirements about how platforms’ algorithms work, new liability obligations for illegal content such as hate speech, and bans on deceptive design patterns. Sensitive data covers a broad range of attributes, including sexual orientation, religion, health history and political persuasion.

That in itself may seem to have little to do with UK law firms, but, as with GDPR, the overspill from an EU regulatory act may become global. In any case, the act applies to nearly all UK businesses which offer online services in the EU.

The part relating to the very large platforms started last week, but the rest will follow, including accountability for illegal content (companies must act quickly and efficiently) and increased transparency (who paid for the advertisements, the targeting criteria used, and the performance metrics).

So, we are engulfed by technological change, and it is our ethical duty (implicit or explicit) to keep up with it, or we will not be able to maintain confidentiality, nor advise our clients’ businesses properly … nor, to return to my favourite theme, be able to keep our hard-won output from being stolen and fed into generative AI, to enable it to undermine us.

Jonathan Goldsmith is Law Society Council member for EU & International, chair of the Law Society’s Policy & Regulatory Affairs Committee and a member of its board. All views expressed are personal and are not made in his capacity as a Law Society Council member, nor on behalf of the Law Society