We have become accustomed to the idea of post-truth – one of the more unsettling elements of modern life. Many claim that the institutions on which we should rely for facts, for instance mainstream media, are lying to us. We are assailed by fakes, political exaggeration, foreign government bots and conspiracy theories. The BBC has set up a Disinformation Unit. 

Jonathan Goldsmith

Our own government is testing us sorely. The Supreme Court has said that Rwanda is unsafe, based on evidence, but the government is legislating to say that it is indeed safe. We tread carefully. What should we believe?

Two recent stories also highlighted the need for care in everyday legal life. Neither comes from post-truth politics, but both caution against naivety.

The first comes from Canada, and is from the light end of the spectrum. You can bookmark it as another case of AI talking rubbish – or, more politely, AI hallucinations.

The facts were these. A Canadian found out that his grandmother had died and immediately booked a flight from Vancouver to Toronto. Air Canada offers reduced bereavement fares to those who need to travel because of a death in the family. A chatbot on the Air Canada site told him that he could claim this discount within 90 days of reservation, which he did. (Fortunately, he took a screenshot of the page – sound advice to all of us in similar circumstances.) It was only when claiming within the 90 days that he discovered that Air Canada’s real policy is that you have to claim before your travel is completed. The chatbot had made up the 90-day policy.

The case report says that Air Canada tried to wriggle out of responsibility: ‘In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions.’ This line of argument was easily dismissed, indeed called a ‘remarkable submission’. The judgment went on to say: ‘While Air Canada argues Mr Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled “Bereavement travel” was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website.’ Amen to that.

This is the first lesson in ‘Be careful what you believe’. The second comes from an entirely different area of law and practice, and is much more serious.

The Financial Times broke a story earlier in the month about how one or more sanctioned Iranian entities are using UK shell companies to evade sanctions through well-known UK high street banks. The shell companies themselves are not sanctioned and look innocent. The shares of the banks concerned dropped on the news.

I was taken by the Mail’s headline on this: ‘Can we really expect British banks to police rogue states? It is unrealistic and an outsourcing of responsibility, says the City after claims Iran used four of the UK’s biggest lenders’. A senior banking executive is quoted as saying: ‘Governments are outsourcing too much of their responsibilities for investigating hostile powers on to us. But we don’t have access to the same information as the security services. It’s unrealistic.’ Well, amen to that, too.

This echoes what I wrote a couple of years ago: it is unrealistic to expect law firms to see behind hidden and complex structures – innocent in themselves – and to know when money laundering has taken place, when the players concerned are based in countries far away and the events are opaque, barely reported and historical. It is an absurdity on which the government has repeatedly doubled down. Maybe things have a chance of changing now that big banks are complaining.

As many lawyers’ organisations have argued, the AML legislation requires lawyers not to believe their own clients, not to believe in innocent-seeming structures, and to expend considerable resources in digging up dirt behind clients’ backs. It is true that bad players dazzle professionals with a mesmerising pattern of fake structures to hide ill-gotten gains. It is also true that money laundering is an evil. Yet it should not follow that the responsibility for penetrating such structures should be fragmented so that each individual lawyer and law firm has to undertake its own detective work, particularly where the players come from countries like Iran and Russia, for which our own government has so much intelligence already at its fingertips.

I find it distressing that we are less and less sure about what is true. We are told to test everything, and we are on our own in testing it. If we get it wrong, we often bear the consequences.

This article was not written with the help of AI.


Jonathan Goldsmith is Law Society Council member for EU & International, chair of the Law Society’s Policy & Regulatory Affairs Committee and a member of its board. All views expressed are personal and are not made in his capacity as a Law Society Council member, nor on behalf of the Law Society