A party in a tax appeal provided a tax tribunal with nine previous rulings to support her case - all of which turned out to be ‘hallucinated’ by an artificial intelligence program, a judgment has revealed. However, in Harber v Commissioners for His Majesty’s Revenue & Customs, the tribunal judge accepted that appellant Felicity Harber did not know that her ‘authorities’ - which she said had been provided by ‘a friend in a solicitor’s office’ - were fabrications.

Appearing at the First-tier Tribunal (FTT) over a penalty of £3,265, Harber claimed she had two bases for a ‘reasonable excuse’ for failing to notify HMRC of liability to £16,326 in capital gains tax: her mental health condition, and that it was reasonable for her to be ignorant of the law. Her written response to the tribunal included the names, dates and summaries of what she said were nine decisions in which the FTT had accepted these excuses.

The citations included ‘Jewell v HMRC (2016)’, in which a taxpayer ‘successfully argued that they had not been aware of the requirement to file a tax return as they had not received any correspondence from HMRC’, and ‘Baker v HMRC (2020)’, in which the FTT apparently found in favour of a taxpayer ‘who argued that their mental health condition, combined with other factors, had made it impossible for them to submit the return on time’.

At a reconvened hearing following the discovery of the fabrications, Fiona Man, for HMRC, said that this case had similarities with Richard Baker v HMRC [2018]: ‘However, not only was the year different, but Mr Richard Baker lost his appeal.’ Harber’s ‘citations’ frequently repeated identical phrases, and in six of the nine the phrase ‘found in their favour’ appeared with the American spelling ‘favor’.

According to the judgment, when Harber was asked if the cases had been generated by an AI system she said ‘this was “possible”, but moved quickly on to say that she couldn’t see that it made any difference’.

Tribunal judge Anne Redston accepted that Harber was not aware that the cases in the response had been fabricated and did not know how to locate or check case law authorities. The tribunal found as a fact that the cases had been created ‘by an AI system such as ChatGPT’.

Harber’s appeal was dismissed and the penalty confirmed. The judge noted that her decision would have been the same if Harber had not provided the false cases. ‘Nevertheless, providing authorities which are not genuine and asking a court or tribunal to rely on them is a serious and important issue.’

Even though misleading a court is likely to have less impact in a tax appeal than in many other types of litigation, 'that does not mean that citing invented judgments is harmless', she said.
