UHY Ross Brooke Chartered Accountants

The risk of AI hallucinations in tax cases


A number of tax tribunal cases over the last three years or so have revealed the risks of using AI to support cases. As a result, professional institutes are publicising a new section on AI use within the Professional Conduct in Relation to Tax (PCRT) guidance.

By David Jones – Tax Director

In a fairly recent case, a judge observed: “These cases do not exist and are classic AI hallucinations.” Ouch!

While I am the last person who could be described as a ‘tech early adopter’, I have to say that, yes, even I have become aware of, and started to use, the real power that AI-enhanced or AI-generated services offer. And for tedious chores, like preparing a ‘first-cut’ thumbnail benchmarking review, the power of AI is compelling.

However, being old-school, I always check the results in detail, and I have to tell you that it is surprising how often the source and/or supporting documentation is simply not available, or not without an inordinate amount of effort, if indeed it exists at all. Of course, a first-cut benchmarking exercise is one thing. Standing up in court and relying on AI-generated precedent is quite another.

A number of tax tribunal cases in the last three years or so have revealed the risks that this strategy presents. It is now so dangerous (and by that I mean prejudicial to winning, with a topping of Court sanction) that many of the professional institutes are publicising a new section within the Professional Conduct in Relation to Tax (PCRT).

The problem: temptation

If you are a taxpayer and believe you are competent enough to sustain a challenge against HMRC in the later stages of an enquiry, or even in front of the Tax Tribunal, then, genuinely, you could be right, and I wish you good luck. But you had better be sure the support for your arguments is robust.

Consider the case of Felicity Harber, heard back in 2023. Mrs Harber, appearing in person, referenced several previous Tax Tribunal decisions that she contended supported her appeal. At first glance, her precedents seemed plausible; even at second and third glance, HMRC, strangely, stood by as the appeal proceeded.

The Tribunal, however, had trouble identifying the cases to which she referred, as did HMRC subsequently. When challenged, she admitted that the cases might have been sourced by AI, as the submissions had been prepared by “a friend in a solicitor’s office”. So Mrs Harber had very nearly done the right thing (seeking ‘professional’ advice), but she had been left high and dry. The Tribunal also considered that its time had been wasted, since it was diverted into searching for matching case names and the like. That was a black mark against Mrs Harber, although the Judge was sympathetic. She lost her appeal nonetheless.

Since then, there has been a regular trickle of similar instances. Most recently, Elden v HMRC (2026) highlighted a different variation: the submissions cited correctly named earlier cases, but with extracts that were inaccurate, irrelevant or unsupported. The Judge issued a clear warning: the responsibility for checking references lies with the human relying on them.


The conundrums

So, your mate in the office is looking for quick tax advice, or they present an AI-generated synopsis and ask:

“It looks okay… shall I go with it?”

You already know the answer. The first input may seem innocuous, but it could set a chain of events in motion that unravels at any moment. Used sensibly, AI may help you start a conversation or frame initial thinking, but it will require substantial reinforcement before it should ever reach an inspector’s desk.

And don’t forget there is a massive distinction between a generic search in a public AI engine (which may learn from your prompts) and a private environment designed for confidential client work. Mix those up, and you introduce a whole new risk: confidentiality.

Guidance on the use of AI

The new PCRT guidance on the use of AI is required reading for us all, whether you are in the profession or an unrepresented taxpayer. You can access it on the ACCA or CIOT websites, amongst many others. As the ACCA say in their topical guidance dated 19 January 2026:

“In all cases it is important to remember that outputs from AI tools should not be used as authoritative tax or legal advice, with review to be undertaken by a qualified professional in the specific context of the client to whom the advice is being provided”.

To me, this means targeted use and advice, given in the knowledge of all the salient facts and with time allowed for the said professional review to be performed.

Think on!

The next step

As ever, we are here to help. If you need professional help with your tax affairs, or with a tax investigation, please do get in touch.


Talk to us

Newbury: 01635 555666
Abingdon: 01235 251252
Swindon: 01793 610008
Hungerford: 01488 682546