Judge fury after ‘fake’ cases cited by rookie barrister in High Court


By Angus Simpson

“I consider that it would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading,” says Mr Justice Ritchie


A High Court judge has issued a scathing ruling after multiple fictitious legal authorities were included in court submissions.

The case concerned a homeless claimant seeking accommodation from Haringey council. Things took a sharp turn when the defendant discovered five “made-up” cases in the claimant’s submissions.

Although the judge could not rule on whether artificial intelligence (AI) had been used by the lawyers for the claimant, who had not been sworn or cross-examined, he left little doubt about the seriousness of the lapse. “These were not cosmetic errors, they were substantive fakes and no proper explanation has been given for putting them into a pleading,” said Mr Justice Ritchie, adding: “I have a substantial difficulty with members of the Bar who put fake cases in statements of facts and grounds.”

He added:

“On the balance of probabilities, I consider that it would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading. However, I am not in a position to determine whether she did use AI. I find as a fact that Ms Forey intentionally put these cases into her statement of facts and grounds, not caring whether they existed or not, because she had got them from a source which I do not know but certainly was not photocopying cases, putting them in a box and tabulating them, and certainly not from any law report. I do not accept that it is possible to photocopy a non-existent case and tabulate it.”

Mr Justice Ritchie found that the junior barrister in question, Sarah Forey of 3 Bolt Court Chambers, instructed by Haringey Law Centre, had acted improperly, unreasonably and negligently. He ordered Forey and the solicitors to each personally pay £2,000 towards Haringey Council’s legal costs.

The judge’s warning will certainly echo across the profession:

“It would have been negligent for this barrister, if she used AI and did not check it, to put that text into her pleading.”

This case has sparked discussion on social media. Writing on LinkedIn, Adam Wagner KC of Doughty Street Chambers commented on the judgment, noting that while the court didn’t confirm AI was responsible for the fake cases, “it seems a very reasonable possibility.” Wagner added:

“A.I. can be a time saver, especially if you don’t really know where to start (as sometimes happens in law!), but the key lesson is that A.I. should only ever be the *starting point* of a research or drafting task.”

The judgment emphasised that responsibility for accuracy lies with lawyers. The news comes after judges received refreshed guidance last month on spotting AI-generated submissions. Meanwhile, the SRA approved the first ‘AI-driven’ law firm, which says its AI cannot propose case law, in order to avoid hallucinations.


21 Comments

Hmmm

This is troubling. Another example of AI adding to the burden of already strained resources: also see graduate recruitment where AI applications are like weeds – rife and easy (but time consuming) to spot.

Jane

AI therefore being a hindrance rather than a help

Just Anonymous

This is such a sad and frustrating case.

The barrister has not admitted using AI. However, I see no other credible explanation (other than patent and bizarre dishonesty).

However, as often happens, the initial mistake – although bad in itself – has probably been superseded by the ‘cover up’.

Put another way:

Unwittingly submitting fake cases to the court by naively trusting AI is forgivable.

Refusing to admit that mistake – and advancing a patently false explanation to a High Court Judge which said Judge then correctly rejects (see [53] of the judgment) – is not.

TIMOTHY COMPTON

An argument for the ongoing use of books?

Legal ninja

It’s ridiculous that she is still practising. Shows how low the BSB standards are, allowing lazy, unscrupulous individuals such as her to practise. She should have been held in contempt of court. I know barristers who have been jailed for such antics.

Legaleaglebeagle

Which barristers have been jailed for such antics?

Hmmm indeed

I’m not sure this is an example of AI adding to the burden of already strained resources. It looks like an individual has done that. Also, in relation to recruitment, again, it suggests the individuals involved are the concerns, not the technology available to them.

Puzzled

Easy but time-consuming?

Lion Stew

Yes, if you read the sentence in context (albeit that the punctuation could be tightened up), what the poster is saying is that each individual weed is very easy to spot and deal with. However, the very fact that they are so numerous makes the task incredibly time-consuming.

Lewis Graham

I’m not a lawyer but if I quoted a non-existent document in a client report I would get into serious trouble.
It’s a great place to start but unless you have seen the document for yourself don’t rely on it. AI specialists use the term “hallucinations” to describe when a large AI model has inferred something that doesn’t exist.

Kevin cowing

AI shouldn’t be used to create any content for court documents; it should always be based on actual case law. Solicitors charge enough to their clients, and AI encourages laziness, as this case has shown.

Mr. Justice Eddie

We’ll have to resolve this with a frying pan! 🍳

RIP Rik Mayall

Only the over 30s will get this reference, sadly.

Urgh

This is terrible behaviour that I feel undermines confidence in the profession. What kind of care and concern do you have for a paying client if you can’t be bothered to check that actual law is cited in your submissions?

Might as well work as an author of fiction.

Dave Watson

I’m a lawyer and openly admit I use A.I. for certain tasks. As Adam Wagner KC points out, it can be a great place to start your research. It can also be helpful in formulating your arguments into a structure for you to write pleadings, submissions or witness statements. I would never rely on A.I. for legal authorities without checking, because it often misunderstands what rule or principle a case is authority for. I am a little sceptical that A.I. was used here, as I have never found it generates fictional cases.

A.I., when used properly, can speed up some legal tasks, which in turn keeps my clients’ costs lower. Legal fees are high enough, so if I can use technology to keep those costs lower whilst maintaining the high quality of work my clients expect, I will use it.

SB

It absolutely creates fictional cases, I’ve seen it done several times.

Barrister

What AI programme / software are you using? I’m not aware of any that could help with “structure” for pleadings but then I’m not in a law firm.

To be honest by the time you’ve worked out how to give the AI the info it needs to give you something bog standard, you might as well do it yourself. (I appreciate it might be different for simple cases or in-house legal teams producing e.g. small money claims pleadings at volume.)

Day

A solicitor lost their practising certificate, and their career is over, for telling their employer they were taking a relative to hospital to get a day off, when instead they had a personal court hearing they did not wish to disclose.

And this barrister gets a rap on the knuckles for an outrageously deceptive stunt in court? With no innocent explanation even proffered?

I mean, double standards much. Time for the SRA, which regulates solicitors, to go.

7 years' PQE

The judge has referred the barrister and the instructing solicitors to the Bar and the SRA, respectively; so the jury is out for now as to what regulatory punishment each will dish out.

Jane

Yes, it will be interesting to see the follow-up. It seems to me that most of the focus is on whether AI was used, but I don’t think that’s particularly important. The key issue is that they put a load of rubbish into the pleadings, and when challenged on this, tried to cover it up instead of admitting their mistake. Surely that’s serious professional misconduct.

Duncan M

I train AI for a living. It hallucinates like a tripping child. AI is nowhere near the level of usefulness people assume. We still need to think for ourselves and do our own research. Sigh!

