Barrister becomes latest ‘victim’ of fake ChatGPT cases


By Legal Cheek


Reported to regulator


A barrister has been referred to the Bar Standards Board (BSB) after citing a non-existent case generated by ChatGPT in immigration tribunal proceedings, later arguing he was himself “a victim” of the AI technology.

Muhammad Mujeebur Rahman, appearing for an appellant in an immigration matter, included in the grounds of appeal a reference to “Y (China) [2010] EWCA Civ 116”, claiming it supported arguments on delay. The tribunal found that the case did not exist.

When challenged at a hearing in June 2025, Rahman initially claimed he had meant to cite other authorities, including YH (Iraq), R (WJ) v SSHD and Bensaid v UK. After being given a break, he told the judges he had “undertaken ChatGPT research during the lunch break” and insisted Y (China) was genuine, describing it as a decision of Pill and Sullivan LJJ and Sir Paul Kennedy.

The panel gave him a deadline to either produce a copy of the judgment or explain what had happened if he could not. As the panel moved on to the next case, Rahman handed the tribunal clerk a nine-page internet printout containing “misleading statements”, including references to the fictitious Y (China) case under the citation for YH (Iraq). It made no mention of the key case on delay.

In a follow-up letter submitted before the deadline, Rahman explained that he had meant to cite YH (Iraq) and apologised for failing to provide the full and correct case name.

He attributed the mistake to “acute illness” he had suffered before drafting the grounds, as well as to a trip to Bangladesh during which he was hospitalised with diabetes, cholesterol issues and high blood pressure. He also argued that he should not be penalised for this error, noting that he has five dependants—his wife and four children.

At a further hearing, Rahman finally accepted that he had used ChatGPT to draft the grounds of appeal and to create the document he handed up via the clerk, but argued that he was “misled by the search engine and is thus also a victim”.

The tribunal said he had failed to carry out any checks on reputable databases such as Westlaw, LexisNexis, BAILII or EIN, and that his letter was “a less than honest attempt to pretend” he had simply made a typographical error and had not relied on AI.

Although it concluded there had been no deliberate fraud, the panel said Rahman had not acted with honesty and integrity, and that the use of fake authority likely contributed to permission being granted on one of the grounds.

Referring the matter to the BSB, the tribunal noted that lawyers have a professional duty to verify authorities and warned that “taking unprofessional short-cuts which will very likely mislead the Tribunal is never excusable”.

Rahman, who has since completed further training on immigration law and the use of AI, apologised for his conduct and argued he should not be referred to the BSB. He said he now has a proper understanding, has been honest, and will act with integrity in future.

15 Comments

well well well

well well well

Definitely not generated by ChatGPT

AI is like a toaster—it can only make what you put in it. If someone’s using it to toast their integrity, that’s a people problem, not a robot uprising.

Richard

Is there any difference between taking a case cited in a textbook like Chitty or even the White Book and citing it as authority for the proposition contained within that book, without actually looking at the case itself?
Few barristers have not done that.

Lilly

Yes! Those are acceptable sources that the judge can look up and review the arguments of. Wasting everyone’s time with fake cases as authorities is disrespectful and shows that you’re a sh!t lawyer.

Charles Utley

I had an awful time, years ago, trying to convince the Magistrates’ clerk that the full report of a decision of the House of Lords was preferable to a footnote in Stone’s Justices Manual. The footnote said the case decided one thing; the report in AC confirmed it decided the opposite. In the end, fortunately, the lay chairman of the bench was a lot brighter than the clerk and the law lords won over the textbook editor.

Disbelief Incarnate

It’s baffling to me how so many keep falling into this trap, particularly in something as niche as law. Ask ChatGPT to explain a first-year undergrad legal concept like objective/subjective recklessness and, if you’re at all well-versed in the topic, you’ll see how it hedges and doesn’t word its answer as precisely as you’d expect even an undergrad student to, even if it is broadly correct.

Imagine trusting it to generate accurate answers on more specific topics. Some people must really think AI is magic. There’s no other word for it: negligent.

Alex

It’s just a new incarnation of an old problem. I’ve been in cases where my opponent only looked at a headnote, didn’t have the latest version of the White Book, hadn’t appreciated that a case had been overturned, etc. LLMs are a very useful and powerful addition, but bad lawyers will always be bad.

Sean

This.

AI in my sector will allow bad lawyers to be faster and cheaper.

A race to the bottom

Al

“Read the next line down….”

Archibald O'Pomposity

Rahman “said he now has a proper understanding, has been honest, and will act with integrity in future”.

How reassuring. Should I need barristerial services, I will be sure to look him up.

John N

If you are professionally using any AI without understanding “AI hallucinations”, you’re not a victim, you’re lazy/stupid/tight [Delete as necessary]

Anonymous

I’m stupid.

The realest

Mr Rahman needs to listen to Tupac. He was the realest.

Barrister making questionable life choices

There is no way a barrister should be using AI at all given how much they get paid p/h, let alone be getting case citations from it. Unbelievable. Hope he gets disbarred asap.

Turkey Twizzlah

I can’t even get the AI my shop uses (which probably costs them a chunk of my salary) to pull the correct provisions from a section in an Act of Parliament.

Meanwhile the shopkeepers want us to feed our work product into the AI. It really is like turkeys voting for Thanksgiving/Christmas.

