Barristers given fresh AI guidance amid rise in fake cases cited in court


By Legal Cheek

Bar Council urges caution when using tech tools


Barristers have been issued updated guidance on using ChatGPT and other legal AI tools amid a rise in lawyers inadvertently citing made-up cases in court.

The Bar Council has refreshed its advice following recent High Court rulings involving fabricated judgments and the rapid growth of AI across the profession. The message from the bar bigwigs is that AI can be useful, but only when barristers understand what it is doing and keep proper oversight.

The updated guidance stresses that barristers should understand how systems like Google’s Gemini, Perplexity, Harvey and Microsoft Copilot work before relying on them in case preparation. It highlights key risks including hallucinations, information disorder, bias in data training, mistakes and cybersecurity vulnerabilities. It also reminds barristers that AI does not have a conscience or social and emotional intelligence.

Most importantly, the responsibility for accuracy, confidentiality and compliance with professional rules remains entirely with the barrister.


Barbara Mills KC, chair of the Bar Council, said:

“Recent case law, including the High Court judgment, emphasises the dangers of the misuse by lawyers of artificial intelligence, particularly large language models, and its serious implications for public confidence in the administration of justice.”

She continued: “We recognise that the growth of AI tools in the legal sector is inevitable and occurring at a fast pace. As the guidance explains, the best-placed barristers will be those who make the efforts to understand these systems so that they can be used with control and integrity. Any use of AI must be done carefully to safeguard client confidentiality and maintain trust and confidence, privacy, and compliance with applicable laws.”

The guidance also references recent academic research into the reliability of AI legal research tools and notes that authoritative sources remain available at the Inns of Court libraries. It goes on to stress confidentiality, data protection and intellectual property concerns and clarifies that it applies to LLM software specifically aimed at lawyers.

The Legal Cheek Virtual Pupillage Fair returns on 11 December 2025, bringing together over 40 leading chambers from across the bar. It’s an invaluable opportunity for aspiring pupils to speak directly with barristers, get a real feel for the type of work different chambers do, and pick up tips and tricks for success in this year’s pupillage applications. APPLY NOW.

6 Comments

Well

Meanwhile, Vos continues to throw caution to the wind and maintain that we should all be using AI for everything…

Sparkling

What barrister doesn't check their case law? This is extremely worrying.

Juries are dead, the Bar is dead

“Inadvertently”? How do you cite a case inadvertently? Who doesn’t check the case exists before citing a case to a judge? I mean reading it would be even better but let’s not ask too much these days…

Anonymous

In this day and age using AI is crucial in order to keep up with the professional demand. Nevertheless it is paramount that all case citations are checked manually and verified. I find it easier to just make an assertion along the lines of "as per English law" and subsequently support the argument with case citations as opposed to risking citing a case. It is important to note that most arguments can be made logically or with a logical framework and afterwards can be supported by caselaw.

Junior paralegal

You can always make the argument itself and afterwards support it with caselaw. It doesn’t make sense to risk misleading the court when your career might end up in the gutter.

Ai Hal

“I fear the day that technology will surpass our human interaction. The world will have a generation of idiots,” said Albert Einstein.

