
Chief Justice not ready to scrap restrictive GenAI rule

In the seven months since the NSW Supreme Court’s generative artificial intelligence (GenAI) practice note came into effect, Chief Justice Andrew Bell said some of his concerns have increased.

September 12, 2025 By Naomi Neilson


When ChatGPT 4.0 became available, a colleague of Chief Justice Andrew Bell asked the chatbot how the Chief Justice was perceived and was told “conservative”. Asked for a source, the chatbot pointed to a media outlet’s “legal analysis piece” that did not exist.

In a speech delivered to the Australian Bar Association, Chief Justice Bell said the chatbot gave the false information “in a confident and unqualified manner”, going so far as to use inverted commas to offer a quote “directly from that non-existent article”.


When challenged to identify the actual article, the chatbot admitted it overstepped “a bit there”, apologised for the “slip”, and claimed the characterisation “does pop up in legal commentary here and there”.

The chatbot’s offer to dig around for “another” reference was also misleading, Chief Justice Bell said, as it implied a genuine reference had already been supplied.

“This, on one view, innocuous enough example with the most recent technology does not give comfort to a judge who is concerned with the integrity of information being provided to courts tasked with determining the facts and faithfully applying the law justly to determine citizens’ disputes,” Chief Justice Bell added.

The NSW Supreme Court’s practice note came into effect in February, along with a ban on the use of open-source large language models. Practitioners are permitted to upload material onto a closed-source GenAI program only if certain conditions are met.

Chief Justice Bell said the court took a “more cautious approach” to GenAI than other jurisdictions, “if not all around the world”.

“I am unapologetic about this, although I continue to monitor the issues that gave me concern and led me to the approach reflected in Practice Note SC Gen 23-18, and accept that refinement of technology may address at least some of the issues,” Chief Justice Bell said.

At a briefing delivered last December, Chief Justice Bell said he was most concerned about hallucinations – the generation of apparently plausible but inaccurate references – being used in court.

In the two years since Nash v Director of Public Prosecutions, the first Australian case where GenAI use was exposed, Chief Justice Bell said there have been at least 23 other cases in courts and tribunals.

“Alarmingly, a number of recent cases of use of GenAI resulting in false references being provided to courts have been by lawyers, although it must be acknowledged that the difficulty is most prevalent with unrepresented litigants,” the Chief Justice added.

Recently, a Victorian lawyer became the first in Australia to face professional sanctions for using GenAI to submit hallucinations.

Chief Justice Bell said lawyers who are caught relying on hallucinations in their submissions “are at best incompetent and at worst, dishonest”.

“Either way, it presents fundamental difficulties for judges who rely on the competence and integrity of lawyers to assist them in the discharge of their judicial responsibilities,” Chief Justice Bell said.

The Chief Justice said lawyers and litigants should be aware of the disciplinary options open to courts and tribunals, including referral to legal professional bodies, contempt of court findings, adverse costs orders, and ordering continuing professional development courses.

If hallucinations are relied on, the best practice is to “promptly bring the error to the court’s attention, admit to the use of AI, issue an unconditional apology, take full responsibility, propose positive steps to address the mistake, and, where appropriate, seek leave to file an affidavit to openly and honestly explain the circumstances”.

“Lawyers must remain vigilant in the use of artificial intelligence, as persistent AI hallucinations remind us that accurate, reliable and critical legal analysis currently remains a solely human capability.

“But the risks extend far beyond so-called AI hallucinations and fundamental quality control. AI has the real potential to encourage or feed laziness in research and analysis and loss of essential skills and critical thinking,” Chief Justice Bell said.

In the same speech, Chief Justice Bell warned of the impact AI-manipulated video, or “deepfakes”, can have on the justice system.

Naomi Neilson

Naomi Neilson is a senior journalist with a focus on court reporting for Lawyers Weekly. 
