
‘Not a substitute’: Qld courts introduce AI guidelines

Queensland’s courts and tribunals have introduced a set of generative AI guidelines, including a caution that chatbots should not be used as a substitute for trusted legal sources.

September 22, 2025 By Naomi Neilson

Queensland Courts issued guidelines for judicial officers and non-lawyers, including self-represented litigants, to better understand the appropriate uses of generative artificial intelligence (GenAI).

The new guidelines apply to the Supreme Court, District Court, Planning and Environment Court, Magistrates Court, Land Court, Children’s Court, Industrial Court, Queensland Industrial Relations Commission, and Queensland Civil and Administrative Tribunal.

They first make clear that while GenAI tools may be useful for finding material, they are “not actually intelligent in the human sense”. Rather than providing accurate answers, a chatbot produces what it has predicted to be the “most likely combination of words”.

“The result is that, as with any other information available on the internet in general, AI tools may not be useful to find material you would recognise as correct but have not got to hand, but are a poor way of conducting research to find new information which you cannot otherwise verify,” the Queensland Courts guidelines state.

The guidelines prohibit uploading information that is not already in the public domain into an open-source chatbot, and warn users to take care not to enter private, confidential or suppressed material.

Any information that is generated should be checked before it is used or relied on in the courts or tribunals. Not only would this ensure the information is accurate, but it would also confirm that any required source acknowledgements or citations are included.

Lawyers may also be required to confirm they have independently verified the accuracy of any research or case citations.

“The use of such chatbots is not a substitute for conducting research using trusted sources such as academic texts or legal databases,” the guidelines said.

The guidelines went on to say all legal representatives are responsible for material put before courts and tribunals and have a “professional obligation to ensure it is accurate and appropriate”.

“Provided these guidelines are appropriately followed, there is no reason why you cannot use generative AI as a potentially useful secondary tool for research or preparatory work,” they added.

“However, you must ensure that any use of AI tools by you or your staff is consistent with the core judicial values of open justice, accountability, impartiality, and equality before the law, procedural fairness, access to justice and efficiency.”

Queensland Courts also noted courts have always had to handle forgeries, but judges should be aware of how deepfake technology generates “fake material, including text, images and videos”.

Judges should also be alert to the use of AI by experts and consider whether the expert should be required to identify its use.

Naomi Neilson

Naomi Neilson is a senior journalist with a focus on court reporting for Lawyers Weekly. 
