A Victorian lawyer has made Australian legal history – for all the wrong reasons – becoming the first practitioner in the country to face professional sanctions for using artificial intelligence in a court case, after submitting a list of entirely non-existent legal citations.
The solicitor, who cannot be named, has been stripped of his right to practise as a principal lawyer following an investigation by the Victorian Legal Services Board and Commissioner (VLSB+C).
The incident occurred in July 2024 during a family law dispute in which the lawyer was representing a husband.
According to The Guardian, when Federal Circuit and Family Court Justice Amanda Humphreys and her associates attempted to verify the citations, they were unable to confirm the references.
When the matter returned to court, the lawyer admitted that the list had been generated using AI-assisted legal software and acknowledged that he had not verified the information before submitting it.
The solicitor offered an “unconditional apology” to the court, pledging to “take the lessons learned to heart” and requesting that the matter not be referred for further investigation.
As reported by The Guardian, Justice Humphreys accepted the apology and noted that the incident caused significant stress but was unlikely to be repeated. However, she stressed that an investigation was necessary in the public interest due to the growing use of AI in legal work.
Following the referral, the VLSB+C conducted an investigation and confirmed on 19 August 2025 that the solicitor’s practising certificate had been varied.
In a statement, the VLSB+C outlined the severe consequences for the lawyer: he is no longer entitled to practise as a principal, cannot handle trust money, may not operate his own law practice, and may now practise only as an employee solicitor under strict supervision.
The legal body has described the case as a warning to practitioners about the risks of relying on AI tools without proper verification, emphasising the need to use such technology responsibly.
“The board’s regulatory action in this matter demonstrates our commitment to ensuring legal practitioners who choose to use AI in their legal practice do so in a responsible way that is consistent with their obligations,” it said.
This case is not an isolated incident. Since the July 2024 matter, more than 20 other Australian court cases have involved lawyers or self-represented litigants submitting AI-generated material containing false references.
Most recently, a lawyer was referred to the Legal Practice Board of Western Australia for submitting court documents that contained four citations that could not be identified or did not correspond with the legal principles relevant to his client’s case.