Earlier this month, a Victorian lawyer made history by becoming the first Australian practitioner to face professional sanctions for using artificial intelligence in court. Such disciplinary action raises the question: should law students, who will be the lawyers of tomorrow, also face sanctions for inappropriate use of AI in the course of their education?
That lawyer, as recently reported by Lawyers Weekly, had submitted a series of non-existent legal citations. In varying the lawyer’s practising certificate, the Victorian Legal Services Board and Commissioner said its regulatory action in the matter “demonstrates our commitment to ensuring legal practitioners who choose to use AI in their legal practice do so in a responsible way that is consistent with their obligations”.
Incidents like these are, of course, not isolated instances of lawyers using AI for court-related tasks or filings, and courts around the country have responded by issuing guidelines for AI use. NSW’s top judge, Chief Justice Andrew Bell, has been vehement about restricting the use of AI in NSW courtrooms, arguing that such limits are needed to prevent inaccuracies and “laziness” from entering the legal profession. Just last week, Queensland Courts issued guidelines to help judicial officers and non-lawyers, including self-represented litigants, better understand the appropriate uses of generative AI (GenAI), stressing that such tools are “not a substitute” for conducting research.
Against this backdrop, Kate Offer – a senior lecturer at Curtin Law School and adjunct associate professor at UWA – noted that legal education’s fundamental purpose is “to develop future lawyers’ ability to understand legal principles, synthesise complex information, and cultivate sound legal judgement”. To this end, and as GenAI continues to advance, law schools will face new challenges in ensuring that students are developing the requisite skills for legal practice and are not unduly relying on GenAI in the process.
More broadly, MaryAnn de Mestre – principal of M de Mestre Lawyers, convenor of succession law at Macquarie University, and winner of the Academic/Researcher of the Year category at last year’s Women in Law Awards – added that as the legal profession adapts to the realities of artificial intelligence, it is critical that law students are held to the same standards expected of practising lawyers.
“The next generation of practitioners must be equipped not only with technical knowledge of the law but also with the ability to think critically, exercise judgement, and maintain integrity in their work,” she said.
“If students are allowed to rely on AI without scrutiny, they risk developing habits of uncritical acceptance by regurgitating text rather than analysing, testing, and refining arguments. This undermines the very purpose of legal education, which is to train independent thinkers capable of advising clients and assisting courts with accuracy and professionalism.”
To this end: should students be sanctioned for using AI, as they will be once they become lawyers?
People training in tertiary education
In answering such a question, it is necessary to reflect on the broader purpose of a law degree: developing a student into the practitioner of tomorrow. Part and parcel of such development is – and will always be – the human element of law, including but not limited to client service delivery and an appreciation of how best to interact in the professional services market, on top of the technical and academic rigours of the degree itself.
Put simply, Dr Sebastian Sequoiah-Grayson – a senior lecturer in epistemics at the School of Computer Science and Engineering at UNSW – said, “at universities, we make people”.
“Much of the point of universities is to get students to engage in practices that will bring about the relevant changes in their personhood that a university education is designed to bring about. Lawyers, surgeons, engineers, teachers, journalists, artists and so on, are not merely people who possess sets of skills in particular. They are particular types of people,” he said.
“Some properties of persons might be pursued best indirectly. Happiness and authenticity spring to mind. We call such properties, ‘elusive properties’. It is the nature of elusive properties that they slip away from us when we do attempt to pursue them directly. Much better for us to engage in other practices that are likely to bring elusive properties to the foreground, nonetheless.”
“We used to call such practices rites of passage,” Dr Sequoiah-Grayson continued. “These days, we call such practices internships, legal and clinical placements, apprenticeships, and so on.”
“The existence of such passage rites betrays the fact that we know that elusive properties cannot be gotten by instruction. Rather, they are gotten by practice.”
The role of law schools
It is worth exploring, then, how law schools can and should be developing future legal practitioners.
Zooming out, Sequoiah-Grayson reiterated that, at its core, a “proper” university degree is a rite of passage.
“The practices of assessment therein are designed to engage students in those practices that will elicit in themselves those elusive properties of persons that such a degree is purposed to elicit,” he said.
“Universities do not ask their students to work on group projects, essays, lab experiments, or mathematical problem sets because the world needs more of these things. Rather, they ask their students to work on such things because the practices that such things demand elicit those elusive properties of persons that are the very mark of lawyers, surgeons, teachers, and the like.”
Holding students to higher standards, de Mestre said, does not necessarily mean discouraging AI use.
On the contrary, she said, “students should be encouraged to engage with these tools but ethically and thoughtfully”.
“They must learn to verify sources, question reasoning, and apply human judgement to ensure accuracy and reliability. Importantly, students must also develop their own independent empathy when dealing with client issues, recognising the human impact of legal advice beyond what an algorithm can suggest.”
Most law schools across the country, Offer noted, have responded to the educational and vocational challenges posed by rapidly advancing technologies by updating their teaching and learning policies to address student AI use as best they can, in ways each faculty deems fit for purpose in the current climate.
“While approaches vary across institutions and sometimes even within law schools, these policies typically either prohibit generative AI entirely or require students to disclose when and how they’ve used AI to assist with their assignments,” she said.
“Violations can carry serious consequences, typically an academic misconduct investigation which can, in severe cases, result in expulsion from the university.”
To this end, sanctions are indeed on the table – and they should serve as a clear line in the sand: the standards governing AI use that apply to practising lawyers also apply to the next generation of practitioners, if not equally, then at least with comparably severe penalties.
Further thoughts
The actions that have been taken by law schools nationwide thus far are, of course, nowhere near exhaustive and will continue to evolve.
As GenAI continues to improve and become more sophisticated, Offer said, law schools will need to fundamentally rethink their assessment methods.
However, she said, “by maintaining rigorous academic standards and enforcing penalties for breach of policies, law schools can ensure that graduates enter legal practice with both the analytical foundation essential to the profession and the ability to leverage generative AI responsibly”.
By making clear and instilling certain standards from the outset, de Mestre said, law schools can ensure consistency between academic training and professional expectations.
More importantly, she added, “it prepares graduates to navigate a future where AI will be commonplace, but never a substitute for the lawyer’s essential role in critical thinking and ethical decision making”.
For Sequoiah-Grayson, the question of students’ AI use and how universities should manage it ultimately comes back to their holistic formation as future professionals.
“Public-facing GenAI is a plagiarism machine. I am yet to meet a single person who does not know this to be true,” he said.
“That anyone might think it right that universities should permit their students to use this plagiarism machine in such a way as to undermine the formation of their future personhoods is an abomination of both the intellect and of morality.”
Jerome Doraisamy is the managing editor of Lawyers Weekly and HR Leader. He is also the author of The Wellness Doctrines book series, an admitted solicitor in New South Wales, and a board director of the Minds Count Foundation.
You can email Jerome at: