While artificial intelligence is solving myriad headaches for legal practitioners and improving access to justice, for criminal lawyers, deploying AI in certain contexts could create more problems than it solves.
Three months ago, the UK Ministry of Justice unveiled its AI Action Plan for Justice, detailing how the British government intends to harness the power of AI to transform the public’s experience, making their interactions with the justice system simpler, faster, and more tailored to their needs.
The plan, Minister for Prisons, Probation and Reducing Reoffending Lord James Timpson said, focuses on three priorities: strengthening Britain’s foundations, embedding AI across justice services, and investing in the people who will deliver such transformation.
One aspect of the plan, as announced by the Ministry of Justice (MOJ) in early September, is a pilot program under which offenders would have to complete remote video check-ins on their own mobile devices, alongside existing measures such as GPS tags and in-person probation appointments, as part of a crackdown on reoffending.
The measures, the MOJ noted in a statement, will also require offenders in select English regions to record short videos of themselves, which artificial intelligence will use to confirm their identity, and to answer questions about their behaviour and recent activities.
“Any attempts to thwart the AI ID matching or concerning answers will result in an instant red alert being sent to the Probation Service for immediate intervention, helping prevent crimes before they happen,” it said.
Lord Timpson said: “This new pilot keeps the watchful eye of our probation officers on these offenders wherever they are, helping catapult our analogue justice system into a new digital age.
“It is bold ideas like this that are helping us tackle the challenges we face. We are protecting the public, supporting our staff, and making our streets safer as part of our Plan for Change,” he said.
Should initiatives to utilise AI for such predictive purposes be adopted Down Under, they may well have implications not just for criminal justice, but for legal practitioners operating in this space.
Potential implications
For Jeannette Fahd, the principal of Just Defence Lawyers, the idea of governments using AI to predict crime raises “profound legal and ethical challenges”.
“Predictive tools risk entrenching bias by relying on data drawn from policing and prosecutorial practices that already reflect systemic inequities, including biases and risks targeting certain groups. If such tools are adopted, lawyers will need to be vigilant in challenging the assumptions and algorithms underpinning them, ensuring transparency and accountability,” she said.
“There is also the danger of eroding fundamental principles, like the presumption of innocence, if individuals are treated as ‘likely offenders’ based on patterns rather than conduct. People can’t be treated as potential offenders just because they share traits with others in a dataset.”
“AI will inevitably reshape criminal law practice – from digital evidence analysis to case preparation – but its role must be carefully scrutinised.”
“Practitioners must adapt quickly, embracing the benefits and efficiencies while guarding against injustices amplified by automation. Technology can never replace human judgement and empathy.”
Conditsis Lawyers director Manny Conditsis noted it is unclear how the UK pilot will operate – that is, whether it will only apply to convicted offenders or alleged offenders on bail, or both.
“Even well-intentioned laws, unless they include appropriate safeguards, can ultimately be misused and end up being oppressive and erode the cornerstone of the criminal justice system, the presumption of innocence,” he said.
“Commentary is suggesting nefarious intentions, warning of the dangers of over-reliance on AI, and questioning its corruptibility.”
The scheme would undoubtedly open significant new work for criminal lawyers, Conditsis noted, although he added that what that would look like will depend on the specificity of the legislation.
“For example, would there be some sort of preliminary hearing where the offender could state their case? Would the offender be detained and brought before a court?” he said.
Executive Law Group partner Jahan Kalantar pointed out that whenever revolutionary technologies are proposed for deployment in a setting with established norms, like the justice system, “there is always a healthy degree of uncertainty”.
“The plan suggests that it will lead to efficiencies in areas like prison bookings and reducing bottlenecks. I am sceptical of this, given how often I have seen technology in the court system be the cause of headaches,” he said.
“I also worry about the embedding of AI systems, which may have in-built bias or blind spots. One of the cornerstones of the legal system is understanding human fallibility and the fact we get things wrong (hence why we have appeals).”
If Australia begins trying to systemise away from that, Kalantar posited, “we risk losing the very thing that makes the legal system work”.
“I remain hopeful that AI could deliver some relief to a court system that really needs capacity, but I also remain concerned that AI could in fact create more problems than it solves.”
The future of the criminal justice system
While the reduction of crime is a great thing, Quantum Law Group founder and managing partner Zile Yu said, it is important to bear in mind that our entire criminal justice and legal system was developed to ensure that the right to a fair trial and other fundamental rights are protected, not eroded.
“Laws exist to guide, teach and discipline humans to not be ruled by emotion. The fundamental presumption of innocence before a finding of guilt exists to preserve and protect fundamental rights. It is easy to consider that if a person commits a crime, then they should be punished, but they should not lose their rights,” he said.
“How will they be able to defend themselves, noting that of all the criminal justice systems in existence, none are likely perfect (and will produce convictions of people who may be innocent)?”
Crime and offending are complex social issues, Yu continued, with a multitude of underlying factors, “and while AI is great at pattern recognition and analysis, this is on the assumption that the parameters and questions being asked for it to process are correct”.
If this is not correct, he warned, “then we would not know the wide-ranging impacts this may potentially have, especially if this is going to affect people’s lives and freedom”.
While the British pilot sounds great in theory, Yu suggested, “focusing on certain data, e.g., reduction of burglaries by a certain percentage, does not necessarily address the fundamental problem of why such criminal activity exists in the first place, and the factors to reduce the root cause”.
Moreover, Yu went on, there are significant issues in using AI to solve what is fundamentally a human problem.
“Applying law, exercising judgement, communicating values and ideas, and delivering justice being an outcome that is considered fair are human processes. We should focus on creating a transparent AI governance system to reduce future negative consequences. We want AI to help us improve our efficiency and accuracy and ultimately deliver better justice (than without AI),” he said.
“It is inevitable that all technologies, including AI, will become part of governments and our policing and justice system. However, the discussion, implementation, and structuring of the governance of any new technology is an important task, and getting it wrong will have disastrous consequences. For example, only recently is the world becoming aware of the harms of social media and constant connection to the internet.”
A society’s laws reflect its level of development, Yu mused.
“Once upon a time, trial by combat was a legitimate way to resolve disputes; decades and centuries from now, the law and the justice system will be very different, and it is up to us to shape them in the right way to benefit society with discussion, education, and communication (as there will also be groups motivated to shape the same by profit or control),” he said.
Jerome Doraisamy is the managing editor of Lawyers Weekly and HR Leader. He is also the author of The Wellness Doctrines book series, an admitted solicitor in New South Wales, and a board director of the Minds Count Foundation.