
AI-generated video of murder victim used in court to address killer

In a move that may represent the future of criminal proceedings, an AI-generated video has been used in a US court to address the victim’s killer.

May 14, 2025 By Daniel Croft

Editor’s note: This story first appeared on Lawyers Weekly’s sister brand, Cyber Daily.

Christopher Pelkey, who was 37 at the time of his death, addressed his killer, Gabriel Paul Horcasitas, in the Maricopa County Superior Court in Arizona on 1 May.

The scripted video, written by his sister, Stacey Wales, depicts Pelkey with a beard and a green shirt and shows him describing the killing as unfortunate.

“It is a shame we encountered each other that day in those circumstances,” said the AI-generated version of Pelkey.

“In another life, we probably could have been friends.”

In the clip, the avatar also discloses that it is AI-generated, something made fairly obvious by its mouth movements not syncing with the audio.

While Wales said she wasn’t ready to forgive Horcasitas, she believed that Pelkey may have been more compassionate.

“You’re told that you cannot react, you cannot emote, you cannot cry,” she said. “We looked forward to [the sentencing] because we finally were gonna be able to react.”

“The goal was to humanise Chris, to reach the judge, and let him know his impact on this world and that he existed.

“[AI] is just another avenue that you can use to reach somebody.”

The use of AI in this way appears to be among the first in the US court system. Courts usually restrict AI in legal proceedings, and a number of lawyers have been sanctioned after citing fake AI-generated cases.

In this case, however, because the video was not presented as evidence, it was allowed during sentencing. Horcasitas had been convicted of manslaughter and endangerment, and he is now facing 10.5 years in prison.

University of Waterloo professor Maura Grossman said that, in her view, the use of AI in the courtroom did not raise any major ethical or legal dilemmas.

“Because this is in front of a judge, not a jury, and because the video wasn’t submitted as evidence per se, its impact is more limited,” she told NPR.

Gary Marchant, a professor of law, ethics and emerging technologies at Arizona State University’s Sandra Day O’Connor College of Law, similarly said this use of AI was largely harmless to the legal process.

“Victim statements like this that truly try to represent the dead victim’s voice are probably the least objectionable use of AI to create false videos or statements,” he told NPR.
