AI in the Courtroom: A Digital Voice From Beyond
In a groundbreaking moment that raises profound questions about technology’s role in the justice system, an Arizona courtroom witnessed what may be a legal first: a murder victim delivering his own impact statement through artificial intelligence.
Chris Pelkey, killed in a 2021 road rage incident, appeared to address his killer last week via an AI-generated likeness that spoke words crafted by his sister. “To Gabriel Horcasitas, the man who shot me, it’s a shame we encountered each other that day in those circumstances,” the digital recreation of Pelkey stated. “In another life, we probably could have been friends. I believe in forgiveness… and I still do.”
The unusual presentation occurred during Horcasitas’s sentencing hearing, marking what legal experts believe could be the first AI-generated victim impact statement delivered in a U.S. courtroom.
Stacey Wales, Pelkey’s sister, explained to NewsNation that the idea emerged from her struggle to articulate her own impact statement. “I couldn’t get out what I wanted to say. It didn’t sound genuine… it didn’t sound like anything Chris would say,” Wales recounted. “I couldn’t help but think he doesn’t get a voice in this. Nobody knows what he thinks about this. He’s the ultimate victim.”
Wales turned to her husband, who has experience with AI technology, to create the digital representation. “I know exactly what he would say,” Wales told NewsNation. “I’ve been trying to write my own impact statement for two years, but I wrote his in five minutes.”
The video was permitted under Arizona’s Victims’ Bill of Rights, which Wales noted allows victims to present statements “in any medium that we would like.”
However, the use of AI in this context raises significant legal and ethical concerns. Former Judge Jeffrey Swartz, also speaking to NewsNation, expressed reservations about the precedent. “This could happen if this is admitted even as a victim impact statement, are we now going to start having people project what they believe someone else would say?” Swartz questioned.
Swartz highlighted potential constitutional issues: “That could affect the Sixth Amendment right to confrontation.” He further noted that while Wales presented her brother in a light she found favorable, “it doesn’t mean that everybody else saw him that way… even if she believed that that is what he would have said.”
The judge added that he “more likely than not would not have permitted this video” in his own courtroom, noting that Pelkey’s appearance “had been altered from what he looked like at the time that the event took place.”
This case sits at the intersection of technological innovation and judicial tradition, raising questions about authenticity, representation, and the boundaries of victim advocacy. As AI capabilities advance, courts will likely face increasing pressure to establish clear guidelines for when and how such technology can be employed in legal proceedings.
While the emotional impact of hearing from victims is a crucial part of the sentencing process, the mediation of those statements through AI creates new territory for a justice system built on direct testimony and cross-examination.
For Wales and her family, the technology provided a means to honor her brother’s memory and ensure his perspective was represented. For the legal system, it represents a challenge to traditional notions of testimony that will require thoughtful consideration as technology continues to evolve.