A dead Arizona man gave his own victim statement in court thanks to AI and a script written by his sister — possibly a first in the U.S.
Artificial intelligence isn't just reshaping creative industries; it's making its way into the courtroom as well.
An Arizona family used an AI-generated version of their deceased family member to give his own victim impact statement in court, local outlet ABC15 reports. Chris Pelkey, 37, was an Army veteran and avid fisherman killed in an alleged road rage incident on Nov. 13, 2021. According to the Chandler Police Department, Pelkey died after being shot by Gabriel Horcasitas, 54, who police say exited his car after a driving confrontation and fired his gun. (Attorneys for the state and Horcasitas did not respond to Rolling Stone's request for comment.) Horcasitas was convicted of endangerment and manslaughter in a 2023 trial, and was recently sentenced to 10.5 years in prison after Pelkey's victim impact statement was played in court, a one-year increase over the state's recommended 9.5 years, according to ABC15.
Pelkey's sister, Stacey Wales, told Fox 10 that she was inspired to use AI to recreate Pelkey's likeness after trying to collect victim impact statements for the trial. AI was used to recreate Pelkey's image and voice, but she wrote the script herself. "We received 49 letters that the judge was able to read before walking into sentencing that day. But there was one missing piece. There was one voice that was not in those letters," she said. "But it was important not to make Chris say what I was feeling and to detach and let him speak because he said things that would never come out of my mouth, but I know would come out of his."
The impact statement used a picture of Pelkey and his voice profile to recreate a version of him. The resulting video was interspersed with real footage and photos of Pelkey, but also had him directly address people in the courtroom. "In another life, we probably could've been friends. I believe in forgiveness and in God who forgives. I always have and I still do," the AI version of Pelkey said.
While the AI-assisted likeness was used only in the victim impact statement, its inclusion in the courtroom raises questions about how artificial intelligence could help, or hurt, in matters of law. This appears to be the first time artificial intelligence has been used to present a victim impact statement in a U.S. courtroom.
Arizona Supreme Court Chief Justice Ann Timmer wasn't involved in the case, but she told ABC15 that "a measured approach is best" and that, in anticipation of future uses, the court has formed a committee to make recommendations on how AI should be used in the courtroom. "At bottom, those who use AI — including courts — are responsible for its accuracy," she said.