
Screenshot from video of the AI-edited impact statement of Chris Pelkey. (Courtesy of Stacey Wales)
More than 20 people gathered in a Phoenix courtroom on a sunny May 1 for the sentencing of Gabriel Paul Horcasitas, 54, who had been found guilty of manslaughter for killing U.S. Army veteran Christopher Pelkey, 37, in a 2021 road rage incident.
Nearly 50 people submitted written victim impact statements on behalf of Pelkey, and 10 people gave verbal statements before Maricopa County Superior Court Judge Todd Lang while a slideshow played pictures of Pelkey with family members, on a church mission trip and in the Army.
Pelkey's sister, Stacey Wales, was the last person to give a statement. She told the court that she kept thinking about what her brother would have said if he could have been there, and then she played a nearly four-minute artificial intelligence video.
In the video, a photo of Pelkey had been edited and animated using AI tools to make it appear as if he was speaking the words Stacey Wales had written.
"Just to be clear for everyone seeing this, I am a version of Chris Pelkey recreated through AI that uses my picture and voice profile. I was able to be digitally regenerated to share with you today," the AI avatar said.
While the AI-edited video played, the courtroom was mostly quiet, with a few people softly crying.
Jason Lamm, the criminal defense attorney for Horcasitas, was surprised by the video, but didn't object.
"I was like, 'Is this really happening? This is cringe,'" he said.
Lamm now says he believes the video contributed to his client receiving a longer sentence than he should have.
Jessica Gattuso, the victims' rights attorney for Stacey Wales, had seen the video the day before and was nervous, not knowing how Lamm or Judge Lang would react.
"To be honest, I was freaking out a little bit," she said. "I think everyone was just impressed. So as it played, I kind of started to relax."
The AI-edited victim impact statement featuring an avatar of Christopher Pelkey.
Stacey Wales' husband, Tim Wales, who had spent days working on the video with his business partner Scott Yentzer, was slightly trembling and didn't initially look at the rest of the courtroom.
"It was like the first time I watched it," he said. "It was when I looked up at my kids and Stacey and the audience that I realized, 'Well, this had the emotional impact I thought.'"
Stacey Wales said Judge Lang watched the video intently. "I knew that he was captured in it, not enamored, but he was fixated on watching this," she said.
"To Gabriel Horcasitas, the man who shot me," the AI avatar said, "it is a shame we encountered each other that day in those circumstances. In another life, we probably could have been friends. I believe in forgiveness and in God who forgives. I always have, and I still do."
After the video ended, Stacey Wales returned to her seat and got a hug from her brother John Pelkey, who told her that was what Chris would have said.
Stacey Wales' 14-year-old son then hugged her. She said her son told her, "'Thank you so much for making that. I really needed to hear from Uncle Chris one last time.'"
Judge Lang sentenced Horcasitas to 10½ years for manslaughter. Arizona sentencing guidelines for manslaughter range from seven to 21 years. Horcasitas also pleaded guilty to one count of endangerment and was sentenced to more than two years on that charge. Lamm said the judge gave his client concurrent sentences for the two offenses, meaning Horcasitas will serve less time than requested by the state, which had asked for consecutive sentences.
A few days later, Stacey Wales posted the AI video on YouTube. Local news outlets reported on the video being shown at the sentencing hearing, followed by national outlets such as The New York Times, CNN and NPR.
Legal experts believe the video marks the first time someone has used AI to create a digital replica of a deceased victim to deliver an impact statement at a sentencing hearing. A victim impact statement is testimony given to a court about how a crime has affected a person emotionally, physically and financially.
While this may be a technological innovation, experts say that using generative AI in this manner could pose both legal and ethical problems.
The increased use in legal work of generative AI models, which can create written, audio and visual content based on large amounts of data, has already led to problems.
For example, more than a dozen attorneys have gotten in trouble for submitting documents with fake case citations that were fabricated by generative AI tools.
Legal scholars also have raised concerns about the proliferation of AI-manipulated photos and videos, known as deepfakes, in courts, and some have even proposed rule changes to address the problem.
Jerry Bui, a digital forensics expert, told Law360 that generative AI video creation has advanced to the point where forensics examiners can't tell the difference between real and fake videos, and AI detection tools are being rendered obsolete.
"It's moving so fast and getting so realistic that it's evading some of our previously reliable techniques," he said.
While Stacey Wales made clear that the video of her brother presented in court was created using AI, public defenders still found it problematic.
Jennifer Sellitti, New Jersey's top public defender, said one of her concerns with the video is that the AI version of Christopher Pelkey is saying words that a court can't confirm are true, such as what Pelkey thinks of his shooter, whether they could have been friends, or whether he still believes in forgiveness after being killed.
Sellitti said that even though sentencing hearings have more relaxed rules about evidence and hearsay, a person can't tell the court what another person is thinking without having a conversation with that person beforehand.
"I think that's where we crossed the line," she said. "We have no way of knowing whether he really would. … There's just no way because the event took his life."
Sellitti said Stacey Wales could have told the court that her brother believed in forgiveness while he was alive, or could have had the AI avatar read something Pelkey himself wrote about forgiveness.
Sellitti added that traumatic events can change people's lifelong views. For example, some people who believed in the death penalty all their lives have changed their minds after a loved one was murdered, realizing a death sentence wouldn't fix the harm done to them, she said.
"Anytime a crime is committed, it's a moment of extreme trauma … and it's in those moments that we can really change our perspective on life," she said. "So kind of guessing what somebody's reaction to something would be based on who they have been is, I think, really dangerous in these moments."
Sellitti said if a family member created an AI video of a deceased victim asking for a tough sentence, that might sway a judge to sentence more harshly than they otherwise would have.
"This is a real slippery slope," she said. "We all know as much as we would like to think that we make decisions based on fact and reason, and this includes judges, that most of these decisions are based on an emotional reaction to what people are hearing."
Stacey Wales told Law360 that she tried to be careful not to have the AI avatar say things that she didn't know about her brother's beliefs. This is why she didn't have the avatar explicitly forgive her brother's killer — only say that he believed in forgiveness generally — or ask for a specific sentence.
"I said what I knew about my brother, and I didn't say what I didn't know, which I didn't know how many years he wanted. Chris did believe in accountability and justice, so I do believe he would want years, but I don't know if he wanted seven or 10½ or anywhere in between," she said.
She added, "I really do think he would've said, 'I forgive you, dude.' I think he would've, but I wasn't going to make him do that because I do think that that is a personal choice Chris would've had to make. And I know that he would've had it in him to do it because he believed in forgiveness, but I didn't make him say it."
Another concern raised about the AI video is that it could pave the way for admission of generative AI-created victim impact statements based on a deceased person's text messages and emails.
Melanie Foote, the education, recruitment and strategic planning division director at the Kentucky Department of Public Advocacy, said she is worried judges might allow such AI-created impact statements for deceased victims at sentencing without understanding how they differ from the Pelkey video, whose script was written by his sister.
Foote noted that generative AI content can be inaccurate, and using the technology to recreate a deceased victim can misrepresent that person.
Generative AI "may put inflection or tone or facial expressions that don't belong to that person and attach it to them, and through that, we're not honoring the person who has actually been impacted the most in this case by that act that was criminal," she said.
Foote said she is also concerned about people using AI to recreate deceased victims without their consent.
"If a person passed away from a crime, they probably did not have the opportunity to consent to use their likeness or use their mind or use their words in that process. And so I am concerned if we're truly looking at making sure we're never revictimizing somebody in a criminal legal act, I'm worried that the use of AI without their consent could be doing just that," she said.
While some public defenders think that AI-generated victim impact statements are unfair to criminal defendants, victims' rights advocates argue that they are not any more unfair than written or spoken statements.
Gattuso, Stacey Wales' victims' rights attorney, noted that the purpose of victim impact statements is to show who a victim was and how a family was affected by the loss of their loved one.
Gattuso said that an AI-generated video of Pelkey isn't much different than a slideshow of him and his family.
Under Arizona state law, victims have the right to present their impact statements in any form, including written, audio, video or live, according to Gattuso.
Gattuso and Renée Williams, CEO at the National Center for Victims of Crime, each said that an AI-generated video impact statement presented at sentencing isn't more prejudicial than a written or spoken statement.
"I don't think it's overly prejudicial to the point that it shouldn't be allowed," Gattuso said.
They also noted that defendants have the opportunity at sentencing to counter with their own statements and with statements from character witnesses on their behalf.
Defense attorney Lamm said that six people submitted letters on behalf of Horcasitas and three spoke for him at the sentencing hearing.
Horcasitas also spoke at the sentencing hearing, and Judge Lang found his remorse to be genuine and sincere, according to Lamm.
Williams, who is also the executive director of the National Crime Victim Bar Association, said victim impact statements are more about giving victims a chance to heal than influencing sentencing.
"We encourage victims to really put their emotion and what they feel needs to be said in these statements for them to help heal," she said.
Paul Grimm, a retired federal judge and director of the Bolch Judicial Institute at Duke University School of Law, noted that sentencing hearings tend to be very emotionally charged even without the use of generative AI.
He said that as a federal judge, he saw people cry and get angry, and once had to break up a fight between teenage family members and a defendant.
He added that federal judges can interject during impact statements in certain circumstances, like if a victim starts threatening the defendant.
"Sometimes I would say, 'I know this is difficult. I want to make sure that I'm respectful of your feelings, but I don't think it's helpful for you to be using that tone of voice at this time. You want to take a minute? Can you take a minute to get yourself together and then continue?' You try and deal with it the way you can when it comes up," he said.
Grimm said that, as a judge, he would have wanted to be notified of an AI-generated video impact statement before a sentencing hearing so he could review it and give defense counsel an opportunity to raise objections.
"The best way to deal with that is to make sure that there's advance disclosure so that judges and the other lawyers all have a right to look at it, complain if they think it's unfair, and let the judge make a ruling," he said.
However, in Arizona, victims are not required to disclose their impact statements before a sentencing hearing, according to Gattuso and Lamm.
Gattuso said that defense attorneys can object to victim impact statements during sentencing hearings.
She said that Lamm didn't object to the AI-generated victim impact statement, but he did object to Tim Wales' impact statement on the grounds that Wales had evaded attempts to serve him to testify in court.
Lamm said that victims have wide latitude in what they present in their impact statements, so although he didn't have a basis for objecting to the AI-generated video, he still contends that it led to his client receiving a longer sentence than he otherwise would have.
He said the video is an issue because the judge appears to have used it to negate mitigating factors in the case, including Horcasitas' remorse and lack of a prior criminal record.
Lamm noted that in Arizona, judges consider mitigating and aggravating factors when determining sentences, but in this case, the prosecution didn't submit any aggravating factors.
"You can consider victim comments, but in this case, they could not be used as aggravation because the state did not file the notice of the aggravating circumstances. And that's where the potential error comes in. Because the judge in essence found three mitigators. He could not find aggravators. So impliedly, he used the victim impact statement as an aggravating factor, although under the facts of this case, he was legally unable to do so," he said.
While victims' rights advocates didn't have issues with the AI-generated video in this case, they acknowledged that AI could be used to create inappropriate victim impact statements.
They said it would be improper for an AI-generated victim impact statement to depict gore or to recount what happened in a case where there were no witnesses.
"There are all sorts of ways this can go wrong, but there are all sorts of controls in place to stop instances like that from going that far afield," Williams said.
Within hours of the sentencing hearing in the Horcasitas case, Lamm filed a notice of appeal for his client.
Stacey Wales said she wasn't surprised by the appeal, noting that her victims' rights advocate told her many trial convictions get appealed.
She said that even if the appeal succeeds and the judge has to reduce the original sentence, she doesn't regret showing the AI video.
"I know that in the moment that we needed to make our voices heard and be impactful," she said.
Even if an appellate court finds that an error was committed during sentencing, it might conclude that the error doesn't rise to the level of requiring resentencing, according to Foote of the Kentucky Department of Public Advocacy.
Sellitti said that, given the novelty of this case, she hopes that if an appeals court were to find an error, the case would be sent back for resentencing.
"I would like to think that when you have an issue of first impression, you have something that is clearly extremely emotionally charged, that if a court found it was improper, it would at least be remanded for resentencing," she said.
Grimm said that more people will likely try to make AI-generated impact statements for deceased victims, but it won't happen in every case, and courts won't be flooded with them.
"I think that in any one particular court, they may get one of these in an entire year," Grimm said.
Bui, the digital forensics expert, noted that generative AI video creation tools are fairly easy to use and vary in cost, with some available for free.
He said that mainstream AI tools like OpenAI's ChatGPT have safeguards built into them to prevent people from making realistic deepfakes.
Tim Wales said he and his business partner Yentzer used a mix of mainstream and proprietary AI tools to create the video of his brother-in-law Chris Pelkey. They had experience making similar videos of deceased CEOs and founders for corporate clients.
Bui added that the quality of generative AI videos largely depends on how much time and money people spend on them.
"The quality of the output really depends on not just the tool but the person using the tool," he said.