Deepfakes Could Be Arbitration's Next Gen AI Shake-Up

(May 13, 2024, 10:53 PM EDT) -- In a high-stakes arbitration, lawyers for one of the companies present what they say is surveillance video of a bribe being accepted by its opponent's president. They argue the video presents incontrovertible evidence that the case should be decided in their client's favor — and a tribunal might be inclined to agree. But what if it turns out that the video is a fake, generated by artificial intelligence?

This is a not-so-distant problem foreseen by some arbitration experts. Deepfakes — AI-generated audio and video hoaxes — aren't easily detected by the untrained eye. They can be convincing enough to mimic a person's accent, speech patterns and mannerisms almost perfectly. And, experts say, the cost of creating them is lower than one might think.

"The challenge here is that for the naked eye, [deepfakes] are almost impossible to decipher," said Orlando F. Cabrera C., a senior associate with Hogan Lovells who sat on a Silicon Valley Arbitration & Mediation Center task force that issued a set of guidelines for AI in arbitration last month. "And this is just the outset. Imagine within five or 10 years — [deepfakes] will be indistinguishable."

As of right now, the issue of deepfakes appears to have gotten lost in the broader debate over how the legal profession should approach generative AI technologies like ChatGPT. Although many arbitral institutions have begun providing guidelines on how parties and counsel should use such chatbots, the specific issue of deepfakes doesn't appear to be on most people's radars.

But it deserves more attention, experts say.

"We're at a stage here where the technology is almost developed enough to really be dangerous. It should be addressed before we have the big scandal that makes everybody think, 'Oh my God, we need to fix this now, and why didn't we have a very robust system in place to catch this already?'" said Sean McCarthy, an international arbitration consultant and a co-founder of ArbTech, a worldwide online community forum focusing on technology, dispute resolution and related issues.

Holland & Knight LLP partner Adolfo Jimenez, who leads his firm's international arbitration and litigation team, felt similarly.

"I personally think that deepfakes could become a major issue and problem in international arbitration," he said. "As much as you want to think that businesses don't engage in that kind of activity ... in my experience, international arbitration has not been immune from forgeries, and deepfakes are just a very, very sophisticated forgery."

The example provided at the beginning of this story is just one scenario for how deepfakes could be used in international arbitration. A deepfake video could also be created to depict an individual saying virtually anything.

For example, a company called HeyGen can create an avatar using just a two-minute recording of someone speaking to the camera, McCarthy noted in an article posted last month on the Kluwer Arbitration Blog. A video could then be created of that avatar saying whatever script is provided — even recreating facial expressions, mouth movements and body language, according to McCarthy's article.

The rise of Zoom hearings during the pandemic also makes it possible for video testimony to be faked in real time, meaning an avatar could respond to questions by drawing on customizable sources of information, McCarthy wrote in his article.

"The effort and the cost involved to do a recreation that looks pretty real is so low, that I wouldn't be surprised if we see more people trying their luck at it," McCarthy told Law360, adding that the often grainy and blurred backgrounds that appear on video platforms like Zoom and Teams make them an easier task for AI to recreate.

Still, he added, such techniques and other deepfakes probably wouldn't work in every case.

"I think it can absolutely happen in lower-profile cases or cases where the parties aren't used to technology ... but [in] the highest-profile cases, it's unlikely that you're going to be able to fool a full set of law firm teams and a full tribunal and any of the people that are helping them," McCarthy said. "So that's probably the final frontier."

Jimenez told Law360 there are several reasons why international arbitration in particular could be more prone to bad actors using deepfakes. For one thing, discovery in international arbitration is more limited than it is in litigation. So if a photo is provided of a supposedly nefarious act, for example, opposing counsel probably wouldn't be able to depose the person who took the photo, he said.

A second aspect unique to international arbitration is that tribunals have a limited ability to penalize wrongdoers; they have no authority to find a party in contempt. Arbitration is also largely confidential, particularly in the commercial space, so there is little press scrutiny to deter bad actors, Jimenez said.

There are tools currently available to reduce the risk of deepfakes in international arbitration. One such tool made by Intel, called FakeCatcher, analyzes "blood flow" in video pixels to return results in milliseconds with 96% accuracy, according to an Intel press release.

Perhaps more important, however, is simply raising awareness in the arbitration community, according to Hogan Lovells' Cabrera.

"One of the purposes of drafting the [Silicon Valley Arbitration & Mediation Center AI] guidelines was to raise awareness in the legal community, and more specifically [within] the arbitral community, that artificial intelligence is real," he said. "It's there, and we cannot be oblivious of the fact that some people may use [it], and it's going to be a challenge."

--Editing by Alanna Weissman and Lakshna Mehta.

For a reprint of this article, please contact reprints@law360.com.