
Watch out for ‘deepfake’ evidence forgery, family lawyer warns


Byron James says his client’s voice was manipulated to create fake recording of threat to other side

Lawyers can no longer take video or audio evidence at face value thanks to the rise of “deepfake” technology, a barrister who uncovered a fake recording of his client has warned.

Family lawyer Byron James says that voice forging software was used to create a fake recording of his client threatening another party to a dispute — and reckons that fellow practitioners should be on the alert.

James, a partner at cross-border family specialists Expatriate Law, says that “it is now possible, with sufficient content, to create an audio or video file of anyone saying anything”.

Deepfakes rely on the power of machine learning to create super-sophisticated fake footage. Software now widely available can be fed old footage of, say, former President Obama giving a speech and spit out a new video of the same POTUS spouting things he never said.

While high-profile hoaxes can be quickly exposed by mainstream media, James warns that lawyers and judges used to taking recorded evidence at face value might not be disposed to question a piece of footage in day-to-day practice.

In an article soon to be published in the International Family Law Journal, James explains how his interest in the topic was piqued when his own client was hit with a deepfake.


The Dubai-based barrister describes how his client was alleged to have threatened another party over the phone, but was adamant that he had never uttered the alleged threat. The matter seemed to be put beyond doubt when “an audio file was produced which included a recording using the precise words my client had been accused of”. James continues:

“This is always a difficult position to be in as a lawyer, where you put corroborating contrary evidence to your client and ask them if they would like to comment. My client remained, however, adamant that it was not him despite him agreeing it sounded precisely like him, using words he might otherwise use, with his intonations and accent unmistakably him. Was my client simply lying?”

“In the end, perhaps for one of the first times in the family court, we managed to prove that the audio file had not been a faithful recording of a conversation between the parties but rather a deepfake manufacture.”

The journal article takes lawyers through the creation of a deepfake step by step, in what James says is a “simple” process with the right software. “With practice”, James says, “a deepfake video can be so plausible that even an expert may not be able to readily identify it as manufactured”.

But help is at hand: the expat lawyer has researched different ways of exposing a deepfake, including by checking the original file. “This should always be requested in any case where such evidence is sought to be relied upon”, James comments. The full low-down will be available in the March edition of the International Family Law Journal.
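To give a flavour of what “checking the original file” can involve, here is a minimal, hypothetical sketch in Python. It is not James’s method — the article does not describe one — but it illustrates two basic checks a party might run on a disputed recording: a cryptographic hash that pins down the exact bytes of the file, and a look at the audio container’s metadata (channels, sample rate, duration), which should be consistent with how the recording was supposedly made. The file `original.wav` is fabricated in the snippet purely so it runs end to end.

```python
import hashlib
import wave

# Hypothetical stand-in for the "original" recording: one second of
# silent, mono, telephone-quality audio written to disk.
with wave.open("original.wav", "wb") as w:
    w.setnchannels(1)       # mono, as a phone call would be
    w.setsampwidth(2)       # 16-bit samples
    w.setframerate(8000)    # 8 kHz sample rate
    w.writeframes(b"\x00\x00" * 8000)  # 8000 frames = 1 second

# A SHA-256 digest identifies the exact bytes: if the file produced in
# evidence hashes differently from the claimed original, they are not
# the same file.
with open("original.wav", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Container metadata should match the claimed provenance of the
# recording (e.g. a mobile call app's typical format and length).
with wave.open("original.wav", "rb") as w:
    channels = w.getnchannels()
    rate = w.getframerate()
    duration = w.getnframes() / rate

print(f"sha256: {digest}")
print(f"channels={channels} rate={rate}Hz duration={duration:.1f}s")
```

Neither check proves a file is genuine — a careful forger can fabricate plausible metadata — but mismatches between the produced file and the claimed original are exactly the sort of red flag that justifies instructing a forensic audio expert.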


2 Comments

An extra large margherita pizza

Kudos to Byron for speaking out about this.

Family law proceedings frequently involve unstable and desperate clients who may care more about harming one another than about any children they’ve had.

Faked ‘evidence’ doesn’t surprise me in the slightest.

Anon

Given the family division’s casual approach to evidence, some congratulations are in order for getting the judge to take these submissions seriously.
