
Beware of ‘deepfake’ clients, regulator warns lawyers

Concerns over money laundering and terrorist financing


The Solicitors Regulation Authority (SRA) has issued a new warning about the risk that artificial intelligence (AI), in the form of ‘deepfake’ technology, poses to the legal profession.

As part of its regular risk assessments for anti-money laundering and terrorist financing, the SRA has highlighted the potential risks of deepfake technology alongside other emerging and existing issues.

“Not meeting a client face-to-face can increase the risk of identity fraud and without suitable mitigation such as robust identity verification may help facilitate anonymity,” the warning states.

Whilst “not meeting face-to-face may make sense in the context of a given transaction or wider context”, the regulator adds that “where clients appear unnecessarily reluctant or evasive about meeting in person, you should consider whether this is a cause for concern.”


Firms are also told to be aware of the use of AI to create so-called ‘deepfakes’, which can convincingly impersonate a real person’s appearance.

“This increases the risk of relying on video calls to identify and verify your client. If you only meet clients remotely, you should understand whether your electronic due diligence protects you against this, or to explore software solutions to assist in detecting deepfakes,” the SRA adds.

In a speech last week the second most senior judge in England and Wales, Sir Geoffrey Vos, highlighted the continued growth of AI in the legal profession, and its potential for further expansion.

“One may ask rhetorically whether lawyers and others in a range of professional services will be able to show that they have used reasonable skill, care and diligence to protect their clients’ interests if they fail to use available AI programmes that would be better, quicker and cheaper,” Vos said.

Noting also the potential use of tech in judicial decisions, he added:

“I will leave over the question of whether AI is likely to be used for any kind of judicial decision-making. All I would say is that, when automated decision-making is being used in many other fields, it may not be long before parties will be asking why routine decisions cannot be made more quickly, and subject to a right of appeal to a human judge, by a machine. We shall see.”

Last month Shoosmiths became one of the first law firms to offer guidance to students on the use of AI when making training contract and vacation scheme applications.
