News

Regulate the robots? Law Society’s incoming president launches review


As Society appoints ‘new’ CEO following dramatic resignation of predecessor

The Law Society has launched a review into the impact of robot-esque technologies, such as artificial intelligence (AI), on human rights and justice.

Unveiled today, the public policy commission is being spearheaded by incoming Law Society president Christina Blacklaws, and will involve interviews with tech companies, government and legal experts.

The adoption of AI in policing and the criminal justice system is emerging in a number of areas in England and Wales (the focus of the report). Police forces are piloting facial recognition technology to cross-reference members of the public at certain events (such as the Notting Hill Carnival) against crime data.

Durham Constabulary has been using an algorithm called HART (Harm Assessment Risk Tool) to predict the risk of an individual committing further crimes over a subsequent two-year period. The algorithm then decides whether the individual is a candidate for a specific rehabilitation programme.


There are concerns, however, that such technology and algorithms can be biased and raise human rights issues. Liberty, the civil liberties organisation, is backing a crowdfunded challenge by campaigner Ed Bridges against the use of facial recognition by South Wales Police.

The Chancery Lane commission will investigate these and other examples of AI to examine what sort of framework is needed “for the use of big data and algorithms to protect human rights and trust in the justice system”.

Blacklaws, currently the Law Society's vice president and a commissioner of the report, commented that the use of AI "could — and sometimes does — keep us safer, preserve scarce resources and expand the reach of increasingly stretched law enforcement". But she also warned:

“The design, sale and use of algorithms to deliver justice or maintain security also raises questions about unconscious bias, ethics and rights. Further potential risks may emerge when an algorithm is developed by a business focused on profit rather than by an organisation focused on delivering justice.”

Co-commissioners of the report, which is expected in early 2019, are Birmingham Law School professor Sylvie Delacroix and University College London computer science expert Professor Sofia Olhede.

The Law Society also announced today that interim boss, Paul Tennant, will stay on as chief executive. Tennant took the helm last year following the dramatic departure of Catherine Dixon in early 2017. Dixon quit during a spat with the 100-strong Law Society Council which, she argued publicly, blocked her attempts at much-needed governance reforms.


3 Comments

Law Tech Queen

Under GDPR people will soon say they object to being ‘subject to automated decision making’, which includes profiling. How will all of that work out then?


Anonymous

All this red tape mumbo jumbo. Why can’t we go back to the good old days when you could leave your front door open and bathe at the local pool topless without worrying about some punk taking photographs with their phone?


Dave

Big Brother Watch are also challenging police facial recognition – they’ve written to the Met and the Home Secretary: https://www.crowdjustice.com/case/face-off/


