Why Elon Musk’s pigs are a legal headache


By William Holmes

Bristol University student and future trainee William Holmes explores the challenges ahead for brain-computer interface (BCI) systems

Elon Musk (credit: Duncan.Hull via Wikimedia Commons) and Gertrude

Elon Musk’s pig, Gertrude, looks like any other pig. But the coin-sized chip that Musk’s company Neuralink has placed in her brain makes her a key part of a ground-breaking experiment to discover whether technology can enable us to do things with thought alone.

The chip is a brain-computer interface (BCI) which picks up neural activity. Musk hopes to decode this neural activity so that it can be understood as instructions for a computer, allowing BCI users to control a computer with their minds. In other words, BCIs can transform a thought into an act.

For many who have lost certain bodily functions, BCI technology is a scientific miracle. The technology has the potential to treat neurological conditions like dementia or Parkinson’s, restore paralysed individuals’ ability to control their bodies and even allow the blind to see again. But for prosecutors, judges and policy makers, BCIs are a troubling legal headache.

Proving criminal responsibility for most crimes requires the prosecution to prove both a defendant’s criminal act (actus reus) and intention (mens rea). So, how would this work for a defendant who used a BCI to commit a crime? An act is defined in most legal systems as “a bodily movement” (the quote here is from the US Model Penal Code). But a crime committed using a BCI involves no bodily movement. Nevertheless, if we take a neuroscientific approach, this is not an insurmountable obstacle for a prosecutor.

The chain of causation for a BCI user is as follows. First, the user imagines an act that they want the computer to perform (I shall refer to this as a “mental act”). Second, the mental act triggers neural activity, which serves as the input for the BCI. Finally, the BCI interprets this neural activity and performs the act. Just as a finger pulls the trigger on a gun, neural activity triggers the BCI. The neurons that fire and produce measurable neural activity could therefore plausibly be considered the actus reus in cases involving BCI technology. So it appears that a legal loophole in prosecuting disembodied acts can be avoided. But at a price.

By finding actus reus in the activity of a defendant’s neurons, we have been forced to expand the law into the mental sphere. This is a sphere which, in keeping with the Roman law maxim that “nobody shall be punished for thoughts” (cogitationis poenam nemo patitur), the law has traditionally left unregulated. In the UK, freedom of thought is enshrined in article 9 of the European Convention on Human Rights, given effect in domestic law by the Human Rights Act 1998. Given the repercussions for our freedom of thought, is it acceptable to regulate BCIs? If not, can legal systems that only regulate outward behaviour properly maintain the rule of law in BCI cases?

The middle ground between a BCI Wild West and criminalising thoughts is granting BCI users the ability to waive their right to freedom of thought. For those who stand to gain the most from this technology, for example tetraplegics, this may well be a right they are happy to waive. But should an individual be allowed to take such a decision? Legislators would have to step in to clarify who can use BCIs, and judges would have to recognise implied consent from BCI users to waive this right to freedom of thought.


When deciding this, we must not ignore how significant this expansion of government regulation would be. For the first time, certain thoughts or mental acts would be outlawed. As a result, law-abiding BCI users would be forced to think before they think, regulating themselves in an unprecedented way. This is the immediate ‘legal headache’: BCIs force us to weigh the merits of curtailing a human right that is fundamental to democratic society and individual liberty against the need to close criminal loopholes.

There is, however, a second, longer-term ‘legal headache’. Using the brain’s neurons to establish responsibility forces us to reconsider how we determine responsibility more broadly. How we attribute responsibility is (and has always been) a social decision. In some past societies, if an act was compelled or inspired by a divine force, the law did not deem the individual responsible. In societies where artists credited the muses with their work, “God made me do it” was an acceptable waiver of responsibility.

Today, we consider the acting person to be responsible. But this could change in the future, especially if BCIs help push neuroscience to the forefront of the legal system. A recent example of the influence of neuroscience on policy is the Netherlands’ adolescent criminal law, which came into force in 2014. This law allows those aged between 16 and 22 to be tried as an adult or as a juvenile at the court’s discretion. The underlying rationale is neuroscientific: the Dutch system aims to take the mental development of defendants into consideration when sentencing them. This represents a social shift towards seeing the brain as the responsible agent.

This shift, famously critiqued as “brain overclaim syndrome” by Stephen J. Morse, could have some troubling consequences. The data recorded by BCIs (especially from the amygdala, which regulates emotion) offers temptingly persuasive evidence of a defendant’s mens rea and mental state. The question for judges is whether this data is admissible evidence.

A neurocentric legal culture would encourage a judge to admit such evidence. If it is admitted, rigorous cross-examination is vital to ensure that there is clarity about neuroscience’s technical and interpretive limits. For example, there is evidence that factors like parenting and socio-economic status change the way the amygdala and prefrontal cortex function. And the fact that neuroscientific technology is overwhelmingly tested on students from Western, Educated, Industrialised, Rich and Democratic (WEIRD) populations means there is a possible bias in interpreting neuroscientific data. Left unquestioned, these limitations would allow lawyers to advance speculative claims based on competing expert testimony, which could lead juries to false conclusions.

Furthermore, if the brain is considered responsible for criminality, then reform of the penal system is implicit. Assessments of the risk of recidivism, and the way guilty prisoners are treated (whether rehabilitative or punitive), would no longer be based on human nature and character. Instead, neuroscience would nuance our understanding of criminality and how to treat it. The result might not be dissimilar to the Ludovico Technique, the psychological treatment that Anthony Burgess portrays in his dystopian novel A Clockwork Orange.

Gertrude the pig is just the start of a technology that could rewire the legal norms of responsibility and radically change the legal concept of action. In light of this, policy makers and judges must prepare the criminal justice system for the advent of BCIs. There is currently no UK regulation specific to BCI technology, as the British government acknowledged in a report published in January 2020. That is because the technology is still being developed and there are no clear solutions yet. But one thing is for sure: Elon Musk’s pigs promise to be a complex legal headache for scholars, lawyers, judges and legislators for decades to come.

William Holmes is a penultimate year student at the University of Bristol studying French, Spanish and Italian. He has a training contract offer with a magic circle law firm.


