Journal

Clause 8(e): The Cambridge Analytica enabling clause

Law student Joe Ferris delves deep into the Data Protection Bill in his runner-up entry to the BARBRI International Privacy Law Blogging Prize

When a major scandal emerges, the refrain of ‘something must be done’ usually arrives shortly after. Thankfully, something is already being done. The Data Protection Bill is currently making its way through the legislative process and, at the time of writing, is about to enter the report stage in the House of Commons. However, the bill is not without issue.

The Cambridge Analytica scandal

In 2014, University of Cambridge researcher Aleksandr Kogan, in collaboration with Cambridge Analytica, used an app called ‘This Is Your Digital Life’ to pay around 270,000 Facebook users to take a personality test. With their consent, the app harvested those users’ data. However, it also harvested the data of their friends, resulting in the collection of up to 87 million individuals’ data. Cambridge Analytica was later hired by the pro-Brexit campaign group Leave.EU in 2015 and by Trump’s presidential campaign in 2016. It is currently unclear to what extent this data was used in those campaigns, but a whistleblower has attested that it was used extensively.

Can Cambridge Analytica’s behaviour be repeated?

Prior to 2015, Facebook allowed app developers to collect user data through their own apps, including the data of users’ friends, on the condition that it was used to improve the in-app experience and not for advertising or resale. Facebook changed these rules after 2015, removing this ability. Cambridge Analytica broke the user agreement by misusing the data it collected for advertising. That is not to say what was done cannot be repeated, but it cannot be repeated in the same way.

Data can still be harvested via an app in largely the same manner, though the collection can no longer be so vast from a relatively small pool of participants. Data can be bought but, again, it is unlikely to be on such a large scale. Even so, stronger data protection laws are at the core of the European General Data Protection Regulation (GDPR) and the Data Protection Bill.

The legislative framework

The Data Protection Bill will implement the vast majority of the GDPR. The bill contains a plethora of changes, focused on giving individuals greater access to the data that companies hold about them, as well as giving the Information Commissioner’s Office greater power to uphold information rights.

Some notable changes include the requirement for businesses to obtain a ‘positive opt-in’, making clear that consent is being given, whenever they intend to rely on consent as the lawful basis for using a person’s information. The £10 charge for a subject access request will be abolished, and individuals will have greater powers to request the erasure of their data. The GDPR also increases the severity of fines for organisations that mishandle an individual’s data. The maximum fine rises from £500,000 to up to €20 million or 4% of the firm’s global annual turnover (whichever is greater), with a lower tier of €10 million or 2% for less serious infringements.


In general, the stricter regulation of data protection under the GDPR can be seen as a victory for the privacy-conscious social media user. What is troubling, though, is not the prospect of what might yet enter the bill, but what it already contains. Some parts of the Data Protection Bill have caused concern, most notably clause 8(e).

Data Protection Bill: The problem with clause 8(e)

This subtly concerning clause emerged in the draft bill after an amendment was agreed. Clause 8(e) states that an “activity that supports or promotes democratic engagement” is an example of processing of personal data that may be treated as necessary for a task carried out in the public interest, and therefore lawful. This clause gives good reason for concern.

First, with such an extremely wide scope of activity potentially falling within this provision, it is capable of covering the campaigning tactics of Cambridge Analytica. This has the effect of legitimising that behaviour, potentially making it an enabling provision.

It would not be outrageous to suggest that the amendment was inserted out of concern that politicians would hamstring themselves when processing data during campaigns. The ease with which data can be used tactically during campaigns will no doubt be something politicians are keen to preserve. It is easy to see this as a politically self-serving amendment, rather than one in the spirit of the GDPR.

Furthermore, recital 45 of the GDPR does not afford the extremely wide ambit of all democratic activities that clause 8(e) does. Bundling clause 8(e) with clauses (a)-(d) gives it the enormous scope those deliberately wide clauses cater for. However, clause 8(e) is far more susceptible to abuse, which could well take the form of Cambridge Analytica’s strategies.

Second, because clause 8(e) applies to any data controller, the amendment sits uneasily with the processing of political opinions, a special category of data under article 9 of the GDPR, the processing of which is reserved for registered political parties rather than any data controller. The clause consequently puts a much looser constraint on the processing of such data.

More troubling still is that, whilst clause 8(e) was inserted into the Data Protection Bill on 13 March 2018 and the Cambridge Analytica exposé was published a matter of days later, on 17 March, successive hearings have shown no indication of clause 8(e) being amended.

Concluding remarks

Whilst the Data Protection Bill introduces a largely welcome arsenal of tools to ensure data is handled properly, clause 8(e) is a worrying exception. Its primary problem is its scope, and the potential for it to become a form of protection for the tactics of Cambridge Analytica. That scope should be dramatically reduced, and with ample opportunity for the change to be made, there is little excuse for the clause to be left in its current form.

Joe Ferris is an LLB graduate from the University of East Anglia who completed the BPTC at BPP Law School. He will commence a master’s degree at Cambridge later this year.

BARBRI International will be hosting an Independence Day party at its London office on 4 July. Register to attend here.

Please bear in mind that the authors of many Legal Cheek Journal pieces are at the beginning of their career. We'd be grateful if you could keep your comments constructive.

6 Comments

Anonymous

Decent article.

Anonymous

It is a good article.

What is harder to write is “the cutting edge of the information gathering technology companies have the capacity and the inclination to do x y and z in the medium term, such is the demand from their clients, and the patents they have filed”

This is how the gdpr fasten on to those aims: x y and z

So you can see how it enables or disables those aims.

Further, IT companies are now able to harness something (e.g. keystrokes or algorithms) and this something will yield the same fruit as personal data, without the information being classified as such.

Those with an eye for private equity expansion and legislation which will have been pored over by corporate lawyers will expect warfare, rather than peacetime from this 2018 statute, where data is concerned, because it is regarded as the new oil.

People’s rights to oil versus corporate rights to it are unlikely to be changed by the financiers’ favourite toy, the eu. Expect the gdpr and facebook to continue to feed the machine

Anonymous

What are you on about?

Alan

It isn’t the most articulate, but I got the message.

Anonymous

Why would data about keystrokes (*from* algorithms?) not be classified as personal data?

Anonymous

Maximum fine is €10/€20 million or 2%/4% (dependent on the provision breached). I also don’t think there are clear conflicts with Article 9, as a separate lawful basis would need to be satisfied for special category data (e.g. political opinions) – so the scope to actually do anything with 8(e) could be low.

Anyone looking to exploit personal data in this way would still be required to meet the principle of fairness and transparency before processing – so providing the information required by Articles 13/14 would be difficult, as they’d have to be open about the uses of the data.
