The EU regulation may not be the change the world of data protection actually needed, says law graduate Chloe Amies in her shortlisted entry to the BARBRI International Privacy Law Blogging Prize
As most people will be aware, on 25 May 2018 the General Data Protection Regulation (GDPR) 2016 came into force in the UK and the Data Protection Act 1998 ceased to apply. If you are not aware, where have you been? Emails with the subject line ‘We’ve updated our Privacy Statement…’ were a near-daily occurrence in the run-up. However, the regulation may not be the change that the world of data protection actually needed.
It would be basically impossible to argue that UK data protection did not need reforming. Protection of personal data in the UK was covered by the Data Protection Act 1998 and the Privacy and Electronic Communications Regulations 2003. However, these were designed to protect our personal data in an era when the internet was still in its early stages, Facebook hadn’t been invented yet and apps didn’t exist.
We now live in a networked environment where certain technology giants have the negotiating power of what appears to be a small country.
Personal data now has real value: social media corporations offer free services in exchange for it, and buy and sell it. If we did not volunteer personal information to Facebook (and apparently give them the right to harvest our data too?) they would not provide their service for free. Algorithms now make decisions about and for us without our knowledge, and our data is processed seamlessly and invisibly.
Taken together, all of this demonstrates an urgent need for data protection law designed with current technology in mind, and that was the rationale for the GDPR. The GDPR requires greater transparency from those who process personal data and introduces a principle of accountability whereby they must demonstrate compliance with the regulation.
The GDPR has introduced new concepts into data protection law. For example, the principle of ‘Privacy by Design’ is now mandatory, whereas it was previously only encouraged. This requires privacy to be the paramount consideration from the start of a process and throughout. Article 25 GDPR provides that:
“The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.”
It will surprise no one who has studied law that the provision is as vague as ever (but as all good lawyers will know, vagueness allows for flexibility in the law, so it’s fine…). Nevertheless, this is a huge step forward from the Data Protection Act 1998, which did not even envisage the need for such a principle.
The ‘dark side’ of the GDPR relates to its application.
The regulation applies to anyone who processes personal data, regardless of the size of their workforce, turnover, negotiating power and so on. Through working for a small business, I know first-hand that such businesses rely on their contact databases to generate profit, yet the methods used to compile those databases may no longer comply with the GDPR. Smaller businesses that posed little risk of unlawfully processing data, and whose breaches would have had limited consequences in any event, are therefore now restricted in the ways they can generate income. They must also spend hours demonstrating compliance with the GDPR rather than dedicating those hours to profit-making activities.
These businesses often do not have a dedicated legal team, so employees with no legal training must get to grips with these new, vague provisions and work out how to implement them while also doing their contracted jobs. For example, privacy by design has been made mandatory, which is good for us as social media users, but the law does not explain how it should be implemented. It appears that the GDPR may have created unnecessary bureaucracy for the wrong people.
The problem at hand largely concerns social media companies outside the EU sharing data, usually with each other, and using our data unlawfully. Small companies going about their business were not part of the problem and should not have their innovation impeded by a response to a problem they did not create. In my opinion, the answer lies in regulating the activities of companies who assume they are above the law, and subjecting them to higher levels of scrutiny.
To cut what could be a much longer story short, there needs to be a way to stop the CEOs of large companies and social media platforms from acknowledging that the law exists while finding ever new ways to avoid it, without stripping small businesses of their ability to make the money and provide the jobs that their owners have worked hard to generate.
Chloe Amies is a recent LLB graduate from the University of Liverpool, who is going on to study an MA in applied human rights at the University of York.
BARBRI International will be hosting an Independence Day party at its London office on 4 July.