The right to erasure (‘right to be forgotten’) was introduced by Article 17 of the General Data Protection Regulation (GDPR) to enhance legal certainty about privacy for data subjects. Although the provision seeks to strike a better balance between the obligations of controllers and the rights of data subjects, it remains questionable whether the right can be fully exercised in a digital era in which information spreads almost instantaneously and technology evolves in ways that are rarely friendly to most of its users. The right to be forgotten is one of the mechanisms introduced by the GDPR to give data subjects more control over their personal data and privacy, but questions have arisen as to how it affects both data subjects and controllers.
Before the GDPR, data subjects’ control over their personal data was governed by the European Data Protection Directive, under which the right to be forgotten could only be extracted implicitly from an interpretation of Articles 6(1)(e) and 6(1)(d) – what scholars have called the “passive right to be forgotten”. In summary, these provisions required that personal data be kept by controllers only for as long as necessary to achieve the purposes for which they were collected, and be deleted, erased or rectified once that purpose had been fulfilled. This regime, however, left controllers wide latitude to retain the personal data they had collected, prompting regulators to treat the matter differently under the GDPR. What the regulators ultimately placed at the centre of personal data, privacy and the right to erasure (to be forgotten) was the data subject’s consent.
The ex-post empowerment approach adopted by the GDPR for the ‘right to erasure’ has prompted discussion of how the right is to be enforced. The criticisms concern the range of responsibilities the approach imposes on both the data subject and the controller. While there is no doubt that the right is an important tool in the hands of data subjects, the ex-post approach can also be argued to devalue the right, since the primary responsibility rests on individuals who are mostly unwilling to exercise it. Potential reasons include the “very limited individual impact (or at least the difficulty in measuring it) and the apparent high threshold for taking legal action”. Against this, there is also the controller-centric view that the modalities of enforcing the right hinder the controller’s economic operations.
In a typical scenario, controllers are multinational corporations with asymmetric bargaining power over data subjects. While they are well placed to negotiate terms, the CJEU’s decision in the Google Spain case, followed by the GDPR, obliges controllers to effect erasure when requested. Controllers are now burdened with hundreds of thousands of requests for erasure of personal data, and the internal handling of such requests imposes substantial costs on internet service providers. The volume of requests is expanding to the point that no judiciary would have the resources to handle that amount of work. With no meaningful appellate system in place, controllers are likely to erase links or data whenever in doubt, which adversely affects freedom of expression and the right to information. A further question is whether controllers compromise security in order to fulfil the GDPR objective of allowing data erasure. An example can be found in Apple’s defence that once a user’s Siri voice recording is uploaded to Apple’s servers, its link to the user’s account is severed, so the recording cannot be located thereafter.
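The de-linking problem can be illustrated with a minimal sketch, which is not Apple’s actual implementation (whose details are not public) but a hypothetical store in which records are filed under a random token and the token-to-account mapping is later discarded; once that mapping is gone, a subsequent erasure request can no longer locate the data it is meant to erase.

```python
import secrets

# Hypothetical store of uploaded recordings, keyed by a random token
# rather than by the user's account identifier (illustrative only).
recordings = {}          # token -> recording payload
account_to_token = {}    # transient mapping, later discarded

def upload(account_id: str, recording: bytes) -> None:
    token = secrets.token_hex(16)         # random, non-reversible identifier
    recordings[token] = recording
    account_to_token[account_id] = token  # the only link back to the account

def delink(account_id: str) -> None:
    # Severing the link: after this, nothing maps the account to its data.
    account_to_token.pop(account_id, None)

def erase(account_id: str) -> bool:
    # An erasure request can only reach data that is still linkable.
    token = account_to_token.get(account_id)
    if token is None:
        return False                      # data persists but cannot be located
    del recordings[token]
    del account_to_token[account_id]
    return True

upload("alice", b"hey siri ...")
delink("alice")
print(erase("alice"))  # False: the recording survives, unlocatable by account
```

The sketch shows the tension in the text above: the same de-identification that is offered as a privacy safeguard also makes it impossible to honour a later, targeted erasure request.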
The territorial construction of the right to be forgotten is often considered an inadequate solution. As the position now stands after Google LLC v CNIL (2019), de-referencing has to be undertaken only on those versions of a search engine (the data controller) that correspond to the EU Member States. This undercuts the suggestion put forward by the Article 29 Working Party in 2014 that de-listing should have extra-European effect. The problem is further compounded by differences in the domestic laws of various countries, which make it practically impossible to bind nations to an overarching data protection regime.
The internet has an unforgiving memory and behaves more like quicksand: the harder one struggles to get out, the deeper one sinks. The claimant in the Google Spain case, Mario Costeja González, in asserting his claim to be forgotten, is now remembered by the internet, with more than 70,000 search results returned for his name alone, almost always associated with his case and the debts he tried so hard to make the internet ‘forget’. Moreover, the original information about Mr Costeja González was never deleted; only the link in the search engine’s results was removed. Although the Regulation gives individuals the chance to claim this right, it can remedy the disclosure only at the level of the search engine.
The right to be forgotten tries to tackle an important problem, but with a very blunt instrument that could also have a chilling effect on access to information and, ultimately, amount to censorship. Looking at government requests from the UK asking Google to remove content over the past twelve years, a total of 118,991 items have been removed since 2009, for reasons including, but not limited to, privacy, national security, defamation, violence and trademark. The decisions to remove these items are made by Google itself and are entirely controlled by its own internal decision-making process. This reveals an uncomfortable shift: the Regulation has moved the responsibility for protecting privacy, and for implementing censorship, from the judiciary to private businesses that regulate themselves and the information we receive. There is no regulatory obligation or process requiring the search engine, when handling such a request, to take into account any response from the party that originally published the information, which makes these decisions quite arbitrary.
This brings us to the question of what it means to ‘forget’ in an era of artificial intelligence and social media, in which personal data is remembered in a multitude of forms and has become a currency of the internet. It is the “panopticon beyond anything Bentham ever imagined”. Memory in technology is vastly different from remembering in the human sense, and hence the right to be forgotten, while seemingly straightforward on paper, does not translate in the same way for machines and technology.
In machine learning environments, current methods of implementing data privacy through data anonymisation, minimisation and even deletion come at the cost of a loss of functionality in the technology itself. While the right under the data protection framework focuses on legitimacy grounds, purpose and the right to object, much more of our data remains and continues to be used by other means, and it may well be impossible for individuals to track, identify and remove all of it. As a purely reactive measure, the current legal framework does not provide sufficient safeguards; it places the burden on individuals to claim and request to be forgotten in a realm that lies beyond the jurisdiction of the framework itself. Ultimately the aim should be to integrate legal and technical approaches so as to make the right to be forgotten workable. The solution is not to regulate technology to promote privacy, but to promote privacy by using technology to regulate itself.
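The functionality cost of deletion can be made concrete with a minimal sketch. It assumes scikit-learn and a synthetic dataset, and it is not any controller’s actual pipeline: the most conservative way to honour an erasure request against a trained model is to drop the data subject’s records and retrain from scratch, after which the retrained model’s accuracy can be compared with the baseline as more subjects are erased.

```python
# Sketch of "unlearning" by full retraining after erasure requests.
# Dataset, subject IDs and model choice are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
subject_ids = np.arange(len(X)) // 10   # pretend every 10 rows belong to one data subject
X_tr, X_te, y_tr, y_te, ids_tr, _ = train_test_split(
    X, y, subject_ids, test_size=0.25, random_state=0)

def train(features, labels):
    return LogisticRegression(max_iter=1000).fit(features, labels)

baseline = train(X_tr, y_tr)
print("accuracy with all data:", baseline.score(X_te, y_te))

# Simulate erasure requests from a growing number of data subjects.
rng = np.random.default_rng(0)
for n_subjects in (10, 50, 100):
    erased = rng.choice(np.unique(ids_tr), size=n_subjects, replace=False)
    keep = ~np.isin(ids_tr, erased)
    retrained = train(X_tr[keep], y_tr[keep])  # discard the old model, retrain without the erased subjects
    print(f"accuracy after erasing {n_subjects} subjects:",
          retrained.score(X_te, y_te))
```

Even in this toy setting, the point stands that each erasure request forces the controller to choose between retaining influence of the data (the model still reflects it) and paying the retraining and accuracy cost of genuinely removing it.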
References
- Hans Graux, Jef Ausloos and Peggy Valcke, ‘The Right to Be Forgotten in the Internet Era’ (2012) ICRI Research Paper No 11 <https://ssrn.com/abstract=2174896> or <http://dx.doi.org/10.2139/ssrn.2174896>.
- Jef Ausloos, ‘The Right to Erasure in Practice’ in The Right to Erasure in EU Data Protection Law (OUP 2020) <https://oxford.universitypressscholarship.com/view/10.1093/oso/9780198847977.001.0001/oso-9780198847977-chapter-8> accessed 21 February 2021.
- Michael Douglas, ‘Questioning the Right to Be Forgotten’ (2015) 40 Alternative LJ 109.
- Jeffrey Rosen, ‘The Right to Be Forgotten’ (2011–2012) 64 Stan L Rev Online 88.
- Urs Gasser, ‘Recoding Privacy Law: Reflections on the Future Relationship Among Law, Technology, and Privacy’ (2016) Law, Privacy & Technology Commentary Series.
- Tiffany Li, Eduard Fosch Villaronga and Peter Kieseberg, ‘Humans Forget, Machines Remember: Artificial Intelligence and the Right to Be Forgotten’ (2017).
- Rolf H Weber, ‘The Right to Be Forgotten: More than a Pandora’s Box’ (2011) 2 J Intell Prop Info Tech & Elec Com L 120.