Many of us will remember the sense of foreboding induced by the simple threat, usually uttered by a teacher: ‘This will go on your permanent record’. This administrative bogeyman exploits our early awareness of the importance of being able to leave some things in the past.
While some of the things we do may go on public record at various points in our lives, it used to be possible to comfort ourselves with the thought that these would soon be buried deep in the archives, where most people would not care to invest the time to search. In the digital age, however, search engines do the digging and can serve up previously long-forgotten results on a simple search of a person’s name.
The ready availability of information that in a previous age would have been forgotten makes many of us worry that our mistakes can no longer truly be left in the past.
But what claims do we have against information being brought back to light in this way? Do we have a right to be forgotten?
Article 17 of the EU’s General Data Protection Regulation (GDPR) has been interpreted in just this way. The right to erasure, also known as ‘the right to be forgotten’, gives individuals the right to have their personal data erased by data controllers under certain circumstances.
One of the most controversial aspects of Article 17 is the way in which it has been applied to search engine operators like Google. A landmark case gave individuals the right to request that search engines prevent certain results from being shown when their name is entered as a search term.
This data protection measure has been defended on the basis that we do have some right to be forgotten. If there is such a general right, it could not plausibly impose obligations directly on others to forget certain facts about us. What and who we remember is not something we can directly control. If we subscribe to the principle that ought implies can, we cannot be obliged to do something which is not within our control. Even if we could forget things at will, there would be deep problems with enforcing a right to be forgotten, given that this would involve policing how people organise information in their own minds.
A right to be forgotten could, however, give us claims pertaining to how information is stored and presented by others. The thought is that once information is deleted from public records, it is more likely ultimately to be forgotten.
In a recent paper, ‘Privacy, Publicity, and the Right to be Forgotten’, I’ve argued that data protection measures such as the right to erasure can be defended on the basis that we have certain claims concerning how information about us gets presented, rather than claims to have that information deleted.
Instead of focussing on what should be remembered and what forgotten, I suggest that we should focus on how things are remembered. That gives us a different perspective from which to address how we ought to shape our information-sharing practices, beyond the binary of store/delete.
This approach is not strictly in tension with the thought that it would be good for some things to be forgotten. When information is framed a certain way – as outdated, for example, rather than currently relevant – that framing may make it more likely for the information in question to slowly fade from people’s memory. But it rejects the thought that the ultimate goal of our information-sharing practices ought to be complete deletion.
Problems with the right to be forgotten
Several scholars have made the case in favour of the right to be forgotten by drawing on the idea that the forgetting of personal information over time has long been an important aspect of our cultural practices. Such gradual forgetting, it is argued, has been crucial to our ability as individuals to lead adequately autonomous lives, unencumbered by mistakes or embarrassing mishaps from our past.
From this perspective, the widespread accessibility of old information made possible in the digital age represents a radical and destabilizing shift, which threatens our ability to live our lives on our own terms as the years go by.
Responses to this problem have focussed on how to strike a balance between protecting individual privacy, on the one hand, and the public interest in information, on the other. Proposed solutions include setting an expiry date on personal data, after which it would automatically get deleted (Mayer-Schönberger 2011); regulators directing data controllers when to delete personal data (Sartor 2014); or giving individuals the power to delete any of their personal data at any point (Jones 2018).
A concern with these approaches is that they seem to amount to shutting the stable door after the horse has bolted. Can we really make personal information private again after it has been made public in some way?
Even if we could, there remain objections to allowing information to be deleted if it was made public through legitimate means in the first place. Critics have argued that this goes too far, undermining the important interest in freedom of information (Post 2018).
The ruling that applied the right to be forgotten to search engine results tried to strike a balance between the individual’s right to be forgotten and the interest in freedom of information.
The case in question involved Mario Costeja González, who had defaulted on a debt many years earlier. The details of the foreclosure on his property had been reported in a newspaper notice, which still showed up in search results when his name was Googled many years later.
The court ruled that the search result should be removed from the Google search results page associated with his name, but that the original notice should remain in the newspaper’s digital archives.
As a general strategy, we can see how this could make it more likely that the details of the foreclosure would be forgotten over time, while ensuring that the information remains on record in case there is a legitimate need to retrieve it.
However, this particular instance generated a version of the Streisand effect. The court ruling was widely reported. As a result, when you now Google Costeja González’s name, links to newspaper articles about the court ruling are displayed, which include details of the original foreclosure.
One reaction to this might be to accept that while the original test case was bound to backfire for the plaintiff, the general strategy of suppressing search engine results while retaining information in original archives is still a good way to balance the right to be forgotten with the interest in freedom of information.
However, the result of the test case reveals that there will always be some tension between the demands of the right to be forgotten and the reasons that require us to preserve important information.
Taking a different approach to what is at stake for the individual in such cases helps us to avoid the worst of this tension. This involves stepping away from the idea of a right to be forgotten, and instead focussing on what claims individuals might have to how things get remembered.
The importance of how we remember
The concern that motivates arguments in favour of the right to be forgotten is that the preservation of information in the digital age makes it harder for us to lead autonomous lives without being constantly encumbered by mistakes from our past.
What I suggest is that how things are remembered can have a significant bearing on the extent to which we feel shackled to the past in this way. In particular, it matters whether we continue to be held accountable for things in our past.
Thomas Nagel has written about the importance of ‘norms of reticence’ (Nagel 1998). These apply in situations where everyone knows some piece of information, but is also aware that it is not an appropriate thing to bring up in certain contexts.
These norms of reticence can be understood as part of our practices of accountability. Even when we all remember a person’s past misdeed, there usually comes a point when we also recognize that it is no longer appropriate to hold that person accountable for it.
Some ways of sharing and presenting information can disrupt this process. For example, search engines do this when they display links to certain information at the top of the search results associated with a person’s name. Google’s search results page is not a neutral index; it is a ranking by relevance. The implicit message in displaying a search result prominently is to say: this is highly relevant information about this person.
From this perspective, the concern about search engines making old information accessible is not about the information being out there, but about how that information is presented. Accordingly, the remedy need not be to ensure that information is forgotten, but instead to ensure that it is shared in ways that align with appropriate practices of accountability.
This allows us to say that the Streisand effect in the Costeja González case did not render the original ruling entirely self-defeating. The information about the original foreclosure is still prominently available in the top search results associated with the plaintiff’s name. But those newspaper articles also include the fact that the foreclosure was considered to be outdated information. In other words, they make it clear that it is no longer appropriate to hold Costeja González accountable for defaulting on his debt all those years ago.
We need to be able to leave our mistakes in the past. But this does not require them to be forgotten. What we need is to ensure that our information-sharing practices align with appropriate practices of accountability.
References:
Jones, Meg Leta. 2018. Ctrl+Z: The Right to Be Forgotten. New York: NYU Press.
Mayer-Schönberger, Viktor. 2011. Delete: The Virtue of Forgetting in the Digital Age. Princeton: Princeton University Press.
Nagel, Thomas. 1998. Concealment and exposure. Philosophy and Public Affairs, 27, 3–30.
Post, Robert C. 2018. Data privacy and dignitary privacy: Google Spain, the right to be forgotten, and the construction of the public sphere. Duke Law Journal, 67, 981–1072.
Sartor, Giovanni. 2014. The right to be forgotten: dynamics of privacy and publicity. In Luciano Floridi (ed.), Protection of Information and the Right to Privacy: A New Equilibrium?, pp. 1–15. Cham: Springer International Publishing.