A recent European Court of Justice (ECJ) ruling may affect how privacy, transparency, and open data interact, and it relates directly to the growing discussion about the “right to be forgotten”. Roughly summarized, the ruling finds that organisations which publish information may be obliged to “take down” and remove information when an individual requests that removal, even when the information is true and is a matter of “public record”.
This is potentially a significant change, adding to the work and responsibilities not just of big corporations like Google, but also of the creators of open databases big and small. The so-called “right to be forgotten” undoubtedly encapsulates a justified fear that many of us have about the loss of personal privacy. However, this decision also appears to have the potential for significant (unintended) negative consequences for the publication and availability of key public interest information – the kind of information that is central to government and corporate accountability.
There is more discussion on this and related topics in the area of open data and privacy in the Personal Data, Privacy and Open Data working group.
The Ruling and What It Means
The core of the case was the request by a citizen to have web pages about him dating from 1998 removed from the online newspaper archive of La Vanguardia and, significantly, for the Google Search results linking to those pages also to be removed.
Now, the pages in question contained information that one would normally consider to be a matter of “public record”: specifically, as summarized by the ECJ, they “contained an announcement for a real-estate auction organised following attachment proceedings for the recovery of social security debts owed [by the citizen]”.
The Spanish Data Protection Agency (AEPD), which handled the complaint in the first instance, made what seems a somewhat surprising ruling:
- They rejected the complaint against La Vanguardia, taking the view that the information in question had been lawfully published by it.
- But they upheld the complaint against Google and “requested those two companies [Google Spain and Google Inc] to take the necessary measures to withdraw the data from their index and to render access to the data impossible in the future.”
The ECJ (which rules on points of law, not fact) essentially upheld the legal logic of the AEPD’s decision, stating:
[The] Court holds that the operator [e.g. Google] is, in certain circumstances, obliged to remove links to web pages that are published by third parties and contain information relating to a person from the list of results displayed following a search made on the basis of that person’s name. The Court makes it clear that such an obligation may also exist in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.
At first glance, this decision has some rather substantial implications, for example:
- It imposes potentially very substantial obligations on those who collect and curate “public” (open) data and information – for example, to respond to requests to remove information (and to keep track of such requests going forward to ensure continuing compliance).
- It appears to entitle individuals to request the take-down of information with a strong “public-interest” component. For example, imagine an online database providing information on corporate entities which may list the (true) fact that someone was a director of a company convicted of fraud. Would this ruling allow the director to request their removal?
What is especially noteworthy is that the decision appears to imply that, even if the data comes from an official source (and is correct), a downstream collector or aggregator of that information may be required to remove it (even where the original source does not have to remove the information).
We should, of course, remember that any holder of information (whether an original source or an aggregator) has legal (and moral) obligations to remove content in a variety of circumstances. Most obviously, there is an obligation to remove content if it is false or if private information has been mistakenly published. This already has implications for transparency and open data projects.
For example, in the OpenSpending project information is collected from official sources about government finances, including (in the UK) details of individual spending transactions. It is possible that, by accident, the description of a published transaction could reveal sensitive information about a person (for example, it could be a payment to social services regarding an abused child in which the child’s name is listed). In such circumstances both the original source (the government data) and OpenSpending would have a responsibility to redact the personal information as quickly as possible.
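As a purely illustrative sketch (not part of OpenSpending’s actual codebase), the snippet below shows what such a redaction might look like in practice for a project republishing transaction data from a CSV file. The column names (`transaction_id`, `description`) and the set of flagged identifiers are assumptions for the example.

```python
import csv

# Hypothetical example: transaction IDs flagged (e.g. after a report of
# accidentally published personal data) whose free-text descriptions must
# be redacted before the dataset is republished.
FLAGGED_IDS = {"2011-GB-000451", "2011-GB-000982"}  # assumed identifiers


def redact_transactions(in_path, out_path, flagged_ids, placeholder="[REDACTED]"):
    """Copy a transactions CSV, replacing the description of flagged rows.

    Assumes columns named 'transaction_id' and 'description'; adjust to the
    actual schema of the dataset being republished.
    """
    with open(in_path, newline="", encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["transaction_id"] in flagged_ids:
                row["description"] = placeholder
            writer.writerow(row)


if __name__ == "__main__":
    redact_transactions("transactions.csv", "transactions_redacted.csv", FLAGGED_IDS)
```

The point of the sketch is simply that this kind of targeted redaction is routine and tractable; the harder question, which the ruling raises, is when an aggregator can be compelled to remove information that is accurate and lawfully published at source.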
However, the case discussed here concerned what one would normally consider “public-interest” information. Traditionally, society has accepted that transparency concerns trump privacy in a variety of public interest areas: for example, one should be able to find out who the directors of limited liability companies are, know the name of one’s elected representatives, or know who was convicted of a crime (though we note that some countries have systems whereby an offender’s conviction is, after some period, expunged from the record).
This ruling appears seriously to undermine that principle, whether in theory or in practice.
In particular, whilst a company like Google may dislike this ruling, it ultimately has the resources to comply (in fact, the ruling may even benefit it by raising barriers to entry!). But for open data projects the ruling creates substantial issues: it now seems possible that open projects like Wikipedia, Poderopedia, OpenCorporates or even OpenSpending will have to deal with requests to remove information on the grounds that it infringes personal data protection, even though the information they collect derives only from material published elsewhere and has a clear public interest component.
The everlasting memory of the internet, and the control of our personal data by corporations like Facebook and Google, undoubtedly present huge challenges to our rights to privacy and our very conception of the public/private divide. But we mustn’t let our justified concerns about ancient Facebook photos prejudicing our job prospects lead to knee-jerk reactions that will harm transparency and undermine the potential of open data.
There is more discussion on this and related topics in the area of open data and privacy in the Personal Data, Privacy and Open Data working group.
Excerpt from the ECJ Summary
In 2010 Mario Costeja González, a Spanish national, lodged with the Agencia Española de Protección de Datos (Spanish Data Protection Agency, the AEPD) a complaint against La Vanguardia Ediciones SL (the publisher of a daily newspaper with a large circulation in Spain, in particular in Catalonia) and against Google Spain and Google Inc. Mr Costeja González contended that, when an internet user entered his name in the search engine of the Google group (‘Google Search’), the list of results would display links to two pages of La Vanguardia’s newspaper, of January and March 1998. Those pages in particular contained an announcement for a real-estate auction organised following attachment proceedings for the recovery of social security debts owed by Mr Costeja González.
With that complaint, Mr Costeja González requested, first, that La Vanguardia be required either to remove or alter the pages in question (so that the personal data relating to him no longer appeared) or to use certain tools made available by search engines in order to protect the data. Second, he requested that Google Spain or Google Inc. be required to remove or conceal the personal data relating to him so that the data no longer appeared in the search results and in the links to La Vanguardia. In this context, Mr Costeja González stated that the attachment proceedings concerning him had been fully resolved for a number of years and that reference to them was now entirely irrelevant.
The AEPD rejected the complaint against La Vanguardia, taking the view that the information in question had been lawfully published by it. On the other hand, the complaint was upheld as regards Google Spain and Google Inc. The AEPD requested those two companies to take the necessary measures to withdraw the data from their index and to render access to the data impossible in the future. Google Spain and Google Inc. brought two actions before the Audiencia Nacional (National High Court, Spain), claiming that the AEPD’s decision should be annulled. It is in this context that the Spanish court referred a series of questions to the Court of Justice.
[The ECJ then summarizes its interpretation. Basically Google can be treated as a data controller and …]
… the Court holds that the operator is, in certain circumstances, obliged to remove links to web pages that are published by third parties and contain information relating to a person from the list of results displayed following a search made on the basis of that person’s name. The Court makes it clear that such an obligation may also exist in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.
Finally, in response to the question whether the directive enables the data subject to request that links to web pages be removed from such a list of results on the grounds that he wishes the information appearing on those pages relating to him personally to be ‘forgotten’ after a certain time, the Court holds that, if it is found, following a request by the data subject, that the inclusion of those links in the list is, at this point in time, incompatible with the directive, the links and information in the list of results must be erased.
Image: Forgotten by Stephen Nicholas, CC-BY-NC-SA
Rufus Pollock is Founder and President of Open Knowledge.
Rufus
I agree that the ruling could be used to stifle transparency projects, but I think you are missing a few key elements.
The basic argument is that the Google results under your name form a profile – leaving aside the issues around unique correspondence of names, etc. This means that Google has to make sure that the profile is accurate and relevant. It is the inclusion of information that is no longer relevant, in a prominent position in a profile, that is in question here – not the publication of the information itself.
The ruling makes clear that public interest can override privacy, but it makes this the exception. The only criterion explored by the court is whether the person is a public figure. The problem is that transparency projects may relate to people who avoid the public eye but are nevertheless powerful, so this criterion is not good enough. But future cases can and should expand on the public interest.
Ultimately, it is true that you need a good reason to hold and process my data. Reversing the argument by saying that “transparency concerns trump privacy in a variety of public interest areas” is not very useful either. Each case balancing different rights has to be considered on its own merits.
For Open Data projects, the ruling confirms what European data protection experts (Article 29) have been saying about open data for some time now: data protection applies to published data (unless it is self-disclosed). Wikipedia, Poderopedia and OpenCorporates always had data protection obligations when building profiles of living individuals based on publicly available information. The novelty is to apply this to search engines.
These issues are being discussed in the personal data working group http://personal-data.okfn.org/
Javier: great points (as always!) and I think all clarifications are well taken. As you say, the need to strike some balance between public interest disclosure and the needs of privacy has been (long) acknowledged. The question is how these two needs get balanced and I do think this decision can be seen to bring this trade-off sharply into perspective as well as to raise new issues.
As you rightly point out, Google could be seen as somewhat unusual both because of its ubiquity and power and because of the way it presents the information. I also hope it was clear from the post that there are very genuine privacy issues here – both about what is presented about someone, when it becomes obsolete (when it should be “forgotten”) etc.
Lastly, let me endorse strongly your suggestion for people interested in this to join the Personal Data working group :-)
Posting this on behalf of Tony Hirst, who has struggled with Disqus …
To the extent that the ruling suggests that operators act so as to process personal information and are de facto data controllers of that information, and notwithstanding the public interest exceptions, it strikes me that there may be another form of exemption if the operator is a news organisation (because I think news organisations have exemptions from certain disclosure requirements under, e.g., the Data Protection Act in the UK).
This is tough on watchdog NGOs that aren’t classed as news organisations but that do collect and process personal data on the subjects they are investigating, and that are, I think, subject to personal data disclosure requests etc. from individuals?