The European Court of Justice is due to hand down its long-awaited judgment in Google v Spain (Case No C-131/12) at 9.30am on 13 May 2014. In this post, Professor Lorna Woods sets out the background to the decision.
The case before the ECJ arises from a reference from a Spanish appeal court, the Audiencia Nacional, and concerns the applicant’s claim that Google should remove certain information from its search results. In 1998, a Spanish newspaper had – at the direction of the relevant Spanish court – advertised that property of the applicant was to be auctioned in relation to the applicant’s social security debts.
The applicant paid the debts and the foreclosure did not take place. Yet, approximately a decade on, a search on the applicant’s name revealed the newspaper notice of the foreclosure. The applicant applied to the Spanish Data Protection Authority (AEPD), asking for an injunction against both the newspaper and the search engine. The AEPD dismissed the claim against the newspaper (which was under a legal obligation to publish the official notice), but issued an injunction against Google Spain SL and Google Inc. to delete the data from the search engine’s index. Google appealed to the Audiencia Nacional which referred 9 questions on the interpretation of the Data Protection Directive (DPD) and the Charter of Fundamental Rights to the ECJ. Note also that the e-commerce Directive, which normally provides safe harbours for internet intermediaries, excludes data protection issues from its scope.
This case is not an isolated incident; reports suggest that there are 130 similar cases in Spain, and the problem of the longevity and prominence of old and sometimes inaccurate or misleading information does not affect only Spanish citizens. The questions referred, while relating to the interpretation of the DPD, raise fundamental questions about the nature of the internet, who makes decisions about content, and on what basis.
The hearing took place on 26th February 2013 and the Opinion of the Advocate General was released on 25th June 2013.
The questions of legal interpretation referred can be divided broadly into three categories:
- Territorial application of the DPD and the national implementing rules
- Nature of Google’s activities
- Obligations under the DPD, namely the existence of the ‘right to be forgotten’.
As regards these questions, the Opinion of the Advocate General can be summarised as follows:
- Yes, the DPD and the Spanish implementing rules can apply to Google in these circumstances;
- Yes, Google processes data, but no, it is not a data controller; and
- Even if the DPD applies to Google’s activities, there is no ‘right to be forgotten’.
Territorial Application
Normally, national laws (here the Spanish rules implementing the DPD) do not apply outside the territory of the state enacting them. So the question is whether the Spanish laws apply to Google. The analysis here was complicated by the fact that Google’s search engine is operated solely by the California-based Google Inc. The only presence Google has in Spain is a subsidiary, Google Spain SL, whose activities are limited to promoting and selling advertising space on the search engine.
The relevant provision is Article 4 DPD, specifically Article 4(1)(a), which provides a Member State is to regulate when:
The processing [of data] is carried out in the context of the activities of an establishment of the controller of the data on the territory of the Member State…
and Article 4(1)(c), where a Member State should regulate when:
the controller is not established on Community territory and … makes use of equipment, automated or otherwise, situated on the territory of the said Member State…
The referring court asked whether Article 4 is satisfied by the fact that Google uses its crawlers to gather information from websites hosted in Spain, or by the fact that it uses a Spanish country code TLD (Google.es) and directs Spanish users to searches and results relevant to them in terms of language or location. It also asked whether the ‘use of equipment’ triggering Article 4(1)(c) can be inferred from the fact that Google stores the indexed information on servers whose location is undisclosed. Finally, the referring court asked whether the Charter could extend jurisdiction beyond the circumstances expressly envisaged in the DPD.
Is Google a Data Controller?
The DPD imposes obligations on ‘data controllers’ who process personal data, so the question is whether Google’s activities fall within the definitions found in Articles 2(b) and 2(d) DPD. Article 2(b) defines processing as:
Any operation or set of operations which is performed on personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure, or destruction.
A ‘data controller’, under Article 2(d), is the person or body which ‘determines the means and purposes of the processing of personal data’.
Assuming that Google is a data controller processing personal data, the referring court asked whether the AEPD could require Google to remove the results without also requiring the newspaper to do so.
Right to Be Forgotten
The final set of questions relates to the scope of the rights of data subjects. The DPD provides individuals with rights to obtain the “rectification, erasure or blocking of personal data” which are “incomplete or inaccurate”. The questions referred ask specifically whether the rights to the erasure or blocking of data (Art. 12(b) DPD) and to object to the processing of data (Art. 14(a) DPD) allow the data subject to prevent search engines from indexing personal information, not because it is illegal but purely because it is damaging (and outdated).
Opinion of the Advocate General
The Advocate General suggested that where a parent search engine based outside the EU is a data controller, its EU subsidiary must also be considered a data controller, on the basis that the EU subsidiary acts as a ‘bridge’ for the search engine function to that Member State’s advertising market. The Advocate General stated that ‘an economic operator must be considered a single unit…[and] not be dissected on the basis of its individual activities related to processing of personal data’. This means that in principle the Spanish laws can apply, which has potentially significant repercussions for internet intermediaries with a similar business model. It raises some interesting questions about the country of origin principle in the DPD and the extra-territorial reach of the DPD. Certainly, it is inconsistent with the approach taken in some national courts relating to Facebook (though the national courts have not themselves taken a consistent line). This issue would be dealt with by Article 3 of the proposed Data Protection Regulation.
Nonetheless, on the facts, the Advocate General concluded that while Google was processing data, it was not a data controller, or was one only partially. Before coming to this conclusion, the Advocate General made some remarks about the potential scope of application of the DPD, suggesting that it was too broad. He suggested that, in applying the DPD, the ECJ should
“apply a rule of reason … the principle of proportionality, in interpreting the scope of the Directive in order to avoid unreasonable and excessive legal consequences”.
This novelty in the interpretation of the DPD seems to run against the views of those arguing for a review of the data protection regime on the basis that it is not stringent enough. According to the Advocate General, a data controller must be aware of a defined category of data, and must process that data with some degree of intent, in order to be a controller. The Advocate General argued that Google is not ‘aware’ of the actual personal data on those third-party websites, nor does it intend to process that personal data in any ‘semantically relevant way’. The obligations in the DPD therefore do not apply directly to it, but would bite only if search engines processed personal data in a manner inconsistent with the instructions or requests of publishers. Google is, however, processing data as regards the index of the search engine. The Advocate General concluded that, in relation to this data, Google had satisfied its duties under the DPD.
The Advocate General also considered the general balancing of rights – in particular the balance between the right to privacy/right to reputation and freedom of expression – and argued against a right to be forgotten. The Advocate General stated that a right to be forgotten ‘would entail sacrificing pivotal rights such as freedom of expression and information’. According to the Advocate General, the right to erasure and blocking under Article 12(b) applies to incomplete or inaccurate data, and the right to object under Article 14(a) arises only where there are compelling legitimate grounds. A data subject’s wish to restrict the dissemination of true and accurate public information on the grounds that it is harmful or contrary to his interests does not satisfy this requirement. There is no general right to be forgotten. Again, this issue would be dealt with by the proposed Data Protection Regulation, though it should be noted that the proposed right was controversial and has been watered down from the original proposal. Nonetheless, the national courts in some Member States (such as France) seem to recognise some form of right to be forgotten.
The Opinion of the Advocate General is very much policy-driven, and it is hard to square with the wording and stated purpose of the DPD. His approach raises interesting questions about all three issues, as well as about the balance between freedom of expression and the right to a private life/reputation, which will have more general repercussions:
- Which laws apply to transnational digital service providers? Will EU rules become global standards?
- What is the balance between freedom of expression and other rights and interests – and whose speech rights are we concerned with?
- Who is responsible for moderating content: publisher or re-publisher/location tool?
Peter Fleischer, Google’s Privacy Counsel, has some personal comments on the right to be forgotten, while over at SafeGov, Richard Falkenrath also has some further thoughts on the right to be forgotten.
Professor Lorna Woods is the Director of the LLM in Internet Law at the University of Essex.
This post originally appeared on the Law, Justice and Journalism website and is reproduced with permission and thanks.