The much-anticipated decision in NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB) was handed down on 13 April 2018.  The joint judgment in two separate claims against Google is the first time the English courts have had to rule on the application of the ‘right to be forgotten’ principle following the decision in Google Spain SL, Google Inc. v Agencia Espanola de Proteccion de Datos (AEPD) and Mario Costeja Gonzalez (Case C-131/12).

This article explores the decision and its ramifications for future delisting requests to Google.  The judgment necessarily had to deal with a number of novel issues and discuss the legal approach to such claims.  Much of this is technical.  Lay readers may wish to skip the section entitled “The approach” and re-join the discussion at the sections entitled “Findings”, “Commentary” and “Practical issues for those wanting to make delisting requests following this judgment”.

The facts of the claims

The two trials took place sequentially in February and March of this year, both before Mr Justice Warby and with the claimants sharing the same legal team.

Both Claimants were granted anonymity (as ‘NT1’ and ‘NT2’ respectively) so as to avoid undermining the purpose of their claims.  Third parties and businesses were also anonymised in the public judgment to try to prevent jigsaw identification.  Similarly, many of the dates given in the judgment are vague.  Even with these safeguards, the 230-paragraph public judgment is supplemented by a detailed private judgment in each case.

NT1

NT1 is a businessman.  He had been involved in a controversial property business in the late 1980s and early 1990s, when he was in his thirties.  In the late 1990s he was convicted of conspiracy falsely to account, having transferred monies to offshore companies to cheat the revenue.

NT1 received a four-year custodial sentence, and was released in the early 2000s.  When the sentence was passed it was of such a length that it would never have been deemed “spent” for the purposes of the Rehabilitation of Offenders Act 1974 (‘ROA’).  However, it became spent following a change in the law in March 2014 (which had retrospective effect).  Notably, had the sentence been one day longer, the position would not have changed.
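
The arithmetic behind that observation can be illustrated briefly.  The sketch below is a minimal illustration only, assuming the adult custodial sentence bands that applied from March 2014 (a sentence of more than four years never becomes spent, while shorter sentences become spent a fixed ‘buffer period’ after the sentence is completed); the function name, parameters and dates are purely illustrative, and the ROA’s many exceptions and excluded sentence types are ignored.

from datetime import date, timedelta
from typing import Optional

def spent_date(sentence_months: int, sentence_end: date) -> Optional[date]:
    """Simplified sketch of the adult custodial sentence bands in force from
    March 2014.  Returns an approximate date on which the conviction becomes
    spent, or None if it never does.  Exceptions and excluded sentence types
    are ignored."""
    if sentence_months > 48:       # more than four years: never spent
        return None
    if sentence_months > 30:       # over 30 months, up to four years
        buffer_years = 7
    elif sentence_months > 6:      # over six months, up to 30 months
        buffer_years = 4
    else:                          # six months or less
        buffer_years = 2
    return sentence_end + timedelta(days=365 * buffer_years)

# NT1's four-year sentence sits exactly on the boundary: it can become spent,
# but a sentence even one day longer never would.  The dates are hypothetical.
print(spent_date(48, date(2002, 6, 1)))   # spent roughly seven years later
print(spent_date(49, date(2002, 6, 1)))   # None: never spent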

NT1 sought the delisting of three URLs from the search results returned upon entry of his name into Google’s search engine.  Two of the URLs related to contemporaneous media reports of NT1’s conviction.  The other was a book extract which referred to the conviction.

NT2

NT2 is also a businessman.  Around the turn of the century he was involved in a business which had attracted public controversy for environmental reasons.  The business was targeted by individuals seeking to disrupt it.  NT2 took steps to identify those individuals; this included sanctioning the use of unlawful phone and computer hacking. NT2 was convicted for his part in this and received a six-month custodial sentence, of which he served six weeks.  His conviction had also become spent in March 2014 (but it would have become spent in July 2014, even if the law had not changed).  NT2 complained of about 11 URLs, some being contemporaneous reports of his prosecution and conviction, and some more recent.

In October 2015 both claimants sued Google for breach of the Data Protection Act 1998 (‘DPA’) and for misuse of private information, owing to Google’s refusal to ‘delist’ the URLs they had complained of.  The Information Commissioner (who has a statutory duty to review Google’s decisions following a request by a data subject) was granted permission to intervene in the proceedings.

The judge summarised the main issues in the case as follows:-

[9](1) whether the claimant is entitled to have the links in question excluded from Google Search results either (a) because one or more of them contain personal data relating to him which are inaccurate, or (b) because for that and/or other reasons the continued listing of those links by Google involves an unjustified interference with the claimant’s data protection and/or privacy rights; and (2) if so, whether the claimant is also entitled to compensation for continued listing between the time of the delisting request and judgment. Put another way, the first question is whether the record needs correcting; the second question is whether the data protection or privacy rights of these claimants extend to having shameful episodes in their personal history eliminated from Google Search; thirdly, there is the question of whether damages should be paid.

The approach

Abuse of process

At the outset, Google sought to argue that the claims were an abuse of the court’s process as they were attempts to circumvent the law and procedure of defamation.  This was somewhat surprising given the number of previous judgments that have recognised the efficacy of choosing to sue under the DPA instead of (or as well as) libel.  As Warby J said: “[61]As a general rule, it is legitimate for a claimant to rely on any cause of action that arises or may arise from a given set of facts. This is not ordinarily considered to be an abuse just because one or more other causes of action might arise or be pursued instead of, or in addition to, the claim that is relied on.”

The exemption

Again rather optimistically, Google attempted a second knock-out blow by arguing that it could avail itself of the “journalistic exemption” under section 32 of the DPA because “the processing is undertaken with a view to the publication by any person of any journalistic, literary or artistic material”.  Noting that Google’s search function was indiscriminate, Warby J dealt with this robustly: “[98]the concept is not so elastic that it can be stretched to embrace every activity that has to do with conveying information or opinions. To label all such activity as “journalism” would be to elide the concept of journalism with that of communication...”  Moreover, the other constituent elements of the defence were not made out: “[102]There is no evidence that anyone at Google ever gave consideration to the public interest in continued publication of the URLs complained of, at any time before NT1 complained…it would still have to go on to show that it held a belief, that was reasonable, that it would be incompatible with the special purposes for its continued processing of the data to be carried out in compliance with the DPA. There is no evidence of that at all. Google’s “right to be forgotten” assessment process is not designed or adapted for that purpose. That may be because it has not considered until recently that the journalism exemption might be available to it. It certainly did not suggest this in the Google Spain case…”

Assessing inaccuracy (the Fourth Data Protection Principle)

Warby J held that the correct approach to adopt when determining whether data was inaccurate or misleading was similar to that applied in the law of defamation.  In other words, the alleged inaccuracy had to be considered in context and not on an artificial micro-level.  Moreover, a single meaning was applied to the alleged inaccuracy.  However, unlike defamation, the burden of proof was on the claimant to prove that the ‘meaning’ was inaccurate.  This is an important and perhaps overlooked issue for anyone complaining about inaccurate data processing.  It is simply not sufficient to make bald assertions of inaccuracy.

With NT2, a complaint that a URL exaggerated the extent of the claimant’s criminality was accepted (the article wrongly imputed that the motive for the crime was financial gain).  With NT1, the judge was unimpressed with the claimant’s attempt to particularise and evidence inaccuracy.  Indeed, he was generally unimpressed with the claimant’s evidence and reliability, to the extent that he felt it appropriate to make positive findings of wrongdoing beyond NT1’s conviction.

First Data Protection Principle (fairness/lawfulness)

It was suggested by the Information Commissioner’s Office (‘ICO’) that it might be necessary for the Court to disapply the provision of the First Data Protection Principle which requires a condition in Schedule 3 of the DPA to be met before there can be any lawful processing of sensitive personal data (which includes allegations of criminality and reporting of criminal proceedings).  This was because, if none of the conditions applied, all such data processing (i.e. returning search results about any criminal proceedings/allegations) would automatically be unlawful.  However, such an approach would potentially have been problematic as, unlike in Vidal-Hall v Google [2015] EWCA Civ 311 (where the Court of Appeal disapplied section 13(2) of the DPA), there was no real discrepancy between the EU parent legislation (Directive 95/46/EC/the Data Protection Directive) and the domestic legislation.  In the event, the judge found that condition 5 of Schedule 3 was met: the information had been made public as a result of steps deliberately taken by the data subject.  Warby J held that a consequence of the open justice system is that in committing a criminal offence (even one committed in private), one is deliberately taking steps to make information about that offence public.

Additionally, in order for any personal data to be processed, one of the conditions in Schedule 2 must be met.  The most widely-drawn condition in Schedule 2 is condition 6(1) which, in essence, allows data processing where it is necessary in pursuance of legitimate interests pursued by the data controller or by third parties to whom the data is disclosed.  Processing for such ‘legitimate interests’ is prevented only where it is unwarranted by virtue of prejudice caused to the rights, freedoms, or legitimate interests of the data subject.  The judge was in no doubt that Google had a legitimate interest in processing personal data in pursuit of its business as a search engine operator, and that third parties (i.e. the general public) have a legitimate interest in being able to receive information from Google (as well as other search engines).  The question was whether, in these cases, such processing was “necessary” in pursuance of those legitimate interests, or “unwarranted” considering any prejudice caused to NT1 and NT2.  The test here was essentially the same as the Article 8 v Article 10 [right to a private life v right to freedom of expression] “ultimate balancing test” applied in privacy cases, and part of the balancing process mandated by Google Spain.  The judge noted that the starting point was that neither right had precedence and that any inference to the contrary drawn from Google Spain was misunderstood, such comments simply being descriptive (of the likely scenario in the majority of cases involving historic search results), rather than legally prescriptive.

Google Spain and the Second, Third and Fifth Data Protection Principles

The Court held that the remaining data protection principles [not to process data other than for specified and lawful purposes, to ensure personal data is adequate, relevant and not excessive, and not to retain data for longer than is necessary] were all conveniently subsumed into the Google Spain/Article 8 balancing exercise.

The Working Party’s criteria

The judge adopted the EU Working Party’s 13-point checklist for factors that should be considered by data protection authorities (such as the ICO) when deciding whether search results should be delisted.  The headline questions for each factor are reproduced below:-

  1. Does the search result relate to a natural person – i.e. an individual? And does the search result come up against a search on the data subject’s name?
  2. Does the data subject play a role in public life? Is the data subject a public figure?
  3. Is the data subject a minor?
  4. Is the data accurate?
  5. Is the data relevant and not excessive? Does the data relate to the working life of the data subject? Does the search result link to information which allegedly constitutes hate speech/slander/libel or similar offences in the area of expression against the complainant? Is it clear that the data reflect an individual’s personal opinion or does it appear to be verified fact? The overall purpose of these criteria is to assess whether the information contained in a search result is relevant or not according to the interest of the general public in having access to the information. Relevance is also closely related to the data’s age.
  6. Is the information sensitive within the meaning of Article 8 of the Directive 95/46/EC [Section 2 of the DPA]?
  7. Is the data up to date? Is the data being made available for longer than is necessary for the purpose of the processing?
  8. Is the data processing causing prejudice to the data subject? Does the data have a disproportionately negative privacy impact on the data subject?
  9. Does the search result link to information that puts the data subject at risk?
  10. In what context was the information published? a. Was the content voluntarily made public by the data subject? b. Was the content intended to be made public? Could the data subject have reasonably known that the content would be made public?
  11. Was the original content published in the context of journalistic purposes?
  12. Does the publisher of the data have a legal power – or a legal obligation – to make the personal data publicly available?
  13. Does the data relate to a criminal offence?

Full details of the guidelines can be found here.
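
For anyone wanting to organise a self-assessment against these factors before making (or reviewing) a delisting request, the short sketch below shows one way of recording an answer against each headline question.  It is an organisational aid only: the factor wording is abbreviated from the list above, the class and method names are purely illustrative, and nothing in the guidelines or the judgment involves mechanical scoring; the factors are weighed qualitatively.

from dataclasses import dataclass, field
from typing import Dict, List

# Abbreviated versions of the thirteen headline questions, held as plain text
# so that a short narrative answer can be recorded against each one.
WORKING_PARTY_FACTORS: List[str] = [
    "1. Natural person, and result returned against a name search?",
    "2. Role in public life / public figure?",
    "3. Data subject a minor?",
    "4. Data accurate?",
    "5. Data relevant and not excessive?",
    "6. Sensitive data (Article 8 of the Directive)?",
    "7. Data up to date / kept longer than necessary?",
    "8. Prejudice or disproportionate privacy impact?",
    "9. Result puts the data subject at risk?",
    "10. Context of the original publication?",
    "11. Published for journalistic purposes?",
    "12. Publisher under a legal power or obligation to publish?",
    "13. Data relates to a criminal offence?",
]

@dataclass
class DelistingSelfAssessment:
    """Illustrative record of answers against each factor for one URL."""
    url: str
    answers: Dict[str, str] = field(default_factory=dict)

    def record(self, factor: str, answer: str) -> None:
        # Store a short narrative answer against one factor.
        self.answers[factor] = answer

    def unanswered(self) -> List[str]:
        # Factors not yet addressed -- worth reviewing before any submission.
        return [f for f in WORKING_PARTY_FACTORS if f not in self.answers]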

The judge considered each question for both claimants, some of which obviously overlapped with issues to be considered under the First and Fourth Data Protection Principles.  In both instances, he found factor 13 to be the most important issue, quoting the commentary:-

[161] “As a rule, [data protection authorities] are more likely to consider the de-listing of search results relating to relatively minor offences that happened a long time ago, whilst being less likely to consider the de-listing of results relating to more serious ones that happened more recently. However, these issues call for careful consideration and will be handled on a case-by-case basis.”

Warby J carried out a detailed analysis of the nature of the offending of NT1 and NT2, the surrounding circumstances, events since the offences took place, and the claimants’ attitude to their convictions.  This was in turn linked to the question of whether the claimant was ‘rehabilitated’.

Rehabilitation of Offenders Act 1974 (‘ROA’)

Those with spent convictions hoping for clarity (or even guidelines) on when they can expect Google to delist results will be disappointed by the judgment.  No tangible principle of when a URL relating to a criminal conviction should be delisted has been established.  The commonly argued proposition that results should be delisted at the point a conviction becomes spent was rejected as being a “blunt instrument” that was incompatible with human rights [freedom of expression] jurisprudence.  If a conviction was spent, this was held to be a “weighty factor” in favour of delisting, but “not determinative”.  A more nuanced assessment was required in each case than simply looking at the length of a sentence and the prescribed time limits under the ROA (which can be found here).

The judge sought to summarise the rationale and approach:-

[166] “Without attempting to be exhaustive I have arrived, for the purposes of these cases, at the following reconciliation:

(1) The right to rehabilitation is an aspect of the law of personal privacy. The rights and interests protected include the right to reputation, and the right to respect for family life and private life, including unhindered social interaction with others. Upholding the right also tends to support a public or societal interest in the rehabilitation of offenders. But the right is not unqualified. It will inevitably come into conflict with other rights, most notably the rights of others to freedom of information and freedom of expression. It is not just legitimate but clearly helpful for Parliament to lay down rules which clearly prescribe the point at which a given conviction is to be treated as spent. But such rules, depending simply on the offender’s age and the nature and length of the sentence, can only afford a blunt instrument. Parliament has legislated for exceptions, but these cannot be treated as necessarily exhaustive of the circumstances in which information about a spent conviction may be disclosed. More subtle tools are needed, if the court is to comply with its duty under the HRA to interpret and apply the law compatibly with the Convention. Section 4 of the 1974 Act must be read down accordingly as expressing a legal policy or principle.

(2) The starting point, in respect of information disclosed in legal proceedings held in public, is that a person will not enjoy a reasonable expectation of privacy (Khuja, [61] above). But there may come a time when they do. As a general rule (or “rule of thumb”, to adopt the language of the Working Party), the point in time at which Parliament has determined that a conviction should become spent may be regarded as the point when the convict’s Article 8 rights are engaged by any use or disclosure of information about the crime, conviction, or sentence (see T, [49(2)] above). But this does not mean that in 1974 Parliament enacted a right to confidentiality or privacy from that point on (Pearson, L v The Law Society, [47-48] above). Still less does it follow that the convict’s Article 8 rights are of preponderant weight, when placed in the balance. As a matter of principle, the fact that the conviction is spent will normally be a weighty factor against the further use or disclosure of information about those matters, in ways other than those specifically envisaged by Parliament. The starting point, after all, is the general policy or principle in favour of that information being “forgotten”, as expressed in s 4 of the 1974 Act. That policy has if anything become weightier over time. It is likely that in many cases the particular circumstances of the individual offender will support the application of that general principle to his or her case. But the specific rights asserted by the individual concerned will still need to be evaluated, and weighed against any competing free speech or freedom of information considerations, or other relevant factors, that may arise in the particular case.

(3) Part of this balancing exercise will involve an assessment of the nature and extent of any actual or prospective harm. If the use or disclosure causes, or is likely to cause, serious or substantial interference with private or family life that will tend to add weight to the case for applying the general rule. But where the claim relies or depends to a significant extent upon harm to reputation, the Court is in my judgment bound to have regard to s 8 of the 1974 Act. It is possible to identify a public policy that underlies that section, and which qualifies the public policy that underpins s 4. It is that offenders whose convictions are spent should not be able to obtain remedies for injury to their reputation (or consequent injury to feelings) resulting from the publication in good faith of accurate information about the spent conviction, or the related offending, prosecution or sentence. It is not a satisfactory answer to this point to say that the causes of action relied on are not libel or slander, but data protection and/or misuse of private information. That is too narrow and technical an approach, which ignores the fact that neither cause of action was known to Parliament when it legislated. The fact that, as I accept, reputational harm can support a claim under those causes of action tends, in fact, to undermine the force of that argument. I therefore do not accept that the policy that underlies s 8 falls to be disregarded merely because the claim is not framed in defamation. Again, there can be no bright line, because Convention jurisprudence shows that reputational harm can be of such a kind or severity as to engage Article 8 (Yeo, [140], above); but subject to considerations of that kind I would consider that this statutory policy or principle falls to be applied by the Court.

(4) Another aspect of the proportionality assessment will be the nature and quality of the societal benefits to be gained in the individual case by the use or disclosure in question. Freedom of expression has an inherent value, but it also has instrumental benefits which may be weak or strong according to the facts of the case. The fact that the information is, by its very nature, old will play a part at this stage also.

(5) Most, if not all, of these points about spent convictions are likely to be relevant in more than one context. Where a spent conviction is the subject of a de-listing claim, the Court will need to weave its evaluation according to domestic principles into the overall Google Spain balancing exercise. The Working Party criteria are a key tool for this purpose. One matter that Ms Proops rightly identifies as needing due weight at this stage is the fact that de-indexation does not per se remove the source websites containing the relevant data from the online environment. It merely makes that data harder for the public to find.”

Findings

Whilst the legal framework of the decision-making is inescapably convoluted, the factual findings can be summarised relatively quickly.

NT1 had been convicted of a serious dishonesty offence, and received a relatively lengthy sentence of four years.  When it was imposed he could have had no expectation that the conviction would ever become spent. The judge also noted from the Crown Court judge’s sentencing remarks that an even stiffer sentence would have been imposed had it not been for one specific item of personal mitigation. NT1 continued to refuse to accept, or fully accept, culpability for his crime.  His evidence was poor, and the judge found him to be evasive, arrogant and dishonest.  This led to the judge making a number of adverse factual findings against him, including that he started a second business using the proceeds of the first crime. The judge analysed his conduct since the conviction and noted he had been prosecuted for a second matter, with the charges left on file (i.e. he was neither convicted nor acquitted), and had been involved in a number of other civil claims and regulatory matters, including making a dishonest credit licence application to the OFT. Of particular concern was the fact that NT1 had published information online (or caused it to be published) that actively sought to portray him as a man of upstanding character and integrity.

NT2’s position was markedly different.  His crime was much less serious and had not been committed for financial gain.  NT2 had pleaded guilty at an early opportunity and expressed considerable remorse for a one-off mistake.  He had received only a six-month sentence, which would always have become spent in the fullness of time.  NT2’s evidence came across well and the judge accepted that it was a one-off incident and that he was rehabilitated.

Warby J was critical of both claimants for their failure to properly particularise the damage and distress they had suffered as a result of the data processing.  He raised causation issues and noted the lack of evidence in relation to the number of times searches had been carried out against their names, and the particular pages accessed.  In both cases, there was a heavy focus on business reputation.  He was just about satisfied in NT2’s case that substantial damage and distress had been caused by the search results, attaching particular importance to the fact that NT2 had children of school age, and that they could also be affected by the data processing.

Considering all the factors together, the judge agreed that the continued processing of the URLs for NT1 was justified and in the public interest.  In relation to NT2, the judge sided with the claimant, finding that the search results were no longer relevant and did not justify an interference with the claimant’s data protection rights.

Misuse of private information

The outcome of the balancing exercise similarly meant that NT1’s claim for the misuse of private information failed and NT2’s succeeded.  The starting point was that neither claimant had had a reasonable expectation of privacy in respect of their court proceedings and convictions; however, with the passing of time, NT2 had developed such an expectation whereas NT1 had not.  Google’s interference with NT2’s privacy rights could not be justified in all the circumstances.

Orders made

Warby J made a “delisting order” in respect of all the URLs NT2 had complained about.  No order was made in respect of any of the URLs relating to NT1.

The judge declined to make an order for damages in favour of NT2 finding that Google had availed itself of the defence at section 13(3) of the DPA: “In proceedings brought against a person by virtue of this section it is a defence to prove that he had taken such care as in all the circumstances was reasonably required to comply with the requirement concerned.”  For the same reason, no order was made for damages in the claim for the misuse of private information.

NT1 was granted permission to appeal.  Google did not seek permission to appeal the decision in NT2 and has indicated it will respect the Court’s decision.

Commentary

Whilst this case has attracted press headlines because an English Court has told Google what to do (almost all the mainstream press headlines focussed on NT2’s success), it would probably be wrong to describe it as a landmark decision.  Broadly speaking, the judge simply applied existing data protection law in line with what one would expect following the decision in Google Spain.

From a claimant (and claimant-lawyer) point of view, the judgment is distinctly conservative. A decision that URLs about criminal convictions should be delisted at the point the conviction becomes spent would have opened the floodgates to claims (backed by conditional fee agreements) against Google (media law practitioners will know that Google regularly refuses to delist articles that refer to spent convictions, particularly where the convictions are for serious dishonesty or sexual offences).  The judge was at pains to set out the caselaw in this area and to explain why such a simplistic approach would be wrong.  The practical effect of this is of course that the purpose of the ROA will continue to be seriously undermined as employers and others undertake the most casual of due diligence on prospective employees, using the world’s favourite search engine. ‘Rehabilitated offenders’ will continue to suffer. The flip side of the coin is that the legal definition of rehabilitated offenders includes individuals who have committed serious offences that many members of the public feel we should continue to be warned about, regardless of whether a conviction is spent or not.  This is an age-old debate which reflects societal anxiety (misplaced or otherwise) that certain types of offender (such as confidence tricksters and child sex offenders) will always be prone to re-offending.  Of course, there are various legitimate disclosure regimes in place to protect the public, including the enhanced DBS check and the Child Sex Offender Disclosure Scheme.

The outcome of NT2’s claim was wholly unsurprising.  Without having sight of the submissions to Google it is impossible to know what information was provided to it and at what point.  However, NT2’s case, as set out in the judgment, is a reasonably compelling one for delisting. If Google had been on notice of the core facts, it seems it was entirely wrong for it to refuse to delist the URLs at the outset, and NT2 should not have had to sue Google and wait three years for a court to confirm this (and of course many individuals will not be in a position to sue).  It follows from this that the judge’s decision to find that Google had a defence under section 13(3) is surprising.  All that is said about this is that Google is an enterprise that takes its obligations under Google Spain very seriously.  The fact that Google has a system of sorts to process delisting requests, and complies with some of them, is surely not a blanket defence for failing to do the job in any particular case?  Yes, Google has received a lot of delisting requests (670,000 since 2014), and it must process these as well as the millions of take-down requests it receives globally as a result of copyright complaints and other causes of action.  However, Google has for many years had a turnover greater than the GDP of a majority of the world’s nation states.  It has money enough to pay people to review these requests properly.  A more appropriate basis for not awarding damages – taking into account the judge’s criticism of the way it was pleaded – might have been that the damage/distress suffered was de minimis, although that would surely call into question the validity of the entire claim.  The apparent transposing of the section 13(3) defence to the tort of misuse of private information without further explanation is confusing at best.

Of more general note is that Google has been found liable for the misuse of private information (seemingly for the first time).  There is no discussion of the actus reus required for this.  For instance, there is no suggestion that publication is required, and thus the principle in Metropolitan International Schools Ltd v (1) Designtechnica Corporation (2) Google UK Ltd & (3) Google Inc [2009] EWHC 1765 (QB) that Google is a “mere facilitator” for the purpose of defamation law is seemingly irrelevant.  Presumably, the act of making search results available – when on notice (again there is no discussion about this) – can prima facie be a ‘misuse’ of private information.  This finding is significant, not least because it means that privacy claims against Google are exempt from statutory changes introduced in 2014 that prevent the recoverability of conditional fee agreement success fees and after the event insurance premiums (the misuse of private information is an exempted cause of action; breach of the Data Protection Act is not).  This is great news from an access to justice perspective, as it may allow individuals who would not be in a position to fund claims privately to bring claims.

Practical issues for those wanting to make delisting requests following this judgment

Individuals who are thinking of submitting a delisting complaint to Google should note the following from this case:-

  • The fact that a conviction is spent is weighty, but not determinative. If you assert that it is determinative, Google can rely on this case to say it is not.  Incidentally, media lawyers will know that the judgment reflects the ICO’s own position on this issue.
  • Attempting to go behind a criminal conviction (or the finding of any recognised tribunal) will normally be fatal. The case for delisting will be stronger if you can show that you are genuinely remorseful for your conduct and have felt this way for some time.
  • Your conduct following the ‘incident’ (whether it is a criminal conviction or otherwise) is extremely relevant. This can include any kind of conduct.  The case for delisting will be stronger when you can show you have turned over a new leaf.  This means more than simply being able to say you were not caught again.  If you have turned your life around, explain how you have done this.
  • What you are doing in your life now will be relevant. If you are an accountant who was struck off for breaching your regulator’s code of conduct, the case for delisting will be stronger if you have left the profession and become a yoga teacher.
  • There is a higher threshold if you are a public figure. The Court has adopted the very broad definition of “public figure” put forward by the EU Working Party: it may be said to include anyone with a degree of media exposure, members of the regulated professions, and business figures.  There are millions of ‘regulated professionals’ in the UK (over 250,000 registered doctors alone), whilst ‘business figures’ could encompass an even wider group of people.
  • The prospects of delisting of URLs relating to convictions are lower where the offence involved dishonesty and/or personal financial gain.
  • The prospects of delisting an unspent conviction following this decision are likely to be remote.
  • If you have put false information into the public domain or attempted to give a false impression as to your history, this may count against you. Using ‘search engine specialists’ to attempt to suppress articles in the mainstream media will often be ineffective.  Many individuals find their search results littered with poor-quality, repetitive links which, in and of themselves, may cause third parties confusion and suspicion.  The suppression of URLs from major websites will often require work to be done on an ongoing basis (thereby leading to great expense), if it is successful at all (such techniques have a higher chance of success when the offending URLs appear on obscure websites).
  • An interesting consequence of the judge’s somewhat creative use of DPA Schedule 3 Condition 5 is that where criminal proceedings resulted in acquittal or discontinuance it may be arguable that none of the conditions in Schedule 3 are met (and thus the First Data Protection Principle has been breached/the data processing is automatically unlawful). This is because if the data subject has not committed a criminal offence it is harder, if not impossible, to argue that they have deliberately taken steps to put information in the public domain (as found by Warby J in respect of NT1 and NT2).  The caveat to this is that an acquittal or discontinuance is not a finding of innocence, and therefore, in certain cases, it may be open to Google to argue that an offence was committed (as it was effectively able to do with NT1 in relation to the second set of criminal proceedings).  It may also argue that the data subject took other steps to put information in the public domain.
  • Any complaint of inaccuracy should be properly particularised with reference to evidence. A bald assertion is insufficient.  Minor inaccuracies in respect of specific items of information may be disregarded if the balance of the information is accurate when viewed in context.
  • Absolute candour is required. Bad points should be addressed.  It may be appropriate to provide evidence of rehabilitation.
  • It is necessary to properly set out the damage and distress you have suffered. Where possible you should give specific examples of the search results affecting your personal life.  For instance, partners or colleagues bringing them to your attention, the loss of a job opportunity, or children raising them.
  • The case for delisting will normally be stronger where there is evidence that damage/distress may also be caused to minors (e.g. school children being bullied about URLs referring to their parents).
  • There is no basis for complaining about data that refers to businesses/companies. Data protection law only concerns living individuals.
  • You should not place general reliance on NT1/NT2 v Google unless you are referring to a specific part of the case that is directly applicable. In many instances, the case will be just as helpful to Google as it is to claimants.
  • Never sue Google LLC without taking specialist legal advice and understanding the potentially huge adverse cost consequences of losing or abandoning your claim. A request to the ICO for a review of a decision by Google is an inexpensive and risk-free alternative (although that should also be submitted via lawyers, where means permit).
  • Never sue Google UK Limited or any other parent or subsidiary. They do not operate the search engine or process your data, and are therefore the wrong defendants.
  • A delisting request should normally be detailed and address the relevant legal criteria. The law in this area is complex and technical.  If you are in a position to instruct specialist solicitors to make a request to Google on your behalf then this is normally recommended so that the focus of the submissions is on the correct issues and to give you the best prospects of the request being successful.

Finally, as the judge eloquently explained, we are in the twilight of the Data Protection Act 1998 and the dawn of the General Data Protection Regulation (‘GDPR’) is nearly upon us.  The GDPR will come into force on 25 May 2018.  From this date, delisting requests will be founded on the GDPR.  Whilst NT1/NT2 will no doubt be persuasive, the underlying law is changing.  The right to be forgotten will be codified in statute as the ‘right to erasure’.  The effect of this will be covered in a future blog.  In short, individuals’ data protection rights will be strengthened (for instance there will be no need to show that any harm has been suffered).  Whilst changes are unlikely to be seismic insofar as delisting requests to Google are concerned, those thinking of making (or renewing) such a request may now benefit from waiting until the new law is in force.

This post originally appeared on the Brett Wilson Media Law Blog and is reproduced with permission and thanks