Google’s autocomplete function has become yet another of the company’s services which most internet users rely on unthinkingly. It offers suggested terms based on the characters you have already typed, drawing on other users’ search activities and on the content of existing web pages. The user-driven nature of internet search means that the suggestions it makes for particular individuals’ names are not necessarily those that they would welcome, or ones which are genuinely useful.
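The basic mechanics can be sketched as a popularity-ranked prefix lookup. The snippet below is a minimal illustration only, not Google’s actual algorithm; the query log and its counts are invented for the example.

```python
from collections import Counter

# Hypothetical log of past user queries (invented counts, for illustration only).
query_log = Counter({
    "david cameron twitter": 900,
    "david cameron speech": 450,
    "nick clegg looking sad": 700,
    "nick clegg twitter": 300,
})

def autocomplete(prefix, n=3):
    """Return the n most popular logged queries beginning with the typed prefix."""
    matches = [(q, c) for q, c in query_log.items() if q.startswith(prefix)]
    matches.sort(key=lambda qc: qc[1], reverse=True)  # most searched first
    return [q for q, _ in matches[:n]]

print(autocomplete("nick clegg"))
```

On this toy model, whatever other users have searched most often for a given prefix is what the next user is shown, regardless of whether the subject of the query would welcome it.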
By way of an innocent illustration, the top suggestion when I search for the Prime Minister is “david cameron twitter”, whereas the top suggestion for the Deputy Prime Minister is “nick clegg looking sad” (leading to this tumblr which compiles mournful pictures of him). This particular suggestion, while it in fact reflects only the popularity of a photo-based satire, does create the sense of some comment about the respective positions of the two men in the eyes of the electorate.
Sometimes, however, the autocomplete suggestions for particular individuals’ names are more serious, defamatory ones, reflecting gossip or speculation which internet users have tried to follow up through online research. Bettina Wulff, the wife of former German President Christian Wulff, last week brought a claim in a Hamburg district court in an effort to remove a number of terms suggested by the autocomplete function which connect her to damaging speculation about her past. Google has refused to remove them, asserting via a spokesperson that they are “the algorithmic result of several objective factors, including the popularity of search terms.”
There has not yet been any autocomplete-based defamation litigation in England and Wales, but it is interesting to consider what the position might be in light of the authorities as they currently stand, and of a Milan court’s 2011 decision in an Italian defamation case. The domestic decisions contain indications that Google could be held liable at common law, primarily as publisher of the defamatory statements, and, in any event, where notification has been made, as a party which has acquiesced to their continuing existence in its aftermath.
There is, of course, a defence for publishers in the Defamation Act 1996, and the Italian court considered the operation of the “hosting defence” provided by Directive 2000/31/EC of the European Parliament and Council, which has been used in recent litigation concerning Google’s Blogger platform (Tamiz v Google Inc [2012] EWHC 449 (QB) and Davison v Habeeb [2011] EWHC 3031 (QB)).
While the suggestions are collated by an algorithm which assesses other users’ search activities, and the contents of web pages indexed by Google, it is Google itself which publishes them, the details of the original searches they draw on rightly remaining confidential to each individual user. Only Google could provide exact details of the number of internet users these publications are made to, but they are certainly substantial. There is also a potential “vicious cycle” effect at work, identified by an SEO expert in this interesting piece on autocomplete’s impact on US Congressmen:
“If Google autocomplete suggests a certain search term… people are more likely to search it; the more people search a certain term, the more likely Google autocomplete is to suggest it.”
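The feedback loop the SEO expert describes can be illustrated with a toy simulation. The figures and the growth rate below are purely hypothetical: the point is only that once a term is suggested, some users who would never have typed it search for it anyway, which raises its popularity and further entrenches it as a suggestion.

```python
def simulate_feedback(initial_searches, rounds=5, boost=0.5):
    """Toy model of the autocomplete 'vicious cycle': each round, being
    suggested prompts extra searches equal to `boost` times the current
    count, which in turn strengthens the suggestion."""
    count = initial_searches
    history = [count]
    for _ in range(rounds):
        count += int(count * boost)  # searches prompted by the suggestion itself
        history.append(count)
    return history

print(simulate_feedback(100))
```

Even with a modest assumed boost, the term’s search count compounds round on round, which is why a suggestion, once established, is so hard to dislodge by organic search behaviour alone.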
In Metropolitan International Schools Limited v Design Technica Corp. ([2009] EWHC 1765 (QB)), a case about search results, Eady J referred to the question of whether Google had “authorised or caused the snippet to appear on the user’s screen in any meaningful sense”, finding that it had not. He used the analogy of a library’s catalogue compiler in relation to search results, postulating that where a catalogue excerpted and drew attention to the defamatory contents of a particular book “there could be legal liability on the part of the compiler under the repetition rule.” With its autocomplete facility, Google does appear to be operating as just this kind of helpful, or officious, librarian in its attempts to push those who search the catalogue of the internet in a particular direction, thereby becoming a publisher in its own right.
Even if Google were somehow to escape liability as the initial publisher, were the subject of a defamatory autocomplete suggestion to notify the company of its existence, there can be no doubt that it would have the ability definitively to remove it. It has already been subject to orders to do so in other jurisdictions, and its own policy states that it does apply:
“a narrow set of removal policies for pornography, violence, hate speech, and terms that are frequently used to find content that infringes copyrights.”
Both Davison and Metropolitan International Schools indicate that it would have time to do so before being fixed with liability. In contrast to the position when Google provides search results, or services such as Blogger, any claimant’s remedy would lie with the company only; reputational vindication or access to justice could not be sought through a claim against any other publisher.
The recent litigation involving Google and its Blogger platform has seen the company rely on the defences contained in the Defamation Act 1996, and in Directive 2000/31/EC, implemented in the United Kingdom by the Electronic Commerce (EC Directive) Regulations 2002.
The Defamation Act s 1(1)(a) offers a defence to someone who is not a publisher of a statement and, at s 1(2), says that a publisher “means a commercial publisher”. It seems to be at least arguable that, in relation to autocomplete, Google is in the business of “issuing material to the public” (its search results) and “issues material containing the statement in the course of that business” (the suggestions). The question of whether Google is a commercial publisher could well be resolved against the company’s interests here: the more direct relationship between Google and the defamatory statement does stem from its commercial interest in providing the fastest and most convenient service to its users.
The other criteria outlined at s 1(1)(b) and (c), of reasonable care and knowledge, would see any internet business rely on the medium’s vast scale to argue that it could not be expected to devote resources to policing the outcome of the algorithm’s operation. However, Google does already have removal policies in place, primarily as a result of pressure from copyright holders. One could also compare Google’s situation to that of other internet businesses such as Facebook, which plough resources into sifting through their content in detail to ensure it meets particular standards.
The operation of the Directive in relation to autocomplete was considered on appeal by a court in Milan in 2011 (judgment in Italian). The Italian claim was brought by an individual whose name and surname brought up the autocomplete suggestions “con man” (“truffatore”) and “fraud” (“truffa”). The Italian judges held that the defence offered by the Directive did not apply as the information society services which it protects are exclusively to do with “the activity of the storage of information provided by others”. The judges continued:
“This sort of appeal is not about the hosting activities of Google, but the recurrent association with the name of the words conman and fraud – an association which, clearly, is the product of a specific way of working of the “autocomplete service”, which is a software developed by Google, and which the latter makes use of to enable users’ research via its search engine.”
The judges went on to focus on Google’s role in developing and providing the software in their finding of liability.
In one sense, Google is hosting the combined results of all of its users’ activity, and making them available to others, carrying out precisely “the storage of information provided by a recipient of the service” which is described in the Directive. But the logic of the Italian court is difficult to fault: the provision of suggestions based on this information is a conscious, and commercially-focused, decision by the company, and one which seems to go one step beyond mere hosting. It seems unlikely that it will assist Google either in the Wulff litigation, or in this jurisdiction.
Commentators have pointed out that Google’s problems in relation to autocomplete are at least partly of its own making. The company has enacted a limited removal policy, which has undermined its claim that the facility is objective, and left it open to the inference that there is always more it could do on behalf of particular groups or individuals who object to suggestions made about them. A claim, which has now been dropped after an “agreement” was reached, was brought by SOS Racisme in France on the basis that the word “Jew” or “Jewish” was suggested after the names of certain celebrities.
There are any number of aggrieved parties who might consider this kind of claim against Google, and many argue that the company’s potential liability is an unnecessary burden, and one which could damage the interests of all internet users. However, as reputation and privacy become qualities which are shaped first and foremost online, more attention will surely be paid to the definitive role of one business, and the services it offers, in that process.
Gervase de Wilde is a pupil barrister at 5RB, and former journalist at the Daily Telegraph.