In a judgment handed down on 16 July 2009, Mr Justice Eady ruled that Google is not the publisher in law of defamatory words that appear in its search results, even though it had been put on notice that the results were allegedly defamatory.
The claimant, Metropolitan International Schools Limited (“MIS”), is a provider of adult distance learning courses in the design and development of computer games. MIS trades under the name “Train2Game”. Between 1992 and 2004 it traded under the name “Scheidegger MIS”.
MIS brought an action in the High Court against Google Inc (“Google”), Google UK Limited and a US company, Designtechnica Corporation trading as Digital Trends. MIS alleged that comments about MIS posted on forums on a Digital Trends website were defamatory. It also claimed that Google was liable as a publisher for an excerpt from the forums which appeared in a Google search engine result.
The allegedly defamatory comments on Digital Trends’ forums included claims that MIS courses were nothing more than a scam intended to deceive honest people into parting with their money. MIS claimed that when the search term “Train2Game” was entered in Google.co.uk and Google.com, the third and fourth highest search results included the snippet of text “Train2Game new SCAM for Sheidegger”. MIS claimed that this snippet was defamatory.
The judgment was made at the hearing of an application by Google to set aside an earlier order, which had granted MIS permission to serve the proceedings on Digital Trends and Google outside the UK. Google argued that the UK court had no jurisdiction to hear the claim, as Google Inc is a US company and should be sued in California. It further argued that even if the UK were the appropriate forum, Google bore no responsibility for publishing the words complained of, so the claim had no real prospect of success (a requirement for granting permission to serve proceedings out of the jurisdiction).
The judge therefore focused on the question “Can the operator of a search engine be liable for publication of a defamatory statement?”
In answering this question, the judge noted that “There appears to be no previous English authority dealing with this modern phenomenon.” The only two decisions he considered relevant to the role of internet intermediaries were Godfrey v Demon Internet Limited and Bunt v Tilley. Google contended that it was not responsible for the snippet complained of, at least prior to notification by MIS of the specific URLs from which the words complained of originated. The test, according to Google, was whether it was knowingly involved in the publication of the defamatory statement. This accords with the judgment in Bunt v Tilley, where it was held that an internet intermediary, if undertaking no more than the role of a passive medium of communication, cannot be characterised as a “publisher” at common law. Google went further, arguing that even if it were put on notice of defamatory material or URLs, it still would not be liable as a matter of law.
The judge said that the “appropriate question here… is whether [Google] should be regarded as a mere facilitator in respect of the publication of the “snippet” and whether… that would remain a proper interpretation even after the date of notification.” The central point of Google’s application was “whether [Google] is to be regarded as a publisher of the words complained of at all”. The judge concluded that Google could not be regarded as a publisher of the words complained of and its role was merely that of a facilitator. In coming to his conclusion, he considered that:
- When a search is carried out via the Google search engine, there is no human input.
- When a snippet is thrown up on a user’s screen in response to a search, the user is pointed in the direction of an entry somewhere on the Web which corresponds to his search terms. It is for the user to choose that route or not. Google has no role in formulating the search terms and could not prevent the snippet appearing in response to the user’s request.
- Google does not choose the wording of any snippet. There is no intervention by any human agent as it has all been done by the “web-crawling ’robots’.”
The judge then considered whether Google could be liable if it had been put on notice of defamatory material. He referred to the case of Godfrey v Demon Internet Limited, where Demon Internet was found liable for defamatory newsgroup postings hosted on its servers, which it had failed to remove despite requests by the claimant to do so. Demon Internet therefore had knowledge that the words complained of were defamatory and had the ability to take the postings down. The judge distinguished the Godfrey case, saying that a search engine “is a different kind of intermediary… One cannot merely press a button to ensure that the offending words will never reappear on a Google search snippet: there is no control over the search terms typed in by future users. If the words are thrown up in response to a future search, it would by no means follow that [Google] has authorised or acquiesced in that process.”
He went on to state that there were some steps that Google could take, such as putting in place a take-down policy: “There is a degree of international recognition that the operators of search engines should put in place such a system… to take account of legitimate complaints about legally objectionable material.” However, it does not follow that “between notification and “take down” Google becomes or remains liable as a publisher of the offending material.” Google had taken steps to block identified URLs so that they would not be shown in response to Google searches. However, to do this it needs specific URLs to be identified. It could not block the specific words complained of without also blocking lawful material that happened to contain the words in the snippet. It was therefore “unrealistic to attribute responsibility for publication to [Google] whether on the basis of authorship or acquiescence.” An injunction against Google would be “a hopelessly inadequate substitute” for bringing a claim against Designtechnica to remove the offending material.
Mr Justice Eady has clarified the law in this area in some respects, but some questions remain unanswered. For example, if a search engine operator does not have a take-down policy, can it be liable as a publisher? What if a search engine is notified of specific URLs displaying defamatory material and these are not removed? How quickly does a search engine operator have to remove notified URLs before it becomes liable as a publisher?
Anyone who finds themselves the subject of a defamatory search result has the following options:
- Target the source of the defamatory material, i.e. the URLs where it is published;
- Notify the applicable search engine of the defamatory search result as soon as possible and provide details of the URLs where the defamatory material is published, with a request that the search engine remove these from the search results; and
- If the search engine does not take action, seek legal advice from specialists in the area of defamation.