The Victorian Supreme Court has found Google liable for online defamatory publication through the automated mechanism by which its search function produced material that associated the plaintiff's name with the names and images of high-profile criminal figures (Trkulja v Google Inc (No 5) [2012] VSC 533).
If the search engine was designed by its programmers to work in this way, is the search engine provider responsible for the legal ramifications arising out of the online publication it creates? Seemingly so. The jury in a civil damages case brought against Google Inc found that the plaintiff, Michael Trkulja, had been defamed by Google's automated search function, and specifically through the manner in which that search function put together material (mainly images) that suggested the plaintiff was linked to notorious criminal identities.
While the level of damages awarded in this case was not large ($200,000), it is a significant figure given the cap on damages under the uniform defamation laws (currently $339,000).
What were the search results Mr Trkulja objected to?
Mr Trkulja complained of two types of search results produced by Google when his name was entered.
The first (the "images matter") came when he entered his name into Google Images. Along with his photo appeared the photos of prominent alleged and convicted criminals, and an article headed “Shooting probe urged November 20, 2007” with a larger photograph of Mr Trkulja. This was on the third page of results, along with the heading “Melbourne crime”. Under this heading there were nine photographs of various people either known to have committed serious criminal offences or against whom serious criminal allegations had been made.
The second matter was text search results (the "web matter"). The first page of the web matter consisted of the first ten results of 185,000 results for the search term “Michael Trkulja”. The third page of the web matter consisted of the article under the same heading with the same nine photographs and the larger photograph of Mr Trkulja as contained in the images matter.
Mr Trkulja contacted Google, asking it to block the URLs that produced the defamatory material. Google declined to do so.
Was Google the publisher?
Google said that it had performed a passive role in respect of the publication, because its systems were automated. The assembled material appearing on the search results page in response to Googling Mr Trkulja's name was not intended by Google to result in a defamatory publication. In other words, Google argued that Mr Trkulja needed to prove it had an intention to publish defamatory material, and Google lacked that intention because the imputations were created by an automated search system and not as a result of human intention.
"Ah, not quite so", held the trial judge, Justice Beach. Conscious intention to publish is not a requirement at common law – playing a passive role did not automatically mean Google could not be a publisher. He went on to say that, notwithstanding the lack of conscious intention, Google Inc did in fact intend to publish the material complained of, even though its search system was automated. It produced the imputations because the computer programming behind the search facility had been written by humans, and that facility operated in precisely the manner its programmers intended when it associated Mr Trkulja's name and image with those of the criminal identities concerned.
Google Inc relied upon three English decisions in which internet service providers had not been held to be publishers (Bunt v Tilley [2006] EWHC 407; [2007] 1 WLR 1243; Metropolitan International Schools Ltd v Designtechnica Corporation [2011] 1 WLR 1743; and Tamiz v Google Inc [2012] EWHC 449). Justice Beach distinguished all three.
The email to Google – was Google on notice?
Google had argued that, even if it was the publisher of the defamatory material (which it denied), the defence of innocent dissemination would be available to it.
Under section 32 of the Defamation Act 2005 (Vic), it is a defence to the publication of defamatory matter if the defendant can prove that it published the matter merely as a distributor, that it neither knew nor ought reasonably to have known that the matter was defamatory, and that its lack of knowledge was not due to any negligence on its part. A similar defence is available in other jurisdictions.
In the email sent to Google, Mr Trkulja's former lawyers complained of the images matter and the web matter. It was not immediately obvious from the material they supplied which websites hosted the images (although it is not too difficult to discover them). The web matter, of course, listed the search results and URLs.
Justice Beach held that the jury was entitled to find on the facts that Google became responsible for the defamatory publication of the web matter once it had notice of the material and failed to do anything to stop it from occurring. Google thereafter consented to the publication of the web matter knowing it to be defamatory of Mr Trkulja.
He also held the jury was entitled to find that the defence of innocent dissemination could be made out in respect of the images matter both before and after the email was sent, although he pointed out that it could have concluded Google Inc became aware of the defamatory material which gave rise to the images matter after the email was sent: "It would not take very much effort to work out, from the page of photographs supplied to Google Inc, the identity of the website… All one had to do was click on one of the images".
Where next for Google and other search engines?
Justice Beach's decision amounts to a significant development in liability for online defamation. It can be contrasted with the more pragmatic approach taken in Tamiz to Google's passive involvement in publication, which reflected an understanding of the difficulty an organisation faces in effectively keeping watch over the vast amount of material published through its products.
In February, the Federal Court heard another defamation-based claim involving Google (Rana v Google [2013] FCA 60). The applicant, Rana, alleged that Google's failure to remove racist and defamatory material appearing on websites belonging to the Gregurev respondents put Google in breach of the Racial Discrimination Act 1975 (Cth), among other claims. The Federal Court needed to consider whether the US parent company, Google Inc, or its Australian subsidiary, Google Australia, was the "publisher" of the defamatory content. It also had to consider whether Google could be held liable as the publisher (rather than the host) of material of which it had no notice.
The Court held that Google Australia did not control or direct the conduct of Google Inc, and that its role in Australia was limited to sales support and marketing. It was Google Inc in the US that operated the web hosting service of which Rana complained. It should be noted that Google's evidence in this regard was not challenged by the applicant. Justice Mansfield observed that the law regarding responsibility as a publisher for internet-based publications, where the website host plays a passive role in the dissemination of defamatory material, is not settled. He considered international case law, including the recent decision in Trkulja, and concluded that it was at least arguable that Google Inc may be liable for the allegedly defamatory material, although in this case he declined to make that finding for the applicant.
We can expect the big search engine operators such as Google to significantly tighten up their complaints procedures and seek greater indemnities for online publication.
As the law currently stands there is an imperative for all search engine providers to act on complaints of this nature promptly.
Whether Google and others who provide search engines will be able to head off trouble before it arises is another question. A search engine provider cannot know whether a string of search terms keyed in by users will produce defamatory results, and it is even less likely to be able to devise an algorithm that could effectively block the URLs producing that material from search terms it has no way of knowing about or controlling.
Acting in response to a complaint also generates practical problems. For example, even if a provider is able to reconfigure its search system to prevent the association of identified search terms, how long must it keep this computerised block in place?
We note that, as part of the preamble to its current standard terms as at 1 March 2012, Google says:
"Our Services display some content that is not Google’s. This content is the sole responsibility of the entity that makes it available. We may review content to determine whether it is illegal or violates our policies, and we may remove or refuse to display content that we reasonably believe violates our policies or the law. But that does not necessarily mean that we review content, so please don’t assume that we do."
Jim FitzSimons, Alessandra Steele and Kristen Zornada