In a defamation action, there are two kinds of publisher: a primary publisher and a subordinate publisher. Until now, most internet intermediaries (website hosts, search engines, etc.) have been treated as subordinate publishers unless and until they knew of the defamatory material on their platform and didn't remove it. A subordinate publisher can argue a defence of innocent dissemination; that is, it can say that it was not sufficiently involved in the publication to be liable for it.
A lot of internet intermediaries rely on being subordinate publishers, and mitigate their risk of liability for defamatory user comments by operating a notice and takedown regime. That is, they rely on other people telling them if a user has posted a defamatory comment so that they can take it down.
This week the NSW Supreme Court held that news outlets are primary publishers of defamatory user comments posted on their public Facebook pages, even prior to receiving any complaint. The case involves former Don Dale detainee Dylan Voller. After news outlets published stories featuring him on their public Facebook pages in 2016 and 2017, other users posted defamatory comments about Voller on those Facebook pages. He is suing the news outlets for defamation as publishers of those comments.
The decision that the news outlets are primary publishers turned on the particular features of a public Facebook page. The key point was that the news outlets had, at least hypothetically, a way of vetting all user comments on a post before they were made public. The way to do this is to use word filters, which hide any matching comment pending review by the page owner. By filtering common words like `a', `the', `it', etc., they could hide, and so have the chance to vet, most if not all comments on a post.
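To see why this trick works, here is a minimal sketch, assuming (as described in the judgment) that the page's moderation filter hides any comment containing a listed word. The function name and matching logic are illustrative assumptions, not Facebook's actual API; the point is simply that almost every English sentence contains at least one common stop word.

```python
# Illustrative sketch only: a word filter that hides any comment containing
# a filtered word. Filtering common words catches virtually all comments,
# giving the page owner a chance to vet them before they appear publicly.

STOP_WORDS = {"a", "an", "the", "it", "is", "to", "and", "of"}

def would_be_hidden(comment: str, filter_words=STOP_WORDS) -> bool:
    """Return True if the comment contains any filtered word (case-insensitive)."""
    words = {w.strip(".,!?'\"").lower() for w in comment.split()}
    return not words.isdisjoint(filter_words)
```

In practice, nearly any natural-language comment trips a filter like this, so the page owner effectively sees everything before deciding what to `unhide'.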
The judge interpreted this as meaning that the news outlets (i) could easily acquire knowledge of the content of the comments before they were published to the public, and (ii) could control whether the comments were published. As a result, they were primary publishers.
The effect of the judgment is that to avoid liability, news outlets need to use the filtering function to try to hide all user comments, then vet them and not `unhide' anything that might be defamatory. That or block comments altogether, which would have drastic commercial consequences given that Facebook's algorithms take account of the volume of comments on a post. News outlets rely heavily on traffic from Facebook to generate page views and ad revenue.
That is clearly going to have a chilling effect on communications. And in practical terms it's going to be difficult to craft a filter that catches all comments, and to judge whether a comment is defamatory with no input from the person defamed. They might not care about a comment, even if it is defamatory. Or it might be true. Or otherwise wholly defensible.
But the page owner can't assess triviality, truth or other defences. It can't edit a comment, as it could its own stories, to remove a defamatory element. Its only choice is not to publish the comment at all in order to avoid exposure. This is absurd, and will stifle many communications which are lawful. Arguably, this dilemma illustrates why the Facebook page owner does not have sufficient control over, or involvement in, the publication of a user comment to be treated as a primary publisher.
Other courts have recognised a public policy interest in treating platform hosts as subordinate publishers, based on the reality of how real-time interactive webpages operate, noting that the exercise of editorial control in advance of a complaint is `antithetical' to the forum of communication. Even the High Court has referred to the need to strike a balance between free speech and the free exchange of information and ideas on the one hand, and the maintenance of a person's individual reputation on the other.
The judge in Voller didn't see it that way though. He saw the news outlets' public Facebook pages as a purely commercial exercise intended to boost readership and advertising revenue. He said the public Facebook pages had `little to do with freedom of speech or the exchange of ideas'. Take that.
So, news outlets and any other businesses with public Facebook pages, it's time to rethink the notice and takedown policy; it won't protect you anymore. At least for now. Odds are that the news outlets will appeal. This is too big a call to let stand. If nothing else, it illustrates how hard it is to apply our antiquated defamation laws to this crazy, modern world, and how drastically those laws need reform.