A landmark court decision is likely to have profound implications for hosts of public Facebook pages in Australia.
Media companies are now considered the 'publisher' of third-party comments on their public Facebook pages and can be held liable where those comments are defamatory.
The decision has serious consequences for all Facebook page hosts, who may now need to monitor and moderate the comments posted in response to their posts.
As comments cannot be completely blocked or disabled, the exposure to liability can only be reduced, not eliminated.
The Supreme Court of New South Wales has reached a landmark decision in Dylan Voller's defamation case against three media companies: Fairfax Media (now Nine), Australian News Channel (owners of Sky News), and News Corp.
The Court decided that media companies are now considered the 'publisher' (in a legal sense) of third-party comments on their public Facebook pages and can be held liable where those comments are defamatory.
This decision has far-reaching consequences for media companies, which are now liable for comments posted in response to posts on their Facebook pages. Although the Court did not rule on whether other owners of public Facebook pages could be held liable in similar circumstances, the door is left open for this to occur in the future.
To reduce the risk of being sued for defamation as a consequence of allegations made in the comments section, it may be necessary for all Facebook page owners to be cautious about their Facebook posts and whether they are likely to elicit defamatory comments. If necessary, they may change the settings on Facebook posts so that comments can be meticulously vetted before becoming publicly available.
This will significantly impact the way social media is used and may restrict freedom of speech generally in Australia.
Media companies deemed the publishers of third-party comments
At this preliminary stage of his case, Voller was required to prove that the media companies were the 'publishers' of comments posted by third parties on their Facebook posts – and the Court concluded that they were.
Key to this decision was the conclusion of the judge, Justice Rothman, that it is possible to hide comments that contain particular words, and that if you use a list of extremely common words, then 'it is possible to hide, in advance, all, or substantially all, comments'. This monitoring process then involves a moderator sifting through the hidden comments and 'un-hiding' them so they can appear publicly.
An important element of the Court's reasoning was that, had the media companies paused to assess the likely consequences of publishing the original posts, they would have found them likely to give rise to nasty and defamatory comments.
Now that Voller has satisfied this preliminary question, the Court will hear the remainder of the defamation case, including whether the media companies can defend the allegedly defamatory imputations.
Dylan Voller's defamation case
Dylan Voller is a former youth detainee at Darwin’s Don Dale Youth Detention Centre. There was extensive media coverage of Voller's mistreatment at the facility, including articles posted on Facebook on which many members of the public then commented. Some of these third-party comments contained serious allegations about Voller. Voller did not sue the individual commenters and instead sued the media companies.
Notably, Voller had not notified the respective media companies of the allegedly defamatory comments on their Facebook posts.
How can Facebook page hosts protect themselves from defaming people on social media?
In light of this decision, hosts of public Facebook pages should be on high alert for potentially defamatory comments posted on their pages. Although the decision focused on the liability of media companies, it leaves open the possibility that other entities with public Facebook pages could be held to be 'publishers' of third-party comments on their posts in the future. This poses a challenge because comments on a public Facebook page cannot be entirely disabled.
While this decision still stands, hosts should consider implementing a comment moderation strategy. This could involve the following options to mitigate against the risk of being sued in similar circumstances:
- Before posting, assess the nature and subject matter of the content and whether it carries a high or low risk of eliciting comments that could be defamatory. Be cautious not to publish content that could be seen as inviting controversial comments.
- There are then two key approaches to moderating comments:
- Hide one-by-one: Monitor comments as they are posted and 'hide' those that contain potentially defamatory allegations. A hidden comment remains visible to the person who wrote it and their friends, but to no one else, so the commenter won't know it has been hidden. Alternatives to hiding are to 'delete' the comment, which permanently removes it, or to 'report' the comment to Facebook, which can be done in addition to hiding it.
- Block words: Facebook settings allow you to block certain words from appearing on your page. Comments containing those words are hidden and must be 'unhidden' before they appear publicly. As suggested above, it is possible to use this tool to hide substantially all comments by blocking a list of extremely common words. The hidden comments can then be reviewed and published only once it has been assessed that they do not contain potentially defamatory material. This is the more proactive approach.
There is also a 'profanity filter' that can be turned on to block varying degrees of profanity from appearing on your page. The filter is based on the words and phrases most commonly reported as offensive by the Facebook community, and could assist in the early stages of implementing a comment moderation strategy.
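For page owners with developer resources, the word-block-and-review workflow described above could in principle be automated. The sketch below is a minimal illustration, not a definitive implementation: the word list is hypothetical, and the Graph API endpoint, version number and `is_hidden` field reflect the API as generally documented and should be checked against Facebook's current developer documentation before use.

```python
import re

# Hypothetical blocked-word list. As the judgment notes, a list of
# extremely common words can hide substantially all comments,
# leaving them queued for manual review before publication.
BLOCKED_WORDS = {"the", "a", "is", "to", "and"}

def should_hide(comment_text, blocked_words=BLOCKED_WORDS):
    """Return True if the comment contains any blocked word
    (whole-word, case-insensitive match)."""
    words = re.findall(r"[a-z']+", comment_text.lower())
    return any(w in blocked_words for w in words)

def moderate_post_comments(post_id, access_token):
    """Fetch comments on a page post and hide any that trip the
    word filter, leaving them for a moderator to review and unhide.

    The comments edge and the `is_hidden` field exist in the Graph
    API, but the exact URL and version here are assumptions.
    """
    import requests  # third-party: pip install requests

    resp = requests.get(
        f"https://graph.facebook.com/v19.0/{post_id}/comments",
        params={"access_token": access_token},
        timeout=10,
    )
    for comment in resp.json().get("data", []):
        if should_hide(comment.get("message", "")):
            # Hiding keeps the comment visible only to its author
            # and their friends, pending moderator review.
            requests.post(
                f"https://graph.facebook.com/v19.0/{comment['id']}",
                params={"is_hidden": "true",
                        "access_token": access_token},
                timeout=10,
            )
```

In practice the filtering rule (`should_hide`) can be tested and tuned independently of the API calls, which is why it is kept as a separate pure function here.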