In the wake of the recent Christchurch terror attack, during which the perpetrator was able to live-stream the attack online for 17 minutes without interruption, the Federal Government has responded to growing calls for greater regulation of online platforms by criminalising, and heavily penalising, companies and individuals that fail to take sufficient action to prevent the sharing of certain kinds of violent and objectionable content online.
In a move which has drawn substantial criticism from industry groups, policy experts and lawyers alike, the new laws were rushed through Parliament in record time, passing both the Senate and House of Representatives without amendment and with little debate in the space of just two days. This legislative response by the Government has been described by Law Council President Arthur Moses SC as a "knee-jerk reaction to a tragic event".
The Federal Government has enacted the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Act), which amends the Criminal Code Act 1995 (Cth) (Criminal Code) and came into force on 6 April 2019.
Who does it apply to?
The changes to the Criminal Code will apply to ISPs, content service providers and hosting service providers anywhere in the world, including websites, social media platforms and content management/cloud solutions providers, but will not apply to email, instant messaging, SMS or online gaming services.
What does it apply to?
The Act regulates abhorrent violent material, being audio and/or visual material that records or streams offensive abhorrent violent conduct. Abhorrent violent conduct includes a terrorist act (as specifically defined in section 100.1 of the Criminal Code), murder, torture, rape or kidnapping. The material must be published by someone who is directly or indirectly involved in the relevant conduct and, depending on the offence, must either be recorded, or accessible, in Australia.
The Act creates two new kinds of offences under the Criminal Code:
1. Failure by a service provider to notify the Australian Federal Police, within a reasonable time, that abhorrent violent material relating to conduct which is occurring, or has occurred, in Australia is accessible on a service.
2. Failure by a service provider to expeditiously remove, or cease to host, abhorrent violent material that is accessible within Australia.
The changes to the Criminal Code empower the eSafety Commissioner to issue a notice giving rise to a presumption that, at the time the notice was issued, a service provider was reckless as to whether its service could be used to access or host abhorrent violent material, unless the service provider can prove otherwise. Receipt of a notice will in effect impose strict liability for the offence unless the service provider acts "expeditiously" to remove the relevant material.
There are some defences available under the new provisions of the Criminal Code, including, in limited circumstances, where the offending material relates to a news or current affairs report made in the public interest by a journalist acting in a professional capacity.
The new laws are backed by substantial penalties:
- Failure to notify law enforcement within a reasonable time that abhorrent violent material is accessible on a service can result in penalties of up to A$168,000 for individuals, or A$840,000 for bodies corporate.
- Failure to expeditiously remove or cease hosting content is punishable by a jail term of up to 3 years or a fine of A$2.1 million for individuals, and fines of up to the greater of A$10.5 million or 10% of global group annual turnover for bodies corporate.
A number of key areas of concern have been raised in relation to the legislation, including the following:
- Preventing racism and hate speech: Clearly, the law aims to stop the spread of abhorrent violent content. However, this reactive response arguably does not address the underlying cause of the violence itself. There remains a question whether regulatory reform would be more appropriately targeted at laws governing hate speech and racial discrimination.
- Scope of removal obligation: The legislation does not define "expeditious" removal, nor is it clear whether the obligation to remove in fact amounts to a requirement to 'take down and keep down' offending content. The condensed timeframe for content removal makes it all the more difficult for service providers to make an accurate assessment as to whether or not certain content falls within some of the broadly-worded categories of content included in the definition of "abhorrent violent material".
- Individual liability: It has been widely reported that social media executives risk imprisonment under the changes to the Criminal Code. In fact, the new legislation does not expressly impose this kind of liability on executives and, according to Attorney-General Christian Porter, the individual liability contemplated under the changes to the Criminal Code is instead aimed at "radical, fanatical" websites owned and operated by individuals. That said, the possibility remains that social media executives could face accessorial liability for aiding, abetting, counselling, procuring, or for otherwise inciting or conspiring in an offence under the Criminal Code by virtue of their involvement in moderating content on digital platforms.
- Extraterritorial effect: The new laws are expressly stated to apply to any service provider, whether or not they provide a service from outside Australia. The purported extraterritoriality of the new laws raises questions of enforceability, particularly in the context of potential clashes between obligations under local and foreign laws, including where the operation of the new laws may potentially undermine Australia's security co-operation with the United States.
- Impact on Telcos: The Act has created a difficult question for carriers and carriage service providers. Section 313(1) of the Telecommunications Act 1997 requires that they do their best to prevent networks and facilities from being used in, or in relation to, offences against the Commonwealth. Even though the Act expressly does not apply to entities that merely supply a carriage service (section 474.39(1)), regulators are taking the position that s313(1) requires that they take steps to block abhorrent violent material whether or not they have received a notice from a regulatory agency.
The Explanatory Memorandum expressly states that the new laws aim to deter service providers from failing to take action in relation to abhorrent violent material that is accessible on their services. The objective is to ensure that service providers will proactively refer this material to law enforcement, and will also remove such material, no matter where the underlying conduct is committed. The effect of the law is to impose a positive obligation on social media and other content or hosting services to "expeditiously" remove or cease hosting certain objectionable content, without the need for notice or even awareness that the content is accessible through their services. Moving forward, businesses are likely to need to undertake a significant shift in their practices and processes to pre-emptively monitor and permanently remove any content which appears to fit the criteria of abhorrent violent material under the Criminal Code.
Although a range of ISPs, content service and hosting service providers are now required to comply with the amendments to the Criminal Code, the requirement for businesses to proactively engage in referring and removing abhorrent violent material will have varying degrees of impact on day-to-day operations, depending on the size of an organisation and the nature of its service offering. For example, while larger tech companies may already have sophisticated content moderation processes in place, smaller companies caught by the new laws are unlikely to have the resources or technical capabilities required to readily comply with these new obligations.
In his second reading speech, the Shadow Attorney-General Mark Dreyfus indicated that, if elected, the opposition would refer this law to the Parliamentary Joint Committee on Intelligence and Security for consultation and review, echoing recent developments surrounding the controversial encryption laws passed at the end of 2018. He further remarked that, while the opposition disapproved of the lack of "proper processes" and the drafting of the legislation, it "agree[d] that the streaming of the Christchurch terror attack has shown that there is a need for measures to prevent such conduct occurring in the future."