Key Takeaways:

  • From 2023 onwards, online communication services (which would include social media services and other services that make content available to users in Singapore) will be required to remove egregious content if directed by the regulator.
  • A failure to do so may result in fines of up to SGD 1 million or the blocking of their service in Singapore.
  • Designated online communication services (likely those with significant reach) will also be required to comply with additional Online Codes of Practice, which may include obligations on regular reporting and the implementation of further measures to prevent access by children and other vulnerable users to content that may cause significant harm.


On 9 November 2022, the Singapore Parliament passed the Online Safety (Miscellaneous Amendments) Bill to tackle online harms and strengthen online safety for users. This Bill will take effect from early 2023.

The Bill amends the Broadcasting Act 1994 and follows on from a public consultation launched in July 2022 on Enhancing Online Safety for Users in Singapore. This development is also part of the broader trend in regulating online content worldwide, such as the United Kingdom’s Online Safety Bill, the European Union’s Digital Services Act, and Ireland’s Online Safety and Media Regulation Bill.

Who is impacted?

The Bill imposes new obligations on “online communication services”, which broadly includes any service inside or outside Singapore that enables users in Singapore to communicate or access content via the internet, or which delivers content via the internet to end users in Singapore.

As a general rule, private messaging and closed-group communication services, including workplace message boards, forums, and messaging apps, are excluded from the new obligations. However, chat groups with very large and open memberships that could potentially be used to disseminate egregious or harmful content will not be excluded, as they are considered no different from non-private communications. The Bill sets out a list of factors that must be considered collectively to determine whether a service falls under the new law, for example, the number of individuals in Singapore who can access the content and whether any restrictions are in place on who can access it.

What are the new rules?

The Bill is intended to regulate “egregious content”, which includes content that advocates or provides instructions on suicide or self-harm, physical or sexual violence, and terrorism, as well as matters that cause racial and religious disharmony in Singapore. However, content presented in a positive or educational context, such as users sharing personal experiences on online forums to help others overcome such issues, will not be considered harmful or egregious.

Online communication services will be required to block access by Singapore users to such egregious content where directed to do so by the Infocomm Media Development Authority (“IMDA”), within a stipulated timeline. During the second reading of the Bill, the Minister for Communications and Information indicated that the specified timelines would generally be ‘within hours’, and that the timeline given will be proportionate to the potential harm of the content and will be expedited for more egregious content such as terrorism-related content.

Failure to comply with a blocking direction from the IMDA is an offence and may result in a fine of up to SGD 1 million.

In addition, the IMDA may designate certain online communication services as Regulated Online Communication Services (“ROCS”). During the second reading of the Bill, the Minister clarified that ROCS would be those services with “significant reach”. The threshold for “significant reach” has yet to be defined, but it is expected that most of the widely used social media platforms will fall within this scope.

ROCS will be required to comply with additional Online Codes of Practice to be issued by IMDA. The Codes will likely include requirements on having in place internal systems and processes for preventing access to content that has a risk of causing significant harm to children and other vulnerable users, regular audits to demonstrate compliance, and regular reporting to IMDA.

During the earlier public consultation in July, it was announced that two specific codes were intended to be issued, namely: (a) the Code of Practice for Online Safety (“CPOS”); and (b) the Content Code for Social Media Services (“CCSMS”).

The proposed CPOS will require ROCS to implement appropriate safeguards to limit exposure to egregious content online, while the proposed CCSMS is intended to give IMDA the power to deal with residual egregious content.

What’s next?

The Bill will likely take effect sometime in 2023, and we can expect further public or private consultations on the draft CPOS, which has been released along with accompanying guidelines, and the draft CCSMS, which has yet to be released.

The government has also signalled its intention to closely monitor developments in age verification technology to better safeguard young users online, and these issues are expected to be addressed in the final versions of the new Codes. According to IMDA, the CPOS is expected to come into force in the second half of 2023.