Introduction

Social media services have become an integral part of our lives, developing into primary platforms for leisure and communication while also providing new opportunities for work and business. However, with the prevalence of social media, one of the key concerns of regulators is harmful online content. Propagated in the digital space, such content can have serious real-world consequences, including the promotion of violence and self-harm, and the undermining of mental and physical well-being.

The security of digital spaces is one of the key issues being considered by the Government. In March 2022, the Ministry of Communications and Information ("MCI") gave an indication of what changes and enhancements may be expected in the digital regulatory and compliance framework, including the introduction of codes of practice for online platforms to protect Singaporeans against harmful online content. For more information, please see our earlier Legal Update on this topic here.

These proposed codes of practice are now coming closer to fruition. On 13 July 2022, MCI issued a Public Consultation on Proposed Measures to Enhance Online Safety for Users in Singapore ("Public Consultation"). The Public Consultation sets out the proposed measures to address harmful online content on social media services, which include:

  1. Code of Practice for Online Safety, which sets out the required measures and safeguards against harmful content to be implemented by designated social media services; and
  2. Content Code for Social Media Services, which empowers the Infocomm Media Development Authority ("IMDA") to direct social media services to disable access to harmful content.

The public has until 10 August 2022 to submit feedback on the Public Consultation. This Update summarises the main facets of the proposed measures and the obligations that social media services may expect under the eventual codes.

Code of Practice for Online Safety

MCI has proposed to introduce a Code of Practice for Online Safety, which will cover designated social media services ("DSMS") with significant reach or impact. DSMS will be required to have certain measures and safeguards to mitigate exposure to harmful online content, including the following:

  • Community Standards – DSMS will be required to have community standards for these categories of content: (i) sexual content; (ii) violent content; (iii) self-harm content; (iv) cyberbullying content; (v) content endangering public health; and (vi) content facilitating vice and organised crime.
  • Moderation and Deletion – The content above should also be moderated to reduce users' exposure to it, such as disabling access to such content when reported by users. Child sexual exploitation and abuse material and terrorism content must be proactively detected and removed.
  • User Management – DSMS should provide users with tools and options to manage their own exposure to unwanted content and interactions, such as hiding unwanted comments on their feeds or limiting contact and interactions.
  • Safety Information – Easily accessible safety information should be provided to users, such as Singapore-based resources or contacts to local support centres. It is further proposed that relevant safety information (such as helplines and counselling information) be pushed to users that search for high-risk content.
  • Young Users – DSMS will be required to implement additional safeguards to protect young users, such as stricter community standards and tools that allow young users or parents/guardians to manage and mitigate their exposure to harmful content and unwanted interactions. Such tools could be activated by default for account users under 18 years of age, with warnings provided when settings are weakened.

    Safety information should similarly be provided to young users and their parents/guardians, including guidance on how to protect young users from content that is harmful or age-inappropriate, and from unwanted interactions.

  • User Reporting and Resolution – DSMS will be required to provide an efficient and transparent user reporting and resolution process for users to alert DSMS to content of concern. DSMS should assess and take appropriate action on user reports in a timely and diligent manner.
  • Accountability – DSMS will have to produce annual reports on their content moderation policies and practices, and the effectiveness of their measures in improving user safety. These reports will be published on IMDA's website.

Content Code for Social Media Services

The proposed Content Code for Social Media Services is intended to deal with harmful content that is not managed by the Code of Practice for Online Safety, including extremely harmful content in relation to (i) suicide and self-harm; (ii) sexual harm; (iii) public health; (iv) public security; and (v) racial or religious disharmony or intolerance.

The Content Code for Social Media Services will allow the sectoral regulator, IMDA, to direct any social media service to disable access to specified harmful content for users in Singapore, or to disallow specified online accounts on the social media service from communicating content and/or interacting with users in Singapore. 

Concluding Words

The Public Consultation reveals a wide range of obligations which may be imposed through the proposed Code of Practice for Online Safety and the Content Code for Social Media Services. Should these codes come into operation, social media services and related service providers will have to ensure that they comply with the relevant obligations by implementing the necessary measures and safeguards. This will require a review of their operations and policies for compliance.

Some issues to be considered include the criteria used to designate DSMS, whether the designations will be reviewed on a regular basis, the mechanisms of the tools provided to users and young users to manage their exposure, the timeline by which social media services are expected to comply with any direction to disable access, as well as the timeline for social media services to implement these changes.

Organisations should carefully review the proposed obligations and submit any feedback to MCI, including feedback on the appropriateness of the scope of the obligations and any practical issues that may arise from their implementation.

The Public Consultation is open from 13 July 2022 to 10 August 2022. Feedback may be submitted via the online feedback form here. For further information, the full Public Consultation is available here.