Ministry of Communications and Information
Consultation Period:
13 Jul 2022 - 10 Aug 2022

Detailed Description

Public Consultation on Enhancing Online Safety For Users in Singapore


1. The Ministry of Communications and Information (MCI) invites the public to provide feedback on proposed measures to enhance online safety for Singapore-based users of social media services. The public consultation will run from 13 July 2022 to 10 August 2022.


Prevalence of harmful online content on social media services

2. Social media services have transformed the way we live, work and play, bringing new and interactive opportunities for people and businesses in Singapore. However, for all the good that these services bring, they can also be places of danger. Harmful online content can lead to serious consequences in the real world. Globally, there is widespread acceptance that services distributing online content, even where it is user-generated, have a responsibility to keep their users safe from harm. 

3. While many of these services have made efforts to address harmful content, it remains a concern, especially when published on services that reach a wide audience, or when the content is targeted at specific groups of users. This includes content that:

a. Endorses acts of terrorism, extreme violence, or hateful acts against certain communities;

b. Encourages suicide and self-harm, including engaging in risky behaviours that threaten one’s life;

c. Threatens one’s physical or mental well-being, through harassment, bullying, or the non-consensual sharing of sexual images.

4. Such harmful online content can be amplified on social media services. For example:

a. Propelled by platform algorithms and user interest, content such as dangerous video challenges can go viral rapidly, leading to injuries and deaths;

b. In the case of terrorist acts, the impact of such events is worsened by the spread of live-streamed footage and the resharing of content.

5. Tackling harmful online content is a global issue. Countries such as Germany and Australia have enacted new laws that require online services to limit users’ exposure to harmful content. The United Kingdom and European Union are also working on laws to address this issue. 

6. Harmful online content also affects users in Singapore. For example:

a. Harassment and threats of sexual violence. In 2021, a poll asking people to rank local female asatizah (religious teachers) according to their sexual attractiveness was posted on social media. The post also promoted sexual violence and caused immense distress to the individuals involved.1

b. Religiously or racially offensive content that can incite religious intolerance and undermine our racial harmony. In 2021, a Singaporean man pretended to be a woman from another ethnic group, and posted multiple racially offensive and insensitive public posts on a social media service.2

7. The negative impact of harmful online content on users is of concern: 

a. Almost half of Singaporeans polled3 by the Sunlight Alliance for Action4 in January 2022 said they have personally encountered such content. 

b. In another study, more than half (54%) of parents in Singapore reported that their children had encountered inappropriate content online5. Young users may be vulnerable and lack the capacity or experience to deal with harmful online information and content, particularly when exposed to age-inappropriate material such as sexual or violent content. 

8. We recognise that some social media services have put in place measures to protect their users. However, such measures vary from service to service. Additionally, when evaluating harmful content on social media services, Singapore’s unique socio-cultural context needs to be considered. Given the evolving nature of harmful online content, more can be done, especially to protect young users. 

Proposed Measures to Enhance Online Safety

9. To address the risks of harmful online content, MCI is considering two new measures: 

a. Code of Practice for Online Safety: Designated social media services with significant reach or impact will be required to have appropriate measures and safeguards to mitigate Singapore-based users’ exposure to harmful online content. These include system-wide processes to enhance online safety for all users, as well as additional safeguards for young users6.

b. Content Code for Social Media Services: There may be content that is particularly harmful to our society, such as content that incites racial or religious disharmony or intolerance. Where such content has not been detected by the social media services themselves, we intend for Infocomm Media Development Authority (IMDA) to be granted powers to direct any social media service to disable access to such content for users in Singapore. 

Code of Practice for Online Safety

User Safety 

10. We are considering requiring designated social media services to have community standards for the following categories of content:

a. Sexual content 
b. Violent content
c. Self-harm content
d. Cyberbullying content
e. Content endangering public health
f. Content facilitating vice and organised crime

(Illustrative and non-exhaustive examples of such content are at Annex A)

11. These designated services will also be expected to moderate content to reduce users’ exposure to such harmful content, for example, by disabling access to such content when reported by users.

12. For child sexual exploitation and abuse material, and terrorism content, these services will be required to proactively detect and remove such content.

13. Designated social media services could also provide users with tools and options to manage their own exposure to unwanted content and interactions. These could include tools that:

a. Allow users to hide unwanted comments on their feeds

b. Allow users to limit contact and interactions with other users

14. We propose that designated social media services provide safety information that is easily accessible to users. This could include Singapore-based resources or contacts to local support centres. 

a. We also propose that relevant safety information (e.g., helplines and counselling information) be pushed to users that search for high-risk content (e.g., those related to self-harm and suicide).

Additional safeguards for young users

15. Given our concerns about the impact of harmful online content on young users, we propose for designated social media services to put in place additional safeguards to protect young users. 

16. These additional safeguards could include stricter community standards for young users, and tools that allow young users or parents/guardians to manage and mitigate young users’ exposure to harmful content and unwanted interactions. For example, tools that: 

a. Limit the visibility of young users’ accounts to others, including their profile and content;

b. Limit who can contact and/or interact with accounts for young users; and

c. Manage the content that young users see and/or experience.

17. The tools could be activated by default for services that allow users below 18 to sign up for an account. These services could warn young users and their parents/guardians of the implications when they choose to weaken the settings.

18. Likewise, social media services should provide safety information that is easy for young users to access and understand. The safety information should provide guidance to young users and parents/guardians on how to protect young users from content that is harmful or age-inappropriate, and from unwanted interactions. 

User Reporting and Resolution

19. Given the sheer volume of content being created and shared on social media services, there may be instances where users come across harmful content, despite the safeguards put in place by social media services. As such, we propose for designated social media services to provide an efficient and transparent user reporting and resolution process, to enable users to alert these services to content of concern. 

20. The user reporting and resolution process could:

a. Allow users to report harmful online content (in relation to the categories of harmful content outlined at para 10) to the social media service;

b. Ensure that the reporting mechanism is easy to access and easy to use.

21. As part of this process, the service should assess and take appropriate action on user reports in a timely and diligent manner.


Accountability
22. We propose for designated social media services to produce annual reports on their content moderation policies and practices, as well as the effectiveness of their measures in improving user safety. These reports would be made available on the IMDA’s website for the public to view. Through these reports, users will be able to better understand how their exposure to harmful content is reduced on the services they use.  

Content Code for Social Media Services

23. The proposed measures under the Code of Practice for Online Safety are expected to deal with most of the harmful online content that Singapore users may encounter when using designated social media services. However, there may be instances where extremely harmful content remains online in relation to:

a. Suicide and self-harm; 
b. Sexual harm;
c. Public health;
d. Public security; and 
e. Racial or religious disharmony or intolerance.

(Illustrative and non-exhaustive examples of such content are at Annex B)

24. Given the concerns about the impact of such extremely harmful content, we propose for the Content Code for Social Media Services to allow IMDA to direct any social media service to disable access to specified harmful content for users in Singapore, or to disallow specified online accounts on the social media service from communicating content and/or interacting with users in Singapore. 

Working Together to Improve Online Safety

25. The aim of the proposed Code of Practice for Online Safety and Content Code for Social Media Services is to safeguard Singapore-based users on social media services, so that they can feel as safe there as they do in the real world. The Government cannot achieve this outcome alone. We will continue to work closely with stakeholders in the people, private and public sectors to strengthen online safety for all users.

We Welcome Your Feedback

26. We invite members of the public to provide their feedback in response to the above proposals by 10 August 2022. Members of the public can submit their feedback via the online feedback form, by clicking the button below. Organisations may wish to provide feedback using the email template attached.

27. We will review all feedback received and refine our proposals where appropriate. We will also publish a summary of the key feedback received, together with our response, following the end of the public consultation.


1 “Police investigating offensive poll ranking female Islamic teachers; President Halimah and other leaders criticise poll”, The Straits Times, 27 May 2021

2 “Man jailed for racially offensive tweets under pseudonym ‘Sharon Liew’”, CNA, 8 Jun 2021

3 Online poll conducted by Sunlight Alliance for Action (AfA) in January 2022 with more than 1,000 Singaporeans on the perceptions, experiences, and the prevalence of online harms in Singapore.

4 The Sunlight AfA was launched in July 2021 to tackle online harms, especially those targeted at women and girls. The AfA takes a whole-of-nation partnership approach, and its members include individuals across the 3P (people, private and public) sectors, coming together with the aim of closing the digital safety gap and creating an inclusive digital space.

5 “Rising concerns about children’s online wellbeing amid increased encounters of cyber threats in 2020”, Google Survey, 9 Feb 2021.

6 Young users refer to individuals below the age of 18.