Singapore launches public consultation on online safety codes
The Singapore Ministry of Communications and Information (MCI) has launched a public consultation on two proposed codes of practice, a code of practice for online safety and a content code for social media services, which will impose requirements on social media services to ensure the online safety of users in Singapore. The consultation closes on August 10, 2022.
Once issued, the codes will be legally binding, like other codes issued by MCI and administered by the Infocomm Media Development Authority (IMDA) under the Info-communications Media Development Authority Act 2016.
This development is timely, with a significant amount of discourse now taking place on social media platforms in addition to conventional channels. Taiwan and India are also moving to formulate similar regulations governing online and social media harms. However, unlike previous IMDA codes of practice covering areas such as cybersecurity, competition, and fiber network interconnection, these proposed codes have a more direct impact on end users. Although the public consultation has not yet explicitly defined "social media services", one would expect the codes to apply to the main social media platforms accessible to users in Singapore.
The proposed codes continue the trend of new legislation setting industry-wide standards, such as the Personal Data Protection Act 2012 (PDPA) and the Protection from Online Falsehoods and Manipulation Act 2019 (POFMA). While the PDPA covers user data leaks and POFMA covers misrepresentations of fact, there are other scenarios that can cause significant harm online. These include insensitive and provocative racist comments and the sharing of dangerous viral video challenges. In addition to filling gaps in areas not covered by existing legislation, the proposed codes will also ensure the availability of local support resources better suited to users in Singapore.
Code of practice for online safety
The first proposed measure is a code of practice for online safety. The code of practice will introduce community standards for the following content categories:
- Online harassment
- Endangerment of public health
- Facilitation of vice and organized crime
When users search for high-risk content, such as content relating to self-harm, the code of practice will require social media services to provide relevant safety information to those users.
The three priority areas MCI has identified for community standards are:
- Child safety
- User reports
- Platform accountability
Child safety is particularly important because, with social media services becoming ever more ubiquitous, a noticeable proportion of users are young. In addition to adhering to the community standards for online content, the proposed code of practice requires social media services to provide tools that limit users' exposure to specified unwanted content. MCI proposes tools that would, in particular, allow users to hide unwanted comments and to limit contact and interactions with other users.
Child safety would be addressed by introducing new safeguards specific to young users, including a stricter set of community standards and additional tools to limit exposure. MCI proposes enhanced tools, including limiting account visibility, limiting contact and/or interaction, and managing the content young users see and/or experience. Safeguards could be activated by default or offered on an opt-in basis. It remains to be seen whether social media services will be required to automatically scan encrypted content for infringing material, as proposed in draft legislation in some European jurisdictions.
User reporting is becoming an increasingly important moderation tool given the fast pace and high volume of social media content created today. Although the moderation teams and algorithms of social media services are a first line of defense against harmful online content, users can still encounter harmful content that slips through the cracks. A system for users to report harmful content, and for social media services to act on those reports, will reduce the amount of harmful content that escapes initial detection and reaches users.
With great power comes great responsibility: as social media services roll out more moderation tools, their practices should also be disclosed to users. Although not all users take the time to read the annual reports that social media services publish (separate from the requirements proposed under the code), compiling reports from the various services in a standard format on an IMDA page, for example, would increase accountability. This would in turn allow users to provide constructive feedback and allow MCI to gain a more accurate understanding of the moderation practices adopted by social media services over time.
Content code for social media services
The second proposed measure is a content code for social media services. It will allow IMDA to direct social media services to disable access to specified harmful content or block specified online accounts. Because its powers here are more drastic, IMDA will intervene directly in the operations of social media services only in cases involving extremely harmful content, including content relating to:
- Suicide and self-harm
- Sexual assault
- Public health
- Public security
- Racial or religious disharmony or intolerance
Where IMDA determines that content is extremely harmful, social media services must disable access to that content for Singapore users and/or prohibit the relevant online accounts from communicating content to Singapore users. The content code prevails even if the offending content does not violate the social media service's own policies. An example would be online content that specifically disparages a religious or racial group in Singapore, which might not be flagged under a service's own standards. Such situations tend to be more complex and are therefore better identified by IMDA than through user reports.
Both proposed codes are relatively broad in scope and are important regulatory initiatives to help users feel as safe online as they do in the real world. Social media services will need to expand their compliance capabilities as the global trend of increased regulation continues, now including Singapore. They should also continue to communicate with their users (for example, through advertising campaigns or collaborations) regarding any additions and changes to their community tools and standards, so that users can keep pace with developments.
Finally, social media services are encouraged to assess the feasibility of implementing the two codes and to submit feedback to MCI. While some accountability requirements are not radically different from the reports social media services already publish, more work may be needed to incorporate the code of practice into existing community standards for content. MCI invites comments on:
- Categories of harmful online content
- Existing safety measures and tools
- Feedback on the proposed safety measures and tools
- Additional safeguards for young users
- Effective and transparent user reporting mechanisms
- Roles of community, private sector and government