SINGAPORE, June 20 — In a bid to protect users from harmful online content, the Government is proposing two codes of practice for social media services, including one that would allow the authorities to direct such companies to disable access to specific content.

The Ministry of Communications and Information (MCI) at a press conference on Monday (June 20) said that the first proposal is for social media services with a large reach or high risk to have “system wide processes” to enhance safety for all users.

This would include having in place community standards and content moderation mechanisms to mitigate users’ exposure to sexual, violent and self-harm content.

The second proposal is for the Infocomm Media Development Authority (IMDA) to be able to direct social media services to disable access to “specified content” for Singapore users, or disallow specific online accounts on social media services to interact with or communicate content to Singapore users.


This may cover content relating to sexual harms, self-harm, public health, public security, and racial or religious disharmony or intolerance.

Industry consultations for the proposals started this month, and there will be public consultations on the proposals in July.

MCI said on Monday that the prevalence of online harms both globally and in Singapore is a major concern despite many online services working to address this issue.


“Such content that could propagate harm includes content that endorses acts of terrorism, extreme violence or hateful acts against certain communities, encourages suicide or self-harm, or undermines one’s physical or mental well-being through harassment, bullying, or the non-consensual sharing of sexual images.

“These online harms are exacerbated when they are amplified on social media services,” said MCI.

For instance, platform algorithms based on user interest can propel content such as dangerous video challenges, which can go viral rapidly and lead to injuries and deaths. Acts of terrorism and their aftermath can also be spread through videos captured via live-streaming and the re-sharing of content, it added.

MCI also pointed out that religiously or racially offensive content can incite religious intolerance and prejudice racial harmony.

For example, it said, last year a Singaporean man impersonated a Chinese woman and made multiple racially offensive and insensitive public posts on a social media service, denigrating minority communities in Singapore. The matter has since been reported to the authorities. In 2020, a person behind the profile “NUS Atheist Society” published a religiously offensive post that depicted the Bible and Quran as alternatives to be used in the event of a toilet paper shortage.

Abusive online behaviour such as harassment and sexual violence has also been prevalent, added MCI.

In 2021, a poll asking people to rank local female asatizah (religious teachers) by their sexual attractiveness was posted on social media.

“The post caused immense distress to the individuals involved and was found to have promoted sexual violence,” said MCI.

A survey conducted in January by the Sunlight Alliance for Action, a cross-sector alliance that tackles online dangers, found that 61 per cent of Singaporeans had experienced online harms on popular social media services.

“MCI takes a collaborative approach towards governing the online space against harms,” it said.

“We recognise that the industry has taken active steps in recent years towards combating harmful online content on social media, and their contributions will be critical in shaping a safer and more responsible online space for users in Singapore.” — TODAY