
Online Safety in Singapore – The Recent Online Safety (Miscellaneous Amendments) Bill


Singapore, like many countries around the world, is moving to regulate online content to enhance the online safety of users. The UK has its Online Safety Bill, Ireland its Online Safety and Media Regulation Bill, and the EU its Digital Services Act.

Singapore’s Online Safety (Miscellaneous Amendments) Bill was passed in Parliament on 9 November 2022 and came into force on 1 February 2023. The Bill primarily amends the Broadcasting Act 1994 by introducing a new Part 10A; a minor clarificatory amendment to the Electronic Transactions Act 2010 was also made. There is no stand-alone Online Safety Act. The Bill was tabled after consultations with stakeholders and the public.

In the second reading of the Bill in Parliament, the Minister for Communications and Information highlighted that most online platforms did not fall within the remit of the then-existing provisions of the Broadcasting Act. As such, the Bill was intended to regulate social media platforms, given the high volume of harmful content on such platforms.

The new Part 10A in the Broadcasting Act empowers the Infocomm Media Development Authority (IMDA) to regulate online communication services (whether provided from within Singapore or outside) that are accessible by Singapore end-users. The measures that IMDA can take are: (a) to issue codes of practice for providers of regulated online communication services; and (b) to issue blocking directions to online communication service providers and to internet access service providers to deal with egregious content. A new Fourth Schedule to the Broadcasting Act lists the online communication services that fall within Part 10A. At present, the Fourth Schedule is limited to “social media service”.

Codes of Practice

Along with the press release by the Ministry of Communications and Information on 31 January 2023, IMDA also released its draft Code of Practice for Online Safety for further consultation. The Code is expected to be implemented in the second half of 2023. Online communication services with significant reach or impact can be designated by IMDA as regulated online communication services (ROCS), and ROCS providers will be required to comply with the Code.

The draft Code requires ROCS providers to put in place systems and processes to mitigate the risks to Singapore users (in particular children of different age groups) from exposure to harmful content, and to account to their users for such measures. Harmful content is a much wider category than the egregious content that can give rise to blocking directions by IMDA: it covers content that is sexual or violent (as opposed to the narrower sexually violent content under the egregious content definition), and also includes cyberbullying content and content facilitating vice and organised crime.

The draft Code has sections on: (a) User Safety;  (b) User Reporting; and  (c) Accountability. Its key provisions are:

(a) The ROCS provider must implement community guidelines, standards, and content moderation measures to minimise users’ exposure to harmful content.

(b) Users must have access to tools to help them manage their own safety and exposure to harmful content.

(c) Users must have easy access to information related to online safety, including Singapore-based safety information.

(d) The ROCS provider must have technologies and processes in place to pro-actively detect and remove child sexual exploitation and abuse material and terrorism content.

(e) The ROCS provider must have targeted measures to minimise children’s exposure to inappropriate content, including child-appropriate community guidelines, standards, and content moderation measures.

(f) Children must not be sent targeted content that is detrimental to their physical or mental well-being.

(g) Children and their parents/guardians must have access to tools to enable them to manage the children’s safety and minimise their exposure to harmful or inappropriate content. The tools must limit not just what content the child can see, but also who else can see the child’s information or interact with the child. Unless access by children is restricted, children must be provided with their own accounts with default settings that are robust and appropriately more restrictive for the age of the child.

(h) Users must be able to report concerning content or unwanted interactions. The mechanism must be easy to use and transparent.

(i) Such user reports must be assessed and appropriate action taken in a timely and diligent manner, depending on the severity of harm. Action taken can include taking down the content and warning or banning the account that posted the content.

(j) Where the report is not frivolous or vexatious, the reporting user must be informed of the decision and action taken. If action is taken against the user who posted the content, that user must also be informed of the decision and action taken. These must take place without undue delay. The users have the right to ask for a review of the decision and action taken.

(k) The ROCS provider must submit to IMDA annual reports on the measures put in place to combat harmful and inappropriate content. The report should include: (i) the amount and types of harmful or inappropriate content encountered; (ii) the steps taken to mitigate Singapore users’ exposure to harmful or inappropriate content; and (iii) the action taken on user reports. The report will be published on IMDA’s website.

The draft Code is accompanied by Guidelines that provide non-exhaustive examples of harmful content for all users and inappropriate content for children.

Failure to comply with the codes of practice without a justifiable reason can result in a financial penalty not exceeding S$1 million or directions to remedy the failure.

Blocking Directions

If IMDA finds egregious content on an online communication service, directions can be issued to the online communication service provider and to internet access service providers to disable access to such content by Singapore end-users. Egregious content includes content advocating or instructing on self-harm or suicide, physical or sexual violence, terrorism, or child sexual exploitation; content posing a public health risk in Singapore; and content likely to cause racial or religious disharmony in Singapore.

There are 3 types of directions that IMDA can issue:

(a) A direction to an online communication service provider to disable access by Singapore end-users to the egregious content. For example, blocking a post on a social media site from being viewed by a Singapore user through a browser or mobile device.

(b) A direction to an online communication service provider to stop the delivery or communication of egregious content to Singapore end-users. For example, blocking an instant message containing egregious content, or a link to egregious content, from being sent to Singapore users.

(c) A direction to an internet access service provider to block access by Singapore end-users to an online communication service if the provider of that online communication service fails to comply with an IMDA direction. This can mean that the entire service is blocked and not just the post or message with the egregious content.

Failure by an online communication service provider to comply with a blocking direction can result in a fine not exceeding S$1 million and a further fine of not more than S$100,000 per day for a continuing offence. Failure by an internet access service provider to comply can result in a fine not exceeding S$20,000 per day, up to a maximum of S$500,000.