On 4 March 2019, Minister Richard Bruton TD announced that he would introduce an Online Safety Act to regulate harmful content online and to ensure children are safe online. The Act will also implement the revised Audiovisual Media Services (AVMS) Directive, which Member States are required to transpose by 19 September 2020. The Minister stated that the era of self-regulation with regard to online safety is over. It is proposed that an Online Safety Commissioner would oversee the new system. The Department of Communications, Climate Action and Environment is seeking views on the proposed legislation and has launched a six-week consultation period, open until 15 April 2019.

The proposed Online Safety Act will provide for:

  • New online safety laws applicable to Irish residents; and
  • Regulation of Video Sharing Platform Services (e.g. YouTube), on-demand audiovisual services (e.g. RTÉ Player, Netflix) and traditional TV.

The Minister acknowledges that putting in place a new national regulatory structure for dealing with the removal of harmful online content is a complex task, for a number of reasons: the existing legislative and regulatory measures already in place in relation to specific content (for example, in the areas of data protection, terrorist content, child sexual abuse material and other criminal content); the role of An Garda Síochána in relation to illegal content such as child sexual abuse material and terrorist-related content; and the eCommerce Directive, which provides that online services are not liable for hosting illegal content of which they are not aware.

New Online Safety Laws applicable to Irish residents

The Minister intends to give the Online Safety Commissioner the powers necessary to ensure that the most harmful content can be removed quickly from online platforms. The views of stakeholders are sought on what content should be considered “harmful content” and on which online platforms should be in scope. The Minister asks whether the following examples of “harmful content” are sufficient and appropriate:

  • Serious cyber-bullying of a child;
  • Material promoting self-harm or suicide; and
  • Material designed to encourage prolonged nutritional deprivation.

Online platforms are already required, once notified, to remove content whose dissemination is a criminal offence under Irish and EU law, such as material containing incitement to violence or hatred, content containing public provocation to commit a terrorist offence, child sexual abuse material, or racist and xenophobic content.

Powers of Online Safety Commissioner

The Government proposes assigning the following powers to the Online Safety Commissioner:

  • Certify that the approach of services to operating an Online Safety Code is “fit for purpose”.
  • Require regular reports from services (e.g. on content moderation, review and adjudication of appeals).
  • Assess whether the measures which a service has in place are sufficient in practice (e.g. by conducting audits).
  • Issue interim and final notices to services in relation to compliance failures, and seek Court injunctions to enforce such notices.
  • Impose administrative fines in relation to compliance failures.
  • Publish the fact that a service has failed to comply or cooperate with the regulator.
  • Require content takedown within a set timeframe, where a user is dissatisfied with the response they have received from a service provider.
  • Seek to have criminal proceedings brought against the service provider.

Regulation of Video Sharing Platform Services (VSPS)

The revised AVMS Directive requires significant changes to the way Ireland regulates audiovisual content, both online and offline. A VSPS will be regulated in the country where it is established, without the need for further regulation in other EU Member States in which it offers its services. The revised Directive does not prescribe which content is impermissible; rather, it establishes principles (protection of minors from potentially harmful content; protection of the general public from incitement to violence or hatred; and criminal content) and requires a national regulator to be appointed to ensure that services have appropriate measures in place to meet those principles (such as parental controls and age verification). It also requires a VSPS to have a complaints mechanism in place through which a user can make a complaint regarding content hosted on the service.

The Minister proposes that the rules applicable to VSPS will also apply to platforms in respect of other user-generated content for Irish residents (e.g. photos, comments, or other material which is not audiovisual in nature). The consultation seeks views on what type of regulatory relationship should exist between a VSPS established in Ireland and the regulator; how the Irish regulator should determine whether the measures put in place by a VSPS are sufficient to meet the relevant principles; and on what basis the regulator should monitor and review the measures which a VSPS has in place.

Regulation of On-Demand Audiovisual Services and Traditional TV

The revised AVMS Directive requires a number of changes to the regulation of on-demand audiovisual services and traditional TV, including closely aligning the rules and requirements that apply to each. The consultation seeks views on what type of regulatory relationship should exist between an on-demand audiovisual media service established in Ireland and the relevant Irish regulator, and whether the same content rules should apply to both traditional TV and on-demand services.

Regulatory Structure

The Minister seeks views on the most appropriate regulatory structure. At present the Broadcasting Authority of Ireland (BAI) is the National Regulatory Authority under the AVMS Directive for traditional TV, including the publicly funded broadcasters (RTÉ and TG4), and commercial television broadcasters such as Virgin Media Television. The BAI also regulates the traditional radio broadcasters who are established in Ireland.

The Minister proposes two ways in which an Online Safety Commissioner may be established:

  1. Restructuring and reforming the BAI as a Media Commission, along the lines of the multi-Commissioner Competition and Consumer Protection Commission. The Online Safety Commissioner could operate within that structure.
  2. Two regulatory bodies: the first would involve restructuring the BAI and assigning it responsibility for content which is subject to editorial control (traditional television and radio broadcasting and on-demand services); the second would be a new online safety regulator responsible for online content that is not subject to editorial controls (such as social media and video sharing platforms).

Next Steps

Once the public consultation has concluded, the Minister will review the submissions and finalise a proposal. He will then bring draft heads of a bill to Government for approval and publication.