On 1 March 2018 the European Commission (‘EC’) published its Recommendation on measures to effectively tackle illegal content online. It contains a set of operational measures to be taken by hosting service providers and Member States before the EC determines whether to propose legislation. The Recommendation follows up on the Communication of 28 September 2017 on tackling illegal content online, in which the EC had announced that it would monitor progress in tackling illegal content and assess whether additional measures were needed to ensure the swift and proactive detection and removal of illegal content online.

Scope

The Recommendation covers all forms of illegal content, ranging from incitement to hatred and violence and child sexual abuse material to counterfeit products and copyright infringement. For terrorist content it sets stricter requirements. The Recommendation targets hosting service providers such as YouTube, Facebook and Instagram.

Notice and action

Platforms are encouraged to install clear notification systems for illegal content in order to guarantee the efficacy of ‘notice and action’ procedures. The notification systems should be user-friendly and easy to access, and should include fast-track procedures for ‘trusted flaggers’. For terrorist content there is a narrow time frame for removal: the EC considers such content to be most harmful in the first hours after upload, so it should be removed within one hour of being flagged.

Proactive measures

The Recommendation endorses the use of automated means “only where appropriate and proportionate”. Proactive measures are meant to be used in particular for terrorist content and for content that can be identified as illegal without regard to context, such as child sexual abuse material or counterfeit goods. Otherwise, substantial human judgement will often be required.

Safeguards

Automated means carry the risk of unintentionally removing legal content. Under the Recommendation, platforms must therefore balance removal against the protection of legal content and, by extension, fundamental rights such as freedom of expression, as well as data protection rules. To ensure that decisions to remove content are accurate and well-founded, the Recommendation provides that platforms should put safeguards in place, including human oversight and verification. Furthermore, content providers should be informed about the removal of their content and should have the opportunity to contest it.

Cooperation with Member States

The Recommendation states that hosting providers should immediately inform law enforcement authorities if there is evidence of a serious criminal offence or a suspicion that illegal content poses a threat to life or safety. This is intended to facilitate prosecution. Member States should establish the corresponding legal obligations as well as fast-track procedures for notices submitted by competent authorities. With regard to terrorist content, competent authorities (including Europol) and platforms are expected to cooperate closely and, where appropriate, conclude working arrangements.

Outlook

The EC intends to monitor the actions taken in response to the Recommendation and to determine whether additional steps, including legislation, are necessary. In the meantime, the EC is expected to continue its analysis of the matter and to launch a public consultation in the coming weeks. This will be an opportunity for platforms and other stakeholders to contribute their positions to the process. Furthermore, to enable this monitoring, Member States and companies will be required to submit relevant information within six months for illegal content and within three months for terrorist content.