The Commission yesterday (28 September 2017) released non-binding guidelines that push online platforms to take measures to increase the proactive prevention, detection and removal of illegal content, including in particular content that incites hatred, violence or terrorism.

Although the Commission claims that the Communication clarifies platforms’ liability when they take “Good Samaritan” proactive steps to identify and remove illegal content, the Communication adopts more of a “stick” than “carrot” approach. The Commission:

  • expects online platforms to take swift action;
  • will monitor their progress; and
  • will assess whether further measures are needed – essentially suggesting that if platforms do not adopt adequate voluntary measures the Commission may regulate.

Whilst the press release focuses on extreme content, the Communication is far broader in scope, covering all forms of online platform and all forms of illegal online content, including defamation, harassment, unlawful data processing, and intellectual property infringements.

Expected voluntary measures

The Commission sets out a raft of measures that it expects all online platforms to implement voluntarily by May 2018. They include:

  • Detection and notification – Platforms should:
    • cooperate more closely with competent national authorities;
    • appoint points of contact so that they can be reached rapidly with requests to remove illegal content;
    • provide mechanisms allowing users to flag illegal content, and invest in automatic detection technologies;
    • work with “trusted flaggers”, which are specialised entities with expert knowledge on what constitutes illegal content; and
    • use and develop automated tools to assist with removal of illegal content.
  • Effective removal – In order to promote effective removal mechanisms platforms should:
    • remove the illegal content as quickly as possible (in cases such as incitement to terrorist acts, there should be specific timeframes);
    • clearly explain their content policy and issue transparency reports to users detailing the notices received;
    • introduce safeguards to prevent the risk of over-removal of content; and
    • take measures to dissuade users from repeatedly uploading illegal content.

How the measures sit alongside the “hosting defence” in the E-Commerce Directive

The Communication is very clear in its view that proactive algorithmic measures can be put in place without online intermediaries losing their liability exemption under the E-Commerce Directive. It considers that the current regime in Articles 12-15 of the E-Commerce Directive already provides sufficient protection in this regard. It summarises as follows:

“The Commission is of the view that proactive measures taken by those online platforms which fall under Article 14 of the E-Commerce Directive to detect and remove illegal content which they host – including the use of automatic tools and tools meant to ensure that previously removed content is not re-uploaded – do not in and of themselves lead to a loss of the liability exemption.”

“In particular, the taking of such measures need not imply that the online platform concerned plays an active role which would no longer allow it to benefit from that exemption. Whenever the taking of such measures leads to the online platform obtaining actual knowledge or awareness of illegal activities or illegal information, it needs to act expeditiously to remove or disable access to the illegal information in question to satisfy the condition for the continued availability of that exemption.”

In other words, the Commission has simply clarified the current commonly held position under the E-Commerce Directive, whilst seeking to side-step arguments that it is calling for the kind of general monitoring that is prohibited under Article 15 of the Directive.

Despite such “clarification”, by implementing the new measures, platforms will essentially accelerate the “knowledge” element of the hosting defence and therefore expose themselves to potential liability if they do not act quickly enough to remove the illegal content. For all the Commission’s talk of “facilitating” good practices, the Communication makes no mention of any incentives for platforms to carry out these voluntary measures and increase their risk in this way.

Next steps

The Commission has stated that this Communication is a first step, and that follow-up initiatives will depend on the online platforms’ implementation of the guidelines by May 2018. Many in the online community will see this as a missed opportunity to incentivise voluntary measures rather than simply dangle the threat of regulation. Others see it as a failure by the EU to act decisively and are pushing for greater regulation.

Against this backdrop, platforms must carefully consider their next steps – a failure to act effectively may well result in binding legislation. In particular they must:

  • take careful note of the new measures, which are likely to increase their monitoring responsibilities;
  • ensure that their policies and procedures for dealing with infringing content are tighter than ever; and
  • make sure that the Commission is aware of the self-regulatory actions that many platforms are already undertaking to address some of the concerns in a more pragmatic manner.