In response to growing public and regulatory concern about illegal, abusive and misleading information that is easily published on user-generated content platforms, the European Commission recently published new guidance on Tackling Illegal Content Online - Towards an enhanced responsibility of online platforms. The document sets out guidelines and principles for online platforms on their role and responsibilities in dealing with illegal content online.

Although the guidance neither changes the legal framework nor is binding, it aims to guide online platforms, in cooperation with national authorities, EU Member States and other relevant stakeholders, on how to implement good practices for preventing, detecting, removing and disabling access to illegal content, increasing transparency, and protecting fundamental rights. The main elements of the guidance are as follows:

  • Online platforms should have the necessary resources to understand the legal frameworks in which they operate, and should cooperate closely with law enforcement and other competent authorities where appropriate, notably by ensuring that they can be rapidly and effectively contacted with requests to remove illegal content;
  • Online platforms should cooperate closely with ‘trusted flaggers’ (i.e., specialized entities with expertise and dedicated structures for detecting and identifying such content online). This cooperation should provide for mutual information exchange, thereby expediting the removal process over time;
  • Online platforms should deploy easily accessible and user-friendly reporting mechanisms that enable users to notify them of content they host which is considered to be illegal. Notices submitted through these mechanisms should be sufficiently precise and adequately substantiated to allow platforms to take a swift and informed decision on follow-up. Users should not be required to identify themselves in notices, unless this information is needed to determine the legality of the content;
  • Online platforms should utilize and develop automatic detection and filtering technologies and adopt effective proactive measures to detect and remove illegal content online as quickly as possible, rather than simply reacting to the notices they receive. Swift removal is particularly important, and can be made subject to specific timeframes, where serious harm is at stake (e.g., where content incites the commission of a terrorist act). If, in the context of removing illegal content, platforms find evidence of criminal activity, they should report it to the law enforcement authorities;
  • Online platforms' terms of service should include clearly explained removal policies. In addition, platforms should publish periodic transparency reports providing detailed information on the number and types of notices they have received;
  • Online platforms should restore removed content without undue delay, or allow the user to re-upload it, without prejudice to the platform's terms of service, when a counter-notice provides reasonable grounds to consider that the notified information or activity is not illegal;
  • Online platforms should put in place measures to dissuade users from repeatedly uploading illegal content. The guidance also encourages platforms to use and develop automated technologies to prevent illegal content from reappearing.