In Glawischnig-Piesczek, C-18/18, the CJEU ruled that online intermediaries (such as social media platforms) can be ordered to remove from their platforms, worldwide, all content deemed identical or equivalent to content held to be illegal.
In late 2017, the Austrian politician Ms Eva Glawischnig-Piesczek brought a claim against Facebook Ireland Ltd (Facebook) for an order that Facebook remove a public comment that a user had posted on the platform and which, Ms Glawischnig-Piesczek claimed, defamed her. The Vienna Commercial Court and, on appeal, the Vienna Higher Regional Court agreed, and Facebook was ordered to remove the post. However, the exact scope of the order was disputed. The case was appealed to the Austrian Supreme Court, which then made a reference to the CJEU.
The key questions for the CJEU to consider were: in addition to removing the original post itself, could host providers be ordered to remove other identical content from their platforms? Could the order go one step further and require host providers to remove equivalent content too and, if so, what exactly was meant by the term ‘equivalent’? Finally, should the order only apply in the country in which proceedings were brought or could it have wider geographical reach?
The relevant law
At issue was the interpretation of the E-Commerce Directive (2000/31/EC) (the Directive). Under the Directive:
- a host provider (such as Facebook) is not liable for information shared on its sites if it has no knowledge of its illegal nature, or if it acts expeditiously to remove or disable access to that information as soon as it becomes aware of it (Article 14(1)). However, this does not prevent a host provider from being ordered to remove or block access to illegal information (Article 14(3)); and
- host providers have no general obligation under the Directive to monitor their platforms for illegal content (Article 15(1)); however, there may be exceptions to this in "specific case[s]" (Recital 47).
The main question for the CJEU, then, was how far an online intermediary's removal obligations can extend before they breach the Directive's prohibition on general monitoring obligations.
In a decision handed down on 3 October 2019, the CJEU ruled that the Directive does not prevent a court from ordering a host provider to:
- remove or block access to content that is identical to the original unlawful content;
- remove or block access to content that is equivalent to the original unlawful content. Equivalent means content which is “essentially unchanged” from the original unlawful content such that the host provider does not have to carry out an independent assessment of that content (thereby enabling it to use automated search tools/technologies), and where the content contains the specific elements specified in the order; and
- remove or block access to the content referred to in the first two points above worldwide, within the framework of relevant international law.
The CJEU's ruling is significant in further clarifying the obligations of online intermediaries under the Directive in respect of the content they host. The ruling is another example of the general movement by the courts and regulators to define the obligations of online platforms and the extent to which they are liable for such content. In the UK, the decision follows other attempts to draw these lines such as the government's Online Harms White Paper which was published earlier this year (see RPC's commentary on this here).
The decision has attracted some criticism because, on the face of it, it gives European courts the power to apply takedown requests internationally, including in countries where the original content may not itself be illegal. However, the CJEU did caveat this part of its decision by saying that such worldwide orders are permissible "within the framework of international law", although just what international laws the CJEU was referring to is unclear. Further, from a practical perspective, there may well be difficulties in enforcing these kinds of orders outside the EU and EEA, where the same mutual recognition systems for court orders may not apply.
Going forwards, the courts will need to set out clearly and carefully in the order what content they consider to be 'equivalent' to the original illegal content. Quite where the line is drawn in relation to a host provider's obligations to monitor and search its platform for specific content is likely to be the subject of further litigation.