Introduction
GDPR
e-Commerce Directive
e-Privacy Directive

Copyright in the Digital Single Market Directive
Terrorist Content Online Regulation
Member state legislation against hate speech
Proposed Digital Services Act package


Introduction

Social media is a powerful tool. It is immediately available to everyone and allows users to interact in unprecedented ways.

As with any other powerful tool, social media can be used malevolently. Any content can be made available on social media, including highly controversial or illegal content such as fake news and discriminatory or offensive material. This is why social media so often features at the centre of political and geopolitical events.

Currently, in the European Union, there is no regulation that applies specifically to social media platforms. Instead, the regulatory framework is fragmented across multiple instruments.

GDPR

One such instrument is the EU General Data Protection Regulation (GDPR),(1) which has applied since May 2018. The GDPR introduced various obligations, in particular by granting additional rights to data subjects, and it is often said to have set the standard for data protection regulation across the globe. All aspects of the GDPR apply to social media platforms, as data controllers, given the large volumes of personal data that they process.

e-Commerce Directive

The EU e-Commerce Directive(2) sets out rules for online services, covering matters such as commercial communications and online contracting. An important aspect of the e-Commerce Directive is the principle of limited liability for intermediary service providers.

Under certain conditions, intermediaries are exempt from liability for the content posted on their platforms. Service providers hosting illegal content must remove it or disable access to it as soon as possible after becoming aware of it. Liability only comes into play if a service provider fails to take down unlawful content after being notified of it.

e-Privacy Directive

The EU e-Privacy Directive,(3) which has been amended several times since its enactment, regulates the confidentiality of electronic communications and direct marketing communications. However, in light of the GDPR, and considering the major evolution of communication technology, the e-Privacy Directive has become outdated.

In 2017, the European Commission proposed a text for a new e-Privacy Regulation. However, the text is still making its way through the legislative process and has not yet entered into force.

Copyright in the Digital Single Market Directive

More recently, the EU Directive on Copyright in the Digital Single Market(4) introduced obligations for online content-sharing platforms to prevent the availability of infringing content, including, for example, the use of technology to detect and block the upload of infringing material.

Terrorist Content Online Regulation

In April 2021, the EU Terrorist Content Online Regulation(5) was adopted to tackle the spread of unlawful content promoting or facilitating terrorist activity. The regulation provides a legal framework to ensure that hosting service providers that make content available to the public address the potential misuse of their services for the dissemination of terrorist content online.

Online platforms must take down content within one hour of receiving a removal order from a competent EU member state authority. They must also take appropriate proactive measures when they are exposed to terrorist content, proportionate to the level of exposure, the size of the platform, its capabilities and its resources. Financial penalties for platforms can reach 4% of a platform's turnover, a level similar to the fines available under the GDPR.

Member state legislation against hate speech

With respect to content that is related to (non-terrorist) hate speech on social media, there is currently no overarching EU level regulation. Certain member states have, however, adopted their own regulations against hate speech.

In particular, in Germany and Austria, the regulations aim to simplify deletion procedures and include a complaints procedure to avoid situations of over-blocking; clearly, it is important to strike a balance that upholds freedom of expression. In France, a similar law was struck down by the French Constitutional Council precisely because it was considered to have a disproportionate impact on freedom of speech, in particular because content could be taken down by private platforms without any intervention by the judicial authorities.

Proposed Digital Services Act package

Because of this regulatory patchwork and the lack of harmonisation at the EU level, the European Commission has developed the Digital Services Act package, which comprises the Digital Services Act (DSA) and the Digital Markets Act (DMA). The DSA will create additional obligations for digital service providers, including specific obligations for "very large platforms", defined as platforms with more than 45 million average monthly active users in the European Union.

Content liability
With respect to content liability, the DSA will essentially reproduce the liability safe harbour currently provided for in the e-Commerce Directive: a hosting service that obtains actual knowledge of illegal activity or content must act expeditiously to remove it or disable access to it, failing which it can be held liable.

Reporting obligations
Very large platforms will also have specific reporting obligations to enforcement authorities in certain cases – for example, when people's safety is at stake.

Accountability
Like the GDPR, the DSA will introduce elements of accountability and financial fines. Such fines can reach 6% of the provider's annual turnover, an even higher ceiling than under the GDPR. This shows that the DSA is not just an invitation for platforms to include mechanisms against hate speech; it creates binding obligations with significant fines at stake.

Notification systems
Online platforms must put mechanisms in place to enable any individual or entity to notify them of the presence of illegal content. Such mechanisms must be easy to access and user-friendly. The DSA also introduces the notion of "trusted flaggers" for very large platforms, which will have to treat notifications from trusted flaggers as a priority and put in place an internal complaint-handling system.

Information and transparency obligations
The DSA also increases information and transparency obligations, including, for example, the obligation for very large platforms to provide transparency over the main parameters of the algorithms used to recommend content. This responds to the concern over freedom of speech mentioned above.

Next steps
A provisional political agreement on the DSA and the DMA was reached between the Council of the European Union and the European Parliament on 24 March 2022. The final version of the DSA is not expected until 2023, and it may well be delayed beyond that. Many modifications may still need to be made, so it remains to be seen when the DSA will enter into force.

For further information on this topic please contact Nasser Ali Khasawneh, Vincent Denoyelle, Christine Khoury or Edouard Burlet at Eversheds Sutherland by telephone (+44 20 7919 4500) or email ([email protected], [email protected], [email protected] or [email protected]). The Eversheds Sutherland website can be accessed at www.eversheds-sutherland.com.

Endnotes

(1) Regulation (EU) 2016/679.

(2) Directive 2000/31/EC.

(3) Directive 2002/58/EC.

(4) Directive (EU) 2019/790.

(5) Regulation (EU) 2021/784.