Last week, a plenary session of the European Parliament voted to adopt changes to the Directive on Copyright in the Digital Single Market (or the Copyright Directive for short), which was first introduced by the European Parliament Committee on Legal Affairs on 20 June 2018.

The Copyright Directive contains a number of sections that have been extremely polarising – critics argue that the Copyright Directive will encourage a form of censorship while supporters argue that the Copyright Directive will protect the works of artists and creators. One of the most contentious parts of the Copyright Directive is Article 13.

Article 13 arguably makes “online content sharing service providers” – platforms that store and give access to materials uploaded by users – liable for copyright infringement committed by those users. Examples may include Google, YouTube and Facebook. The practical effect of Article 13 is unknown, but critics have speculated that platforms will respond by introducing an “upload filter”: intermediaries would scan every piece of content uploaded by users and check it against a database of protected material. If the content is found to infringe copyright, it will be removed from the platform.
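To make the “upload filter” idea concrete, here is a minimal and purely hypothetical sketch of fingerprint-based matching – far cruder than any real content-recognition system, with invented names throughout. It illustrates the mechanism critics describe, and why such a filter sees only byte patterns, not the legal context of a use:

```python
import hashlib

# Toy "content ID" filter: a hypothetical sketch, not any platform's real system.
# It fingerprints uploads by hashing fixed-size chunks and matching them against
# a database of fingerprints registered by rights holders.

CHUNK_SIZE = 16  # bytes per fingerprinted chunk (tiny, for illustration only)

def fingerprints(data: bytes) -> set:
    """Hash each fixed-size chunk of the content."""
    return {
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    }

def is_blocked(upload: bytes, protected_db: set, threshold: float = 0.5) -> bool:
    """Block the upload if enough of its chunks match protected material.

    The filter sees only byte patterns: it cannot tell whether a match is a
    licensed use, a quotation, a parody, or an infringement.
    """
    fps = fingerprints(upload)
    if not fps:
        return False
    overlap = len(fps & protected_db) / len(fps)
    return overlap >= threshold

# A rights holder registers a "protected" work.
protected_work = b"All the world's a stage, and all the men and women merely players"
db = fingerprints(protected_work)

# A verbatim re-upload matches heavily and is blocked...
print(is_blocked(protected_work, db))  # True

# ...but so is a quotation inside a piece of criticism, even though quoting
# for review may be perfectly lawful -- the filter cannot tell the difference.
review = b"All the world's a stage, and all the men and women merely players -- a line I discuss below"
print(is_blocked(review, db))  # True
```

The “user-rights blind” problem discussed below falls directly out of this design: the matching step has no input representing licences, exceptions or limitations, so any such judgment has to happen after removal, through whatever redress mechanism exists.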

The Australian content industry has praised the reforms. For example, the Australian Independent Records Association has told trade press website The Industry Observer that “the moves bring balance to the industry”, stating:

“It’s quite simple: Article 13 rebalances the licensing framework in Europe and creates a fair and equitable environment for creators who rely on the European economy to sustain their careers and businesses moving forwards,” said Maria Amato, CEO at AIR. “We applaud the work of our colleagues at IMPALA for their sustained efforts on this very important issue.”

The Australian reports that French President Emmanuel Macron said the vote was a “great advance for Europe”, and also reports that the Federation of European Film Directors, the Federation of Screenwriters in Europe and the Society of Audiovisual Authors also welcomed Wednesday’s vote.

So why be concerned? 13 reasons!

User-rights blind: Regardless of whether you sit on the “fair use” or “fair dealing” side of copyright law, an automated system is going to struggle to make – if not be incapable of making – subjective decisions about the nature of use. This could affect educational use, media commentary and criticism, political commentary, and other uses. In this way, the proposal has been described as “user-rights blind”, in that it may disregard a person’s legitimate means of using copyright material in favour of resolving the dispute between rights holders and large technology companies.

Remixes, mash-ups, sampling: A fraught area of controversy in music law is the issue of sampling and remixes – particularly in transformative works. How will an automated system determine whether a remix, mash-up or sample has been legitimately deployed in a sound recording? For recording artists like The Avalanches or Girl Talk, who rely predominantly on an incredible array of sampled sound recordings and musical works, will a system be able to differentiate between what is licensed, what is not a substantial part, what is a fair use or fair dealing, and what is not?

User-generated content: Won’t somebody think of the memes? Australian copyright law, like European Union copyright law, protects the use of copyright material for parody and satire. Again, it seems unlikely that an automated system will be able to differentiate between unlawful use of copyright material and satirical or parodic use.

Unintended consequences for aggregators: Many service providers store and aggregate user-generated content. As the software code repository GitHub has argued:

False positives (and negatives) are especially likely for software code because code often has many contributors and layers, often with different licensing for different components. Requiring code-hosting platforms to scan and automatically remove content could drastically impact software developers when their dependencies are removed due to false positives.

Disproportionate technology burden: The requirement to implement a system capable of analysing user-uploaded content disproportionately affects different service providers. Small and medium-sized enterprises, including start-ups, are unlikely to be able to develop or implement technology at the scale required to proactively analyse the data they hold. As such, they are likely to require a third party to perform this function for them.

Surveillance: As Julia Reda argues, given the disproportionate technology burden described above:

The proposal requires the installation of what amounts to surveillance technology. Due to high development costs, content monitoring technology will likely end up being outsourced to a few large US-based providers, strengthening their market position even further and giving them direct access to the behavior of all EU users of internet platforms.

Even with the GDPR amendments, many would be uncomfortable with the aggregated analysis of user-uploaded content in this manner.

Independent creators: Julia Reda also argues that “platforms will receive instructions as to what content to automatically remove” from the larger commercial rights-holders. This may disproportionately affect not only smaller platforms but also smaller independent creators, who will have fewer resources to effectively challenge automatic removal of their content – particularly where they have relied on a fair dealing or fair use exception, hold a licence or authorisation not recognised by the filter, or simply do not infringe.

Deferral to “best practices”: To address some of these concerns, particularly around smaller operators, the amended draft proposes that stakeholders draft best practices:

When defining best practices, special account shall be taken of fundamental rights, the use of exceptions and limitations as well as ensuring that the burden on SMEs remain appropriate and that automated blocking of content is avoided.

If the Australian experience is anything to judge by, asking rights holders and intermediaries to negotiate these kinds of industry wide agreements tends to be, well, pained. Whatever did happen to the Copyright Notice Scheme?

Transparency: There are no obligations on online platforms to inform users how a particular filtering system works or what rights users have. The complexity of copyright law generally means that users may not have the information necessary to defend themselves when an automated filter or content-recognition system identifies their content as (potentially) unlawful.

Court-facilitated dispute resolution: Paragraph 2(b) of the amended proposal requires Member States to implement a judicial mechanism for review.

Unfortunately, as both lawyers and their clients know – litigation can be expensive and time-consuming, and a review of individual decisions may be beyond reach for many.

Licence to kill: Apart from the pressure that could be placed on content sharing providers to err on the side of caution, and implement content recognition technologies at the point of upload, the Directive may also pressure some providers to seek a broad licence from rights holders to offset risk.

Compatibility with current EU law: Whistle-blower website Statewatch published a leaked EU Council document in which a number of European countries strongly question the Directive’s compatibility with European fundamental rights. Further, requiring platforms to monitor content could contradict the intermediary liability protections (or safe harbours) in European law.

Freedom of expression: Human rights lawyers have also expressed overarching concerns about a filter-system’s potential to impinge on freedom of expression. For example, the United Nations Special Rapporteur, David Kaye, sent a lengthy letter to the European Commission outlining his concerns about Article 13, arguing:

I am very seriously concerned that the proposed Directive would establish a regime of active monitoring and prior censorship of user-generated content that is inconsistent with Article 19(3) of the ICCPR.

This has followed previous warnings by the UN Special Rapporteur that proposals to address digital piracy through website blocking and content filtering may result in restrictions that are not compatible with the right to freedom of expression and the right to science and culture.

…and the rest!

Advocacy group Communia has also expressed other concerns with the Copyright Directive as a whole:

  • Under Article 11, press publishers will obtain a new right allowing them to control how end-users access and reference press publications.
  • Under Article 3, rights-holders may have the right to prevent anyone other than scientific researchers from using computers to analyse information contained in otherwise legally accessible works.
  • Under the new Article 12(a), sports events organizers would become copyright holders allowing them to prohibit anyone from sharing photos or other recordings of sports events.
  • Finally, under the new Article 13(b), image search engines would need to obtain licences for even the smallest preview images that they display as search results.

Communia’s interpretation would be contested – but the arguments are worth considering.

What’s next?

The Copyright Directive and the agreed amendments will now enter the stage of “trilogue negotiations” with the EU Council and Commission.

The result of these private discussions will determine the final text the European Parliament will have to vote on early next year. In most cases, the result of the trilogue negotiations is confirmed; when it is not, the entire process of changes, negotiations and a vote starts over.

If the Copyright Directive is eventually adopted by the EU Parliament, the individual Member States will have to implement it into local law, which (naturally) is another hurdle that has to be passed.

The final view? I’m undecided – watch this space. If you watch anything else, just make sure it’s a legitimate streaming platform!

The amended Article 13

Article 13 – title

Text proposed by the Commission

Use of protected content by information society service providers storing and giving access to large amounts of works and other subject-matter uploaded by their users

Amendment

Use of protected content by online content sharing service providers storing and giving access to large amounts of works and other subject-matter uploaded by their users

Article 13 – paragraph 1

Text proposed by the Commission

Information society service providers that store and provide to the public access to large amounts of works or other subject-matter uploaded by their users shall, in cooperation with rightholders, take measures to ensure the functioning of agreements concluded with rightholders for the use of their works or other subject-matter or to prevent the availability on their services of works or other subject-matter identified by rightholders through the cooperation with the service providers. Those measures, such as the use of effective content recognition technologies, shall be appropriate and proportionate. The service providers shall provide rightholders with adequate information on the functioning and the deployment of the measures, as well as, when relevant, adequate reporting on the recognition and use of the works and other subject-matter.

Amendment

Without prejudice to Article 3(1) and (2) of Directive 2001/29/EC, online content sharing service providers perform an act of communication to the public. They shall therefore conclude fair and appropriate licensing agreements with right holders.

Article 13 – paragraph 2

Text proposed by the Commission

Member States shall ensure that the service providers referred to in paragraph 1 put in place complaints and redress mechanisms that are available to users in case of disputes over the application of the measures referred to in paragraph 1.

Amendment

Licensing agreements which are concluded by online content sharing service providers with right holders for the acts of communication referred to in paragraph 1, shall cover the liability for works uploaded by the users of such online content sharing services in line with the terms and conditions set out in the licensing agreement, provided that such users do not act for commercial purposes.

Article 13 – paragraph 2(a)

Text proposed by the Commission

(New paragraph)

Amendment

Member States shall provide that where right holders do not wish to conclude licensing agreements, online content sharing service providers and right holders shall cooperate in good faith in order to ensure that unauthorised protected works or other subject matter are not available on their services. Cooperation between online content service providers and right holders shall not lead to preventing the availability of non-infringing works or other protected subject matter, including those covered by an exception or limitation to copyright.

Article 13 – paragraph 2(b)

Text proposed by the Commission

(New paragraph)

Amendment

Member States shall ensure that online content sharing service providers referred to in paragraph 1 put in place effective and expeditious complaints and redress mechanisms that are available to users in case the cooperation referred to in paragraph 2a leads to unjustified removals of their content. Any complaint filed under such mechanism shall be processed without undue delay and be subject to human review. Right holders shall reasonably justify their decisions to avoid arbitrary dismissal of complaints. Moreover, in accordance with Directive 95/46/EC, Directive 2002/58/EC and the General Data Protection Regulation, the cooperation should not lead to any identification of individual users nor the processing of their personal data. Member States shall also ensure that users have access to an independent body for the resolution of disputes as well as to a court or other relevant judicial authority to assert the use of an exception or limitation to copyright rules.

Article 13 – paragraph 3

Text proposed by the Commission

Member States shall facilitate, where appropriate, the cooperation between the information society service providers and rightholders through stakeholder dialogues to define best practices, such as appropriate and proportionate content recognition technologies, taking into account, among others, the nature of the services, the availability of the technologies and their effectiveness in light of technological developments.

Amendment

As of [date of entry into force of this directive], the Commission and the Member States shall organise dialogues between stakeholders to harmonise and to define best practices and issue guidance to ensure the functioning of licensing agreements and on cooperation between online content sharing service providers and right holders for the use of their works or other subject matter within the meaning of this Directive. When defining best practices, special account shall be taken of fundamental rights, the use of exceptions and limitations as well as ensuring that the burden on SMEs remain appropriate and that automated blocking of content is avoided.