What do audiovisual service providers need to consider following the UK's implementation of key aspects of the revised Audiovisual Media Services Directive (AVMSD)?

The key takeaway

New rules apply to UK on-demand programme services (ODPSs) and video-sharing platforms (VSPs) that require concrete steps to be taken to proactively protect children and the general public from harmful online content.

The background

The European Commission has released guidelines on the interpretation of certain aspects of the AVMSD, which provide an interesting insight into how the Commission evaluates the scope and application of the AVMS Directive. One of the Directive's core purposes is to regulate illegal and harmful online content, and it extends these rules to cover certain social media platforms where the provision of programmes and user-generated videos constitutes an “essential functionality” of those services. The guidelines provide a list of relevant indicators that can be used to assess the essential character of a platform's audiovisual functionality.

On 28 November 2018, the revised Directive (2018/1808/EU) amending the AVMS Directive (2010/13/EU) was published to better reflect the current media landscape and to create a more level playing field between traditional television and on-demand and video-sharing services. The Audiovisual Media Services Regulations 2020 (SI 2020/1062) (the Regulations) implement the revised Directive and make the necessary amendments to the Broadcasting Acts 1990 and 1996 and the Communications Act 2003. See our Autumn 2020 Snapshots for previous discussion of the guidance issued in relation to implementation.

The development

The main changes to UK law introduced through the Regulations are to:

  1. Align the rules on protection from harm for ODPSs with those for linear TV

Content standards and advertising rules for on-demand programme services are amended to bring them into line with those for broadcast television. Services are required to protect minors from harmful content using measures proportionate to the potential harm, such as selecting the time of the broadcast, age verification tools or other technical measures.

  2. Introduce a quota of a 30% share for European works

The Regulations reinforce obligations to promote European films and TV shows on on-demand services by introducing a 30% quota for European works. European works are works originating from certain European countries, or from qualifying co-productions involving those states. Exemptions apply where the service has a low turnover or a low audience, or where it would be impracticable or unjustified for the requirements to apply because of the nature or theme of the service.

  3. Introduce new rules for video-sharing platforms

The Regulations extend EU standards on illegal and harmful content to VSPs and require providers to take “appropriate measures” to achieve specified protection purposes. These purposes are:

  • to protect minors from content and advertising that might impair their physical, mental or moral development;
  • to protect the general public from content and advertising that incites violence or hatred towards people with certain protected characteristics; and
  • to protect the general public from content and advertising that is a criminal offence under EU law to circulate (ie terrorist content, content containing child sexual exploitation and abuse, and racist/xenophobic content).

“Appropriate measures”, for these purposes, include having in place and applying certain terms and conditions of service for users; establishing and operating flagging and reporting mechanisms, age verification systems, content rating systems and easy-to-access complaints procedures; and providing parental control systems. The Regulations introduce greater controls for content which is under the direct control of service providers, specifically commercial communications (ie advertising) that are marketed, sold or arranged by service providers.

Why is it important?

The House of Lords has previously made statements on responsibility for online content and the accountability of platforms for the spread of harms. The Regulations take a step towards regulation that requires organisations to demonstrate accountability for content on their platforms. The Regulations give Ofcom enforcement powers, including the power to issue enforcement notices, to impose a financial penalty of up to 5% of “applicable qualifying revenue” or £250,000 (whichever is greater), and to suspend or restrict a service. Ofcom can also charge a fee and demand relevant information from service providers, for example to determine whether it has jurisdiction over a particular service, whether a service provider has notified it as required, or to set the appropriate fee.

Any practical tips?

Organisations within scope are encouraged to engage with Ofcom to clarify any uncertainties during implementation, both to inform future guidance and to help shape best practice. VSPs in particular should ensure they do the following:

  • determine whether their online services fall within the scope of the Regulations, which may involve consulting Ofcom. The deadline for confirming to Ofcom whether the Regulations apply is 6 May 2021; and
  • assess whether their existing compliance frameworks are sufficient to meet the requirements of the Regulations, and identify any technical and organisational measures that may be required to ensure compliance.

Note that many of the rules introduced by the Regulations will be superseded by the proposed Online Harms Bill, which will address a wider range of harmful content across a broader range of media and platforms. The implementation date of the Bill is still to be confirmed, which may well mean that Ofcom applies the Regulations more intensively in the meantime, to fill what it has identified as a regulatory gap.