We originally published this article in July 2022, and have refreshed it to include new information below.

There is increasing public pressure on internet companies to intervene through content moderation, particularly to tackle disinformation, harmful speech, copyright infringement, sexual abuse, automation and bias, and terrorism and violent extremism. The new Online Safety Act is the British response to this public demand.

The Online Safety Act received Royal Assent on 26 October 2023, giving Ofcom powers as online safety regulator in the UK. Online platforms around the world will get their first detailed view of the requirements for complying with the Online Safety Act on 9 November 2023, when Ofcom says it will publish its first draft codes of practice and enforcement guidance for consultation. Ofcom has published a timeline with a comprehensive implementation schedule extending over three years.

Dame Melanie Dawes, Ofcom’s Chief Executive, said: “These new laws give Ofcom the power to start making a real difference in creating a safer life online for children and adults in the UK. We’ve already trained and hired expert teams with experience across the online sector, and today we’re setting out a clear timeline for holding tech firms to account. Ofcom is not a censor, and our new powers are not about taking content down. Our job is to tackle the root causes of harm. We will set new standards online, making sure sites and apps are safer by design. Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression. We know a safer life online cannot be achieved overnight; but Ofcom is ready to meet the scale and urgency of the challenge.”

While there are undeniably legitimate policy objectives for state intervention in online content moderation, some have argued that there is a risk of an adverse effect on competition. For example, government regulation could unwittingly benefit large incumbents by raising the industry's cost of doing business, as these companies are better positioned to bear the costs. Not so, however, for the UK's Competition and Markets Authority (CMA) and Ofcom.

The CMA and Ofcom issued a joint statement some time ago, which explores the potential synergies and obstacles arising from the increasing interactions between competition and online safety.

Underlying this statement is a firm belief that, if regulators adopt a joined-up approach, UK consumers will greatly benefit from better overall online safety and better-quality services. By making it easier for consumers and advertisers to switch platforms, the CMA and Ofcom say, competition interventions can strengthen platforms' incentives to improve their online safety procedures. However, the joint statement also acknowledges that interventions to enhance online safety could make it more difficult for new firms to compete with established services, showing how the agendas for online safety and competition may sometimes conflict. Moreover, platforms which act as gateways for businesses to access customers can take on a 'quasi-regulatory' role and impose safety standards on other businesses which restrict competition. Therefore, clear and transparent online safety standards, and collaboration among regulators under the auspices of the Digital Regulation Cooperation Forum (or DRCF, which also includes the Financial Conduct Authority and the Information Commissioner's Office, in addition to the CMA and Ofcom), can ensure that firms meet safety requirements without unnecessarily distorting competition.

This thought-piece looks back at the CMA/Ofcom joint statement and illustrates how effective compliance with the newly adopted Online Safety Act will require a joined-up approach with UK competition law.

The interactions between competition and online safety in digital markets

  • Online services have created significant benefits for users by making it easier to access content and people online, while digital technologies have decreased the costs of disseminating this content.
  • But the features of online services can give rise to competition and safety harms. Platforms sell advertising to monetise users’ attention and data, and have innovated in how they place advertising.
  • Some platforms now have significant market power, which may have a detrimental effect on consumers and can contribute to online safety risks, such as through harmful content being spread rapidly or through algorithmic content selection increasing polarisation.
  • Mitigating these safety and competition harms requires distinct policy tools.
  • But these features of online services can link safety and competition concerns – for example, user interfaces designed to raise the cost of switching can negatively impact competition and make it difficult for users to leave.
  • This creates scope for regulators to share knowledge. The CMA and Ofcom, alongside other DRCF members, have published a discussion paper on algorithmic harms and how to audit them.
  • They are also sharing knowledge on how consumer engagement is affected by choice architecture in relation to online services.

Competition and online safety: three main scenarios

  • This section of the joint statement explores the scope for interactions between competition and online safety under these broad categories: policy synergies, policy tensions and unnecessary constraints.
  • Policy synergies: in a truly competitive market, consumers should have more autonomy in deciding which platform would best safeguard them online. As a result, to remain in business, providers would be incentivised, if not forced, to meet consumers' demands.
  • Informing users in relation to the risk of coming across harmful content on various platforms could help drive competition to offer greater safety, if it empowers users to leave in search of better protection.
  • By creating more choice, the hope is that consumers will engage most with those platforms that best safeguard their interests.
  • The CMA’s online platforms and digital advertising market study found that a lack of transparency in the market was undermining competition in the sector – advertisers want control over their brand image by knowing the types of content their advertising appears next to.
  • Policy tensions: with respect to policy tensions, the main obstacle for regulators is the possibility that strict online safety regulations raise barriers to entry, leading to a calcified market of only large players.
  • As online safety and competition regimes have distinct policy aims, sometimes interventions in support of one objective may adversely impact outcomes for the other. The CMA and Ofcom consider it vital to mitigate such impacts where possible.
  • The CMA and Ofcom have determined that, where Big Tech platforms possess significant bargaining power, conduct requirements could ensure terms imposed on publishers are reasonable.
  • This could be in relation to remuneration received for content, or publishers’ access to data regarding how users view their content.
  • Unnecessary constraints: with respect to unnecessary constraints, issues mainly arise with gateway platforms that privately set their security, privacy and safety standards.
  • Gateway platforms require other firms to comply with certain standards in order to interact with their platforms; this needs to be monitored, as it can affect wider market participants and competition.
  • The CMA and Ofcom need to consider alternative safety solutions if platforms adopt safety precautions which unnecessarily restrict competition – once Ofcom takes on its new position as the online safety regulator this will require further engagement.
  • The latest report of the CMA’s mobile ecosystems market study provides many examples of Apple using online safety to justify restrictions that lessen user choice or competition.
  • One example is Apple’s restriction of cloud gaming apps on its App Store. Apple stated that it was not able to apply its parental controls to cloud-streamed games and that each game required an App Store product page to display privacy details. However, the CMA found that this led to a sub-optimal user experience, and that some parental controls could be applied to a cloud gaming app as a whole, alleviating these safety concerns.

Future Directions

  • By uniting their concurrent powers as competition and consumer enforcement authorities in the communications sector, the two regulators will be better attuned to technological and commercial developments in the industry, thereby also encouraging greater dialogue between the industry and the regulators. Ultimately, a coordinated approach will help the CMA and Ofcom to obtain the best results for UK consumers.
  • The Online Safety Act requires Ofcom to consider the impact of proposed online safety interventions on firms of different sizes and capacities, in particular small businesses.
  • As the CMA discharges its duties as a cross-sectoral competition and consumer authority, it may need to consider the impact of its interventions on consumer safety.
  • The CMA will continue to work closely with other regulators through the DRCF, and with the FCA and ICO on online fraud.
  • More generally, the CMA and Ofcom expect that the upcoming online safety regime will require deeper collaboration and coordination between them.
  • They will share knowledge about technological and commercial developments and their implications for regulation – digital markets areas where the CMA and Ofcom share a common interest include algorithms, ad tech, cloud and the metaverse.
  • The CMA and Ofcom will work together to identify opportunities and challenges for online safety and competition, specifically in relation to synergies, tensions and unnecessary constraints, as identified above.