The notion of so-called “dark patterns” has recently garnered increasing attention, with their use in online sectors being scrutinised by regulators around the world. But what are they? What risks does their use pose to consumers? And what legal implications should businesses that use them be aware of? In this article, we find out. We also provide an update on the UK consumer law reforms following the Autumn Statement last week.

Online choice architecture and dark patterns – what are they?

Dark patterns are a subset of a broader term, “online choice architecture” (OCA). Even if OCA is not something you have heard of, it is something you will at least have experienced personally. Essentially, OCA refers to the designs, systems, and procedures that a website or software developer or operator implements to influence its end users’ decision making. OCA is a neutral term – it is not necessarily a bad thing and can in fact be helpful to a consumer. There are, however, certain forms of OCA that are viewed by regulators as deceptive, confusing or misleading and that potentially cause consumer harm. These, we call “dark patterns”.

The Competition and Markets Authority (CMA) – the body responsible for policing competition and consumer law compliance in the UK – published a report in April of this year on the use of OCA (‘Online Choice Architecture: how digital design can harm competition and consumers’). In its report, the CMA set out a taxonomy of OCA practices, which provides a helpful guide to the types of OCA and their (often quite colourful) categorisation.

Based on the CMA’s taxonomy, OCA practices (and in turn dark patterns) can be divided into three categories:

  1. Firstly, “choice structure”, which is the way in which choices are structured to dictate what a consumer sees and how difficult or time consuming it is to make a choice. Potentially harmful examples of this include “sludge” (the creation of excessive or unjustified friction that makes it difficult for consumers to get what they want or to do as they wish, such as making it difficult to cancel a subscription) and “dark nudges” (making it easy or frictionless for consumers to take an action, but with the risk of inadvertent or ill-considered decisions, such as one-click purchases).

  2. Secondly, “choice information”, which is how information is presented and framed to highlight certain things or make it harder to understand. Potentially harmful examples include “drip pricing” (where, during the customer journey, initially only part of a price for a product or service is shown, with the full price only revealed later) and “information overload” (whereby the consumer receives so much information about a product or service that information about the most relevant attributes is difficult to find and access).

  3. Finally, “choice pressure”, which involves putting a consumer under some form of pressure (such as time pressure or the display of particular messaging) in order to influence a choice he or she may make. For example, “10 other users are looking at this product”, or “only 5 tickets left at this price”. Where true, such messaging can be helpful to a consumer, but clearly not where it is misleading.

Increasing regulatory scrutiny at home and abroad

OCA and dark patterns have increasingly started to receive legislative and regulatory scrutiny, both on the Continent and in the UK.

The European Union

In the EU, an express ban on dark patterns has recently been introduced by way of the Digital Services Act (DSA), which was adopted in October 2022. The DSA applies to all digital services that connect consumers to goods, services, or content. It imposes new obligations on online platforms to reduce online harms and risks, introduces additional protections for consumer rights, and places platforms under a new transparency and accountability framework.

Article 25 of the DSA provides as follows in respect of dark patterns:

Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.

The prohibition, however, only covers practices that are not already (directly or indirectly) covered by the Unfair Commercial Practices Directive or the GDPR. The examples of practices caught by the DSA, given (rather broadly) in its recitals, include “exploitative design choices to direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory, or other components, when asking the recipient of the service for a decision”.

Given how new the law is, however, there are no precedents to go by in terms of what specific types of practice will fall foul of the Article 25 prohibition. We do, however, expect greater regulatory scrutiny and investigations in this area now that the law is in force. Examples where guidance is expected include: (a) giving more prominence to certain choices over others when the consumer needs to make a decision; (b) repeatedly requesting that the consumer make a choice that has already been made, particularly where pop-ups are used; and (c) making it more difficult to terminate a service than to subscribe to it.

This all follows the EU Commission’s publication in May of this year of a 300-page report, the ‘Behavioural study on unfair commercial practices in the digital environment: dark patterns and manipulative personalisation’, which is driving action in the European Union. Unsurprisingly, the study stretches some of the common understanding of so-called dark patterns and made a number of recommendations, including: (1) adjustments to current legislation, including prohibiting certain practices expressly and introducing a positive fair/neutral design obligation; (2) guidance for businesses; and (3) stronger enforcement and use of class actions/collective redress (consistent with the general direction across the EU and in the UK). On the back of this, and in addition to the DSA, an EU consultation on dark patterns more generally is anticipated shortly. It is also worth noting that the European Data Protection Board published guidelines earlier this year, which offer practical recommendations on how to avoid dark patterns in social media interfaces that infringe GDPR requirements (specifically, in respect of ensuring fair processing, transparency, and data minimisation).

The UK

Whilst there are no UK laws that specifically reference dark patterns, this is not to say that such activities can be used freely without concern in the UK. Depending on how a particular activity is undertaken, a range of laws and rules could be breached indirectly.

The CMA’s focus to date has been on the impact that dark patterns may have on both competition and consumer protection laws. In respect of the latter, the focus has been on unfair commercial practices, which are prohibited by the Consumer Protection from Unfair Trading Regulations 2008 (CPUT Regulations). There is also scope for these practices to put a business in breach of advertising rules and data privacy laws, in the form of the UK Code of Non-Broadcast Advertising (the CAP Code) and the Data Protection Act 2018 and UK General Data Protection Regulation, respectively. Indeed, the Advertising Standards Authority also issued a guidance note earlier this year on dark patterns and which techniques could fall within the remit of advertising regulation.

On 30 November 2022, the CMA announced that it had launched an investigation into the Emma Group in relation to its use of online urgency claims, such as countdown clocks. In making the announcement, the CMA said: "Today’s announcement marks the start of a new programme of consumer enforcement work focused on so-called ‘Online Choice Architecture’ and aimed at tackling potentially harmful online selling practices, including pressure selling tactics such as urgent time limited claims." It also threatened: "…the CMA will use its full range of powers to ensure that misleading selling practices are tackled from all angles." It is therefore very likely that we will see more CMA investigations into these practices in the near future.

Update on UK consumer law reforms

Earlier this year, the UK Government tabled proposals to vastly enhance the CMA’s powers, enabling it to directly enforce eye-wateringly high fines of up to 10% of global annual turnover for consumer law breaches.

In the Autumn Statement last week, it was announced that the Government would bring forward the Digital Markets, Competition and Consumer Bill in the third parliamentary session (likely early next year). The announcement confirmed that the Bill would include new powers for the CMA to promote competition and tackle anti-competitive practices in digital markets (including updated fine thresholds), and would include specific provisions designed to tackle so-called “subscription traps” and fake reviews. Whilst it remains to be seen whether the long-promised sanctions for consumer law breaches will be included, the direction of travel in respect of consumer law in the UK (as in the EU) is clear.

Final comments

The idea of “compliance by design” – planning a business’s processes and use of technology with consumer law compliance at its heart, in much the same way as “privacy by design” – is gaining traction. With dark patterns an area of increasing regulatory interest, and of interest to claimant law firms looking to drum up their next class action, consumer-facing businesses would be wise to start considering their use of such practices, and whether changes should be made now, before regulatory intervention and wider claims ramp up in earnest.