Unfortunately, it is all too common in the online world for someone to try to manipulate you into making decisions you otherwise wouldn’t. One example is the use of “dark patterns,” which can be found in almost every online environment, especially on social media. In an attempt to address this issue, the European Data Protection Board (EDPB) has issued new guidelines on dark patterns in the context of the GDPR1 (Guidelines). Below, we have prepared a summary of the key points.

What exactly are dark patterns?

The Guidelines define dark patterns as interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data. In other words, dark patterns are a kind of user experience technique that employs psychology to manipulate users into making decisions unfavorable to them where their personal data is concerned. Since dark patterns can represent GDPR infringements, they should be avoided when designing user experiences.

Types of dark patterns

Dark patterns can take various forms. The Guidelines describe the following types:

1. Overloading occurs when social media users are confronted with a large number of requests or choices, or are bombarded with information. The purpose of this practice is to induce users to share more data or to unintentionally allow personal data to be processed against the data subject’s expectations. This is done in the following ways:

  1. Continuous prompting – Repeatedly asking users to provide data or give their consent can wear them down to the point where they give in and agree to provide access to data that isn’t necessary for the purpose of the processing, so that they may use the platform in an uninterrupted manner. Example: A social media platform asks a user for their phone number every time they log into their account. The request appears even though the user declined to provide a phone number during registration.
  2. Privacy maze – This type of overloading can be seen when information, such as a privacy notice, is not easily accessible, and users are forced to navigate through multiple pages of the website to find it.
  3. Too many options – The user of the social media platform should be able to easily check the privacy settings. For this purpose, the platform provider should compile such information in one easily visible place. Scattering data protection choices across multiple tabs or menus can overwhelm the user. With too many options to choose from, the user is unable to make any choice at all or, in the cluster of information, will overlook some settings.

2. Skipping can happen when the interface has been designed in such a way that users are nudged into forgetting or overlooking all or selected aspects of data protection. Two activities may be recognized as skipping:

  1. Deceptive snugness – Privacy by design should be reflected in the default data privacy settings. If the social media provider defaults to the most intrusive privacy settings possible, we are dealing with the Deceptive Snugness dark pattern. Example: When registering an account and providing a birthdate, users are asked to indicate the group of recipients with whom they will share this information. As the default setting, “all platform users” is indicated.
  2. Look over there – This type of skipping involves the introduction of competing or non-relevant elements into a document concerned with data protection issues. This can affect how users perceive the information and may distract them so that they ignore or skip the privacy options in favor of the competing element. Example: When providing information about cookies, operators of online platforms often benefit from the ambiguity of the word cookies, such as when asking for consent in a humorous way and applying the culinary meaning.
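For interface designers, the deceptive snugness example above comes down to a default-settings choice. The sketch below contrasts the dark pattern with a privacy-by-design default; all type and field names are hypothetical, chosen only to illustrate the birthdate example from the Guidelines.

```typescript
// Hypothetical visibility levels for a profile field such as a birthdate.
type Visibility = "only_me" | "friends" | "all_platform_users";

interface FieldPrivacySetting {
  field: string;
  visibility: Visibility;
}

// Deceptive snugness: the most intrusive, most data-sharing option
// is preselected, counting on users to keep the default.
const darkPatternDefault: FieldPrivacySetting = {
  field: "birthdate",
  visibility: "all_platform_users",
};

// Privacy by design (GDPR Art. 25): the least intrusive option is the
// default, and the user must actively opt in to wider sharing.
const privacyByDesignDefault: FieldPrivacySetting = {
  field: "birthdate",
  visibility: "only_me",
};
```

The point is not the data structure itself but where the initial value comes from: under privacy by design, wider sharing requires an affirmative user action rather than user inaction.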

3. Emotional steering uses vocabulary, grammar, and visual elements to convey information in a way that evokes a specific emotion in users in order to manipulate their behavior. Example: A social media platform writes user prompts so as to encourage the sharing of more personal data than is required: “Tell us about your amazing self! We can’t wait, so come on right now and let us know!”

4. Hidden in plain sight uses visual elements to trick the user into choosing less restrictive privacy settings. This can be done through the use of small fonts or a font color that blends into the background. Structuring the message in this way may cause the user to overlook information that is important from a data protection point of view.

5. Hindering is a tactic whereby the user is delayed or prevented from familiarizing themselves with the relevant privacy information. Two types of activities fall under this dark pattern:

  1. Dead end – Users generally set the level of personal data protection when they first create their account on a social media platform. If, during this process, the user cannot access information about the processing of personal data, it is a dead end.
  2. Longer than necessary – This type of hindering doesn’t always conflict with provisions of the GDPR. Consequently, the use of this dark pattern should be considered case by case. What matters is that users are not pushed into activating options that involve more invasive data sharing. Example: It is not possible to directly opt out from targeted advertisement processing, even though consent (opt in) requires only one click.

6. Fickle is based on an unreadable, unclear interface design. Consequently, it is difficult for the user to understand the purposes of the data processing. Lacking hierarchy and decontextualizing are examples of fickle:

  1. Lacking hierarchy – This type of dark pattern occurs when the same data protection information appears several times but is presented differently. It serves to confuse users, who are unable to understand how their data is being processed. Example: Information on the data subject’s rights is available in several places in the Privacy Policy. Most rights are explained in the “Your rights” section; however, the right to lodge a complaint with the supervisory authority, as well as the contact details of the authority, are provided in various other sections that relate to other elements of privacy protection.
  2. Decontextualizing – Important privacy information is located on a page removed from its context, so that users have difficulty finding it.

7. Left in the dark is a method of communication with the user designed to conceal information or data protection controls or leave users unsure about how their data is processed. Examples of these practices are:

  1. Linguistic discontinuity – The service is offered to users in their language, while information on privacy protection is not provided in the official language(s) of the country where the users live.
  2. Conflicting information – As a result of receiving conflicting information, users are unable to determine the consequences of their actions and become confused. In the end users keep the default privacy settings.
  3. Ambiguous wording or information – The terminology used is deliberately vague and ambiguous so that users are unsure how their data is processed or how to control it.

Dark patterns and the GDPR

It should be emphasized that dark patterns may, but do not have to, violate provisions of the GDPR. The EDPB Guidelines are only recommendations. Their release should be seen as a positive step aimed at strengthening the protection of natural persons. They provide important guidance for data controllers, allowing them to assess their practices from a GDPR compliance point of view. For data subjects, the Guidelines can serve to raise awareness about the doubtful and often controversial actions of data controllers that may be used against them.

Moreover, as stressed by the EDPB, dark patterns may not only constitute unlawful interference in the sphere of privacy of social media users, they can also violate consumer protection regulations. That is why it is so critical for businesses operating in the e-commerce sector to analyze their activities in order to determine whether any could be assessed as deceptively or manipulatively influencing data subjects toward making more privacy-invasive decisions. By staying vigilant in identifying dark patterns, and taking the necessary corrective measures, businesses can avoid costly court and administrative proceedings, not only before Data Protection Authorities but also before Consumer Protection Authorities.