With the rampant use of social media and online platforms as advertising and marketing tools, we are no strangers to the methods websites and online platforms deploy to grab a consumer’s attention. We have all experienced those moments when, while casually scrolling through Instagram, we come across an advertisement for exactly what we were looking for, or even just thinking about!
Little do we know that what is being advertised to us in the name of a ‘personalized experience’ is in fact its notorious cousin, ‘dark patterns’. These tailored experiences, which on the face of it seem innocuous and thoughtful, are in fact manipulative schemes employed by website owners and online platforms that lead consumers into giving up their data, time and money, whether through misleading advertisements or by trapping them in a vicious cycle of continued subscriptions and memberships.
WHAT ARE DARK PATTERNS?
The term ‘dark patterns’, first coined in 2010 by London-based UX designer Harry Brignull, has now acquired commonplace status in the digital marketing and advertising industry. However, a majority of consumers and internet shoppers remain unaware of the misleading and exploitative UI/UX interfaces deployed in digital advertising and marketing, not-so-fondly referred to as dark patterns.
Harry Brignull defined Dark Patterns as “a user interface that has been carefully crafted to trick users into doing things that are not in their interest and usually at their expense”.
Simply put, a dark pattern is a user interface crafted to trick or manipulate users into making choices that are detrimental to their interests. Dark patterns manifest in various forms, such as drip pricing, trick questions, nagging, disguised ads, bait and switch, and the roach motel.
One of the more serious ramifications of the disguised use of dark patterns by websites is tricking customers into consenting to being tracked, or to having their data used in ways they neither expect nor desire. These dark patterns are often presented in a manner that leads the consumer to believe they decide the extent of data shared, or actively control the privacy settings, when in fact the interface is intentionally designed to steer consumers toward the option that gives away the most personal information.
THE GLOBAL SCENE
In the United States of America, the Federal Trade Commission (FTC) is charged with protecting consumer interests as well as educating consumers on the ever-evolving risks associated with digital media and online marketing and advertising.
In a recent report, the FTC noted how companies are increasingly using sophisticated design practices known as “dark patterns” to trick or manipulate consumers into buying products or services or giving up their privacy.
The report, Bringing Dark Patterns to Light, found dark patterns used in a variety of industries and contexts, including e-commerce, cookie consent banners, children’s apps, and subscription sales. The report also lists the most common dark pattern tactics used by these industries:
- Misleading Consumers and Disguising Ads
- Making it difficult to cancel subscriptions or charges
- Burying key terms and junk fees
- Tricking consumers into sharing data
The report also highlighted the FTC’s efforts to combat the use of dark patterns in the marketplace and reiterated the agency’s commitment to taking action against tactics designed to trick and trap consumers.
Similarly, the European Data Protection Board (EDPB) came out with Guidelines on Dark patterns in social media platform interfaces: How to recognize and avoid them. These Guidelines provide best practice recommendations to designers and social media platform providers on how they can assess and avoid dark patterns in social media interfaces that violate the requirements of the General Data Protection Regulation (GDPR).
The EDPB has recognized six major categories of dark patterns:
Overloading: Users are provided with too much information to push them to provide more personal data than necessary.
Skipping: Deceptive designs that distract users from worrying about the protection of their personal data. Here, the most invasive features and options are already enabled by default.
Stirring: Wordings or visuals that are presented in a way that influences users’ emotional state to lead them to act against their data protection interests. This dark pattern has a higher impact on children and other vulnerable categories of data subjects. For example, users are more likely to overlook or have difficulty reading small font sizes or text written in colors that do not contrast sufficiently.
Hindering: Providing misleading information to users to either push them to provide unnecessary personal data or influence their decision by holding them up and questioning their initial choices.
Fickle: Unclear designs that make it hard for the user to navigate the different data protection control tools or understand the purpose of the processing.
Left in the dark: Interfaces that hide information or data protection tools, or leave users unsure of how their data is processed.
THE INDIAN CONTEXT
To address the growing concerns surrounding dark patterns, as well as to educate and sensitize consumers, the Advertising Standards Council of India (ASCI) has set up a twelve-member task force comprising different stakeholders to examine the dark pattern practices prevalent in India. Based on the task force’s deliberations and discussions, ASCI is looking to extend the ASCI Code on misleading ads to cover dark patterns such as:
Drip pricing: Ambiguous or incomplete depiction of price, in particular excluding non-optional price components such as taxes, duties etc., which are only revealed at the very end of the buying process, thereby creating ambiguity around the final price as well as preventing easy price comparisons. Quoted prices must include non-optional taxes, duties, fees and charges that apply to all or most buyers.
Bait and Switch: When an ad directly or indirectly implies one outcome based on the consumer's action but instead serves an alternative outcome, it would be considered misleading.
False urgency: Stating or implying that quantities of a particular product are more limited than they actually are.
Disguised advertisement: An advertisement that is of a similar format as editorial or organic content must clearly disclose that it is an ad. Examples could be influencer posts, paid reviews, and ads placed in a manner to appear like content.
Checkbox treachery: Obfuscatory checkboxes in the form of opt-in or opt-out checkboxes that businesses use to give customers notional control over how their contact data is used.
Sneak-in basket: When consumers purchase something, additional products are added into the basket of the consumer, without their knowledge.
Privacy: Interfaces that trick users into sharing more information than they intended to. Users may give up this information unknowingly or through practices that obscure or delay the option to opt out of sharing their private information.
Nagging: Repeatedly asking users for the same thing, often with no option to make it stop, in the hope of eventually wearing users down and getting them to agree to share data or accept unfair terms.
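The effect of drip pricing, described above, is easy to see in plain arithmetic: a quoted price that omits non-optional components understates what the buyer actually pays at checkout. The following minimal sketch uses hypothetical figures and helper names purely for illustration:

```python
# Illustrative sketch of drip pricing: the advertised price omits
# non-optional components that are only revealed at checkout.
# All figures, fee names and the function name are hypothetical.

def final_price(advertised: float, non_optional_fees: dict) -> float:
    """Final checkout price = advertised price + all non-optional fees."""
    return advertised + sum(non_optional_fees.values())

# A ticket advertised at 1000, with fees disclosed only at the last step.
fees = {"convenience fee": 150.0, "tax": 180.0, "service charge": 70.0}
checkout = final_price(1000.0, fees)

print(checkout)  # 1400.0 -- the buyer pays 40% more than the quoted price
```

Under guidelines like ASCI’s proposed extension, the quoted price would have to be the 1400 figure from the start, since each of these fees applies to all buyers.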
ASCI has also invited the public and other stakeholders to provide their inputs before the extension of the code to such practices is implemented.
In 2021, ASCI called for more transparency in social media campaigns, urging influencers to clearly and unequivocally disclose promotional content so as to enable consumers to make informed decisions. Disguised advertising is also an identified dark pattern that ultimately affects consumers’ decision-making. ASCI recently noted that approximately 29% of advertisements processed in the year 2021-22 were disguised advertisements by influencers. While cryptocurrency and personal care topped the list, disguised advertisements were also prevalent in categories such as fashion, e-commerce, food & beverage and finance.
The covert nature of dark patterns makes their identification and consequent resolution challenging. However, as dark patterns pervade the online space and directly affect the consumer experience, careful monitoring of such practices is crucial.
From the foregoing discussion, it is evident that one of the more challenging aspects of identifying dark patterns is distinguishing persuasive advertising from manipulative tactics. Identifying and eliminating instances of dark patterns requires cooperation among all stakeholders, including brand owners and website designers. Brand owners and advertising platforms must work together to keep consumers’ interests at the forefront, so that advertising is carried out transparently and ethically rather than through manipulation.