
Identifying Anticompetitive Harms from Deceptive Design in India

by Isha Suri


This article explores and maps deceptive design practices in different types of digital platforms and the anticompetitive harms arising from these identified practices.




“Profit is sweet, even if it comes from deception.”

--Sophocles


Introduction

Digital markets have compelled us to reimagine the way humans navigate their lives. While ostensibly they enable efficient economic organisation by reducing information asymmetries, they also pose complex challenges for consumers and enforcement authorities alike. The use of deceptive design is one such challenge. According to a report released by the Federal Trade Commission, incorporating ‘deceptive design’ or ‘dark patterns’ to manipulate users into certain choices may have adverse repercussions (FTC, 2022). Examples include tricking users into signing up for recurring bills or buying insurance along with a particular transaction (The Verge). For context, a study carried out by the Consumer Policy Research Centre (CPRC) in Australia found that nearly 83% of Australians have lost money, lost control of their data, or been manipulated by a business as a result of a website or app using design features aimed at influencing their behaviour (CPRC, 2022).


A neologism coined nearly a decade ago by user design specialist Harry Brignull, the term “dark pattern” refers to design practices that trick or manipulate users into making choices they were originally unlikely to make and that have the potential to cause harm. Brignull also set up darkpatterns.org, a crowdsourced platform intended to identify and compile a set of deceptive design practices (Deceptive Design). Since then, numerous deceptive design practices have been identified by researchers operating in the space.


The table below provides a brief overview of the identified deceptive design practices:



Dark Patterns and Anticompetitive Harms

Deceptive design practices or dark patterns can cause harm in numerous areas including, but not limited to, consumer protection, privacy, and competition (Web Foundation, 2022). The scope of this article is limited to the anticompetitive harms that may emanate from the use of dark patterns. Research suggests that deceptive design practices can weaken or distort competition by shifting the incentive from competing on the price and quality of products to less beneficial attributes, such as the price displayed upfront, or by pressurising consumers to buy products through false claims of scarcity (CMA, 2022). Furthermore, dark patterns can be particularly harmful where an entity has significant market power or is a monopoly, since the monopolist in question can employ dark patterns to maintain, leverage, or exploit its market position (CMA, 2022). This piece seeks to map the interlinkages between dark patterns and antitrust through a review of existing empirical evidence, scholarly literature, and regulatory interventions.


Traditionally, the use of ‘dark patterns’ has not been viewed through the lens of antitrust, presumably because ‘consumer welfare’ has been measured largely through effects on ‘consumer prices’ (Khan, 2017). Thus, apparently low prices and ostensibly free services enabled tech firms to evade antitrust scrutiny until recently. Furthermore, it is received wisdom that firms compete by advertising, designing products, and employing tactics meant to persuade consumers (Day, Stemler 2020). However, investigations in various jurisdictions have demonstrated that dominant digital platforms employ exclusionary means such as self-preferencing to impair competition (Day, Stemler 2020). This is facilitated by the fact that platforms not only serve as gatekeepers of critical infrastructure but are also integrated across markets, successfully leveraging their dominance in one market to monopolise an ancillary market. For instance, there is growing evidence to suggest that Amazon copies products sold on its platform and then buries the listings of the copied rival products (Day, Stemler 2020). Consequently, its rivals’ listings are demoted (Dudley, 2020), and it can restrict competitors’ access to more prominent areas of the website. Evidence suggests that the more popular a firm, the more likely it is to employ at least one dark pattern. For example, the Deceptive Design website has compiled nearly 400 examples of deceptive design practices employed by websites, and Google, Facebook, Amazon, and LinkedIn (owned by Microsoft) are the most commonly complained-about companies (Deceptive Design).


In addition to violating an individual’s right to informational privacy through the unwanted collection and use of private information, dark patterns also harm ‘decisional privacy’ by intruding on internal decision-making. Dark patterns and other forms of online manipulation are effective because they make an individual’s reaction resemble an exercise of free will when, in reality, the user has been manipulated or tricked by the interface design into signing up for a service that benefits the business owner (such as an in-app credit system being preselected) (Day, Stemler 2020).


There is also research to suggest that while services can employ some dark patterns equally across modalities, many dark patterns vary between platforms, and that these differences burden individuals with varying experiences of autonomy, privacy, and control (Gunawan et al., 2021). To illustrate this variation better, the following sections highlight examples of dark patterns with the potential to harm competition within different types of digital platforms. Platforms are classified based on the taxonomy proposed by Cennamo (2019), as follows:


Multisided Transaction Platforms

Since multisided transaction platforms facilitate transactions between consumers and providers of goods and services in exchange for a price-based consideration, these markets tend to be price sensitive. A study released by the EU on dark patterns identified hidden information/false hierarchy, countdown timer/limited time message, preselection, and roach motel as the most prevalent dark patterns in marketplaces and e-commerce websites/apps (EU Dark Patterns Report, 2022). Practices such as drip pricing (where only a part of the total cost is advertised) are employed to lure consumers at the initial stages of the transaction by not disclosing additional mandatory charges such as delivery costs and service fees, among others. Studies have found that, due to drip pricing, consumers end up paying more than they would have paid had the full price been shown upfront (FTC, 2022). One reason consumers agree to pay these higher charges is that they are likely to feel so invested in the process that they justify the additional charges and complete the transaction so as not to waste their effort (Mathur, 2019). Thus, consumers end up paying more, forcing us to question the underlying assumption that platforms have enabled ‘lower prices’ for consumers. Companies also use misleading scarcity claims to create a false sense of urgency. In addition to harming consumers, all these practices impact competition by limiting comparability between businesses and increasing search costs (CMA, 2022).




Figure 2. False hierarchies, where one option is made to stand out on e-commerce platforms. The first image shows an advertisement with a gold border accent highlighting it over other, better-rated results. The advertisement in the top result also almost merges with the background, making it hard to discern that it is an advertisement.


Information Platforms

Information platforms facilitate the categorisation, search, and sharing of relevant information between users. Examples include search engines, social media platforms, and peer-to-peer travel information platforms, among others. Research by the Norwegian Consumer Council in 2018 demonstrated that these platforms nudge users towards privacy-intrusive options, use complicated language and default settings, and make it harder to choose privacy-preserving options (Forbrukerrådet, 2018).


Since there is no monetary transaction involved, these platforms predominantly compete on the ‘quality of services’ offered (Kathuria, Suri, 2022). With companies such as DuckDuckGo using privacy as a parameter to distinguish themselves within the market, privacy must be viewed as a non-price parameter of competition (Kathuria, Suri, 2022). Competition regulators across the globe are recognising privacy as a non-price parameter of competition. For instance, the Competition Commission of India (2021) took suo motu cognisance of changes to WhatsApp’s privacy policy that mandated users to share their data with Facebook. The CCI also elaborated in its Market Study on the Telecom Sector that competition analysis must focus on the extent to which a consumer can ‘freely consent’ to action by a dominant player. Abuse of dominance can take the form of degrading privacy protections and therefore fall within the ambit of antitrust, as a low privacy standard implies a loss of consumer welfare (CCI, 2021).


Conclusion

As elaborated above, competition authorities have viewed the objective of competition law as enhancing and protecting consumer welfare, construed narrowly from the perspective of economic efficiency. However, certain African countries such as South Africa, Namibia, Malawi, Botswana, and Swaziland also include ‘public interest’ considerations in some form within competition regulation (Meyer, 2017).


The following table provides a brief overview of anticompetitive harms that can emanate from deceptive design practices based on our discussion above:

From the foregoing, it can be inferred that the use of deceptive design practices can have an adverse effect on competition within the market, when viewed through the lens of both price and non-price parameters of competition. However, in order to tackle some of the problems posed by digital markets generally, including dark patterns, it is important to move beyond higher market output and lower prices as the sole objectives of antitrust regulation. Albeit limited, there is also some literature probing the impact of these practices on collective welfare. Mathur et al. (2021) identify four ways in which dark patterns can diminish collective welfare:

  • dark patterns harm competition by generating switching costs for consumers or even obstructing their choices

  • dark patterns hide the true costs of products from consumers and prevent them from comparison shopping

  • dark patterns can undermine consumer trust in markets and hurt companies that engage in legitimate and honest practices

  • dark patterns enable private companies to amass large databases with detailed profiles about individuals, which they can use for advertising; that information can in turn be repurposed to create products that undermine societal values.


Deceptive design is not only a consumer protection issue; enforcement authorities and policymakers must also ensure that the anticompetitive harms emanating from these manipulative design practices are recognised and acted upon. Undertaking more research documenting evidence of such harmful practices can be a good starting point. For instance, the UK’s Competition and Markets Authority released a report in 2022 documenting potential consumer harms caused by ‘dark patterns’ in online choice architecture. We also need to work towards a multi-pronged regulatory framework, similar to the one being developed in the EU. The Digital Services Act (DSA) bans the use of dark patterns by online platforms meeting the threshold requirements under the DSA, and the EU has also included a prohibition on the use of dark patterns under the Digital Markets Act (DMA). The DMA, which predominantly seeks to ensure fair competition within the market, acknowledges the role ‘dark patterns’ can play in circumventing the obligations it puts in place, and specifically forbids gatekeeper firms from using design to present end-user choices in a non-neutral manner, or from using the structure, function, or manner of operation of a user interface, or a part thereof, to subvert or impair user autonomy, decision-making, or choice. Through this, the EU acknowledges that, in addition to being a consumer protection issue, dark patterns can also be employed by large entities (gatekeeper firms, in the EU’s case) to distort competition in the market. The Canadian Competition Act, through its recent amendments, has also recognised the competition-related harms emerging from deceptive marketing practices and introduced new criminal and civil provisions on ‘drip pricing’ (a commonly used dark pattern).


There is a growing discussion around recognising the important role that choice architecture plays in compliance with consumer protection and competition law (CMA, 2022). As technology continues to permeate every aspect of modern life, siloed regulatory responses will not serve us well. Closer to home, Indian policymakers and regulators would be well-advised to improve coordination between the various departments dealing with digital economy issues. While overlapping jurisdictions between regulatory authorities and policymakers such as the CCI, the Consumer Protection Authority, and the Telecom Regulatory Authority of India (TRAI), among others, cannot be entirely eliminated, they must be harmonised through better regulatory design and improved lines of communication (CCI, 2021). Furthermore, the government must commission research to understand the impact of deceptive design on consumer behaviour in order to address these issues meaningfully and fill the lacunae that exist on account of regulatory grey areas. The government has also recently set up a Committee on Digital Competition Law (CDCL) to examine the need for a separate law on competition in digital markets. During its deliberations, the Committee should also study the impact of deceptive design practices by leading players, or ‘Systemically Important Digital Intermediaries’ (SIDIs), on competition within digital markets.



 



ABOUT THE AUTHOR


Isha Suri

Isha Suri is a Research Lead at the Centre for Internet and Society (CIS). Her areas of interest include Telecom Policy, Competition Law, Internet Governance, Intellectual Property Rights, and Platform Governance. She is an Electrical Engineer and holds an LL.B. (Hons.) with a specialisation in Intellectual Property Law from the Indian Institute of Technology, Kharagpur. Prior to CIS, she worked in the telecom and technology policy team at ICRIER, a New Delhi-based think tank. During her time at ICRIER, she was part of the team commissioned by the Competition Commission of India to work on the Market Study on the Telecom Sector in India. She has also worked with MeitY, Government of India, and the National Internet Exchange of India as a consultant.


SUGGESTED CITATION:

Isha Suri. (2023). Identifying Anticompetitive Harms in Deceptive Design Discourse. Unpacking Deceptive Design Research Series. The Pranava Institute <https://www.design.pranavainstitute.com/post/identifying-anticompetitive-harms-in-deceptive-design>












