By Titiksha Vashist
How do we unpack the impact of deceptive design practices on people in Global South countries? How does deceptive design intersect with privacy, competition in markets and consumer protection? This series looks at understudied harms, communities and experiences through contributions written by researchers, artists and civil society.
Deceptive designs, also called ‘dark patterns’ or deceptive patterns, are prevalent across the web today, embedded in digital interfaces (screens, voice) as we learn, gain information, socialise and shop online. These UI/UX choices are called ‘deceptive’ because of their impact on the user. Deceptive patterns have been shown to lead to consumer harms such as loss of privacy, financial loss, loss of time, cognitive burden, psychological distress and decline of trust online, among others, by manipulating or deceiving users into making decisions and choices they may not otherwise take. These patterns impair decisional autonomy, obscure information in online markets and trick users into behaving in ways that often benefit the platforms that deploy them. Frequently part of default settings on apps and websites, these practices are built into the online experiences and platforms we navigate.
Academics and researchers across the fields of privacy and data protection, human-computer interaction, digital rights and responsible design have gathered ample evidence since 2010 (when the term ‘dark patterns’ was first coined) of how deceptive design can cause active harms, especially for marginalised communities, including senior citizens, women and gender minorities, children, families with lower incomes, and people who are new to the digital sphere. Research on deceptive design focuses on how these patterns can be classified and understood within a shared taxonomy, given that deceptive patterns come in several dynamic shapes and forms and often evolve rapidly. Classification and taxonomies help create linguistic consensus and aid the evolution of legal and policy language around the term, thus facilitating policy and regulatory movement.
Fig. Mapping personal and structural consumer harms resulting from deceptive design practices.
While deceptive design is increasingly being recognised across the world as a multi-faceted issue, research suggests that it not only affects the privacy of consumers but also has systemic impacts on the digital economy. The last two years have seen significant policy shifts on deceptive design, with the Federal Trade Commission in the US taking up the issue and the Digital Services Act in the EU taking initial steps to regulate deceptive patterns. In a significant legal win, the Italian Data Protection Authority issued a decision against Ediscom that specifically referred to “dark patterns” in the hearing. According to Prof. Cristiana Santos, a legal scholar working on deceptive design, this first use of the term in a legal decision sets a precedent for its further use in case law and regulatory decisions. The Australian Competition and Consumer Commission (ACCC) has released a consultation on introducing laws against ‘unfair business practices’ by companies such as Amazon. Research shows that subscription traps, such as those on e-commerce websites, affect three out of four consumers in Australia. Australian consumers have to navigate confusing language and deliberately confusing UX to unsubscribe from services, unlike in Europe, where unsubscribing was made a simple two-step process in 2022 after the European Commission stated that Amazon had breached the Unfair Commercial Practices Directive.
The use of deceptive design has also been linked to anti-competitive practices in the digital marketplace, and may pave the way for large companies to exploit personal data. Beyond privacy, consumer harms from deceptive design have attracted regulatory attention from consumer protection agencies in several countries, including Norway, the European Union, the United States, India and Australia. However, as technology products and companies expand their presence across the globe and open new markets and opportunities, deceptive design, too, becomes a global challenge. The conversation around deceptive design has largely focused on the European and American experience of deception online. Yet the harms of deceptive practices manifest differently in different social and political contexts, including Global South countries like India. Communities of users in these contexts may be increasingly vulnerable, especially given the English-first nature of the internet and the fact that millions of people are coming online for the first time, often without exposure to digital literacy. These factors exacerbate vulnerability along lines of language, socio-economic position, access to education, and more.
This research series aims to expand the scope of the conversation around deceptive design as it unfolds outside dominant contexts, and to foreground perspectives from diverse regions, sectors and experiences.
The 'Unpacking Deceptive Design Series' was envisioned as a collaborative space for researchers, artists, civil society advocates and interested members of the public to contribute from diverse disciplinary perspectives and fill knowledge and awareness gaps on the issue. The series invited contributions reflecting on deceptive design practices as they intersect with competition in digital markets, data protection and privacy, consumer protection online, financial security, human rights and social security across jurisdictions, among other topics. The research series is part of the Design Beyond Deception project by The Pranava Institute, and this book serves as a companion for those who may benefit from the Manual for Designers created as the core output of the project. We have been fortunate to receive essays from researchers working in this space, who have contributed to the discussion in unique ways.
In her essay titled ‘Crafting a Definition for Deceptive Design/Dark Patterns Is Harder Than It Seems’, design researcher and critical designer Caroline Sinders asks the fundamental question: what makes an aspect of design manipulative or deceptive? The lack of a common definition of deceptive design or patterns, the ubiquity of these patterns and their ever-morphing nature lead Sinders to suggest that, in order to regulate such practices, design and context must be taken into account. Sinders draws on her work with the Information Commissioner’s Office (ICO) in the UK to bridge the gap between design and regulation by working towards a fundamental definition.
Turning to the Global South context, Monami Dasgupta, Vinith Kurian and Rajashree Gopalakrishnan painstakingly gather evidence of deceptive patterns in India’s fintech apps. Their analysis, guided by the OECD taxonomy of deceptive patterns, breaks down the user journeys of nine popular apps from four financial services categories (lending, insurance, investments and neo-banking) to map the patterns observed against the types of harm they may cause to the user. These findings are first-of-their-kind evidence linking online deceptive patterns to possible harms in India’s rapidly growing fintech sector. The research is crucial, especially given the impact of these practices in tier-two and tier-three towns in India, and it corroborates documented instances of financial loss.
Is deceptive design limited to visual interfaces? The growing use of voice interfaces, especially in India and other non-English-speaking countries, shows how speech can widen the net of digital communication and allow more people to access services online. Research shows that much of the developing world uses voice interfaces in regional and local languages for search, as evidenced by Google Search usage data. Add to this technologies like Amazon Alexa and IoT devices, which are flooding the market due to their accessibility and often low cost. Saumyaa Naidu and Shweta Mohandas explore how deceptive design practices play out in voice interface technologies, often making it hard for consumers to unsubscribe from services by voice (even though they can subscribe using a voice command) and making discoverability beyond defaults a challenge. Their essay locates deceptive design in voice interfaces and analyses its impacts on inclusivity, accessibility and privacy.
While 2022 was a big year for policy tackling deceptive design at the global level, including in the US and the EU, Sacha Robahmed and Noor Chaabene explore the status of deceptive design reporting, policy and regulation in South West Asia and North Africa (SWANA). Before there can be policy change and increased awareness in the SWANA region, there needs to be a way to identify, gather and share data on deceptive design practices. Where are SWANA’s internet users being tricked, fooled or deceived by deceptive design practices? What types of deceptive design are they facing? And are technology companies doing anything about these practices: are they implementing new ones or remedying existing ones? These questions serve as important starting points for their essay titled ‘Exploring the Potential of App Reviews to Identify Deceptive Design Practices in Arabic-Speaking Countries’.
Can accessibility and overlay tools that are meant to enhance accessibility for visually impaired users create deceptive interfaces? Maitreya Shah, Fellow at the Berkman Klein Center at Harvard University, examines the user interface (UI) design strategies of accessibility overlay tools and their implications for how people with disabilities access the internet. Shah also makes a pertinent point: people with disabilities form a large community online and rely on technology more heavily than their non-disabled counterparts, yet the design of the internet often hinders their access and leaves them more exposed to harm, including deceptive patterns. For instance, screen readers mediate access to the web, and anything that tampers with that experience directly affects the user. Factoring in vulnerability is therefore crucial for communities and groups with a greater reliance on technology, and designing for accessibility and trust must go hand in hand.
Finally, do deceptive designs affect issues such as competition in digital markets? Isha Suri, in her piece ‘Identifying Anticompetitive Harms from Deceptive Design in India’, investigates how multi-sided platforms and information platforms deploy deceptive design, often to skew markets in their favour, creating anti-competitive harms tied to personal data collection, switching costs and recommendation algorithms. Suri also recommends approaches policymakers can take to tackle an issue as multi-faceted as deceptive design.
As technologies continue to evolve and newer interfaces emerge, deception takes new forms in each of them. Adopting principles of ethical, human rights-centred design is crucial when building the technologies of the future. The boundaries of technology are changing constantly and rapidly, giving rise to interfaces beyond the digital screen, and immersive digital experiences like AR and VR can become grounds for new forms of deception. The rapid evolution of generative AI models for user interfaces, trained on existing interfaces riddled with deception, is likely to exponentially increase both the volume and novelty of deceptive design in the digital sphere. This is not limited to apps, websites or voice interfaces alone; in more immersive experiences like AR/VR, the line of deception becomes increasingly blurred, more complex to categorise and therefore harder to regulate. It is therefore imperative that regulatory measures do not limit themselves to existing interfaces and their taxonomies, but instead locate deception within human-technology interaction as a whole, to design a collective future that is beyond deception.
Bibliography
OECD (2022), "Dark commercial patterns", OECD Digital Economy Papers, No. 336, OECD Publishing, Paris, https://doi.org/10.1787/44f5e846-en.
Competition and Markets Authority. “Online Choice Architecture: How Digital Design Can Harm Competition and Consumers.” GOV.UK, April 5, 2022. https://www.gov.uk/government/publications/online-choice-architecture-how-digital-design-can-harm-competition-and-consumers.
OECD (2021), Roundtable on Dark Commercial Patterns Online: Summary of discussion, https://one.oecd.org/document/DSTI/CP/CPS(2020)23/FINAL/en/pdf.
“REGULATION (EU) 2022/2065 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 19 October 2022 on a Single Market For Digital Services and Amending Directive 2000/31/EC (Digital Services Act).” EUR-Lex Access to European Union Law, October 2022. https://eur-lex.europa.eu/eli/reg/2022/2065/oj.
Santos, Cristiana, and Arianna Rossi. “The Emergence of Dark Patterns as a Legal Concept in Case Law.” Internet Policy Review, July 31, 2023. https://policyreview.info/articles/news/emergence-of-dark-patterns-as-a-legal-concept.
Australian Competition and Consumer Commission. “Digital Platform Services Inquiry - September 2022 Interim Report - Regulatory Reform.” Australian Competition and Consumer Commission, November 11, 2022. https://www.accc.gov.au/about-us/publications/serial-publications/digital-platform-services-inquiry-2020-2025/digital-platform-services-inquiry-september-2022-interim-report-regulatory-reform.
“Consumer Protection: Amazon Prime Changes Its Cancellation Practices to Comply with EU Consumer Rules.” European Commission, 2022. https://ec.europa.eu/commission/presscorner/detail/en/ip_22_4186.
Norwegian Consumer Council (Forbrukerrådet). Deceived by Design: How Tech Companies Use Dark Patterns to Discourage Us from Exercising Our Rights to Privacy, 2018. https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf.
Consumer Policy Research Centre. Duped by Design: Manipulative Online Design: Dark Patterns in Australia, 2022. https://cprc.org.au/wp-content/uploads/2022/06/CPRC-Duped-by-Design-Final-Report-June-2022.pdf.
“California Privacy Rights Act.” California Legislative Information, 2021. https://leginfo.legislature.ca.gov/faces/codes_displayText.xhtml?division=3.&part=4.&lawCode=CIV&title=1.81.5.
“As Internet User Numbers Swell Due to Pandemic, UN Forum Discusses Measures to Improve Safety of Cyberspace - United Nations Sustainable Development.” United Nations. Accessed October 3, 2023. https://www.un.org/sustainabledevelopment/blog/2021/12/as-internet-user-numbers-swell-due-to-pandemic-un-forum-discusses-measures-to-improve-safety-of-cyberspace/.
Kellogg, Sean. “How US, EU Approach Regulating ‘Dark Patterns.’” IAPP, December 1, 2020. https://iapp.org/news/a/ongoing-dark-pattern-regulation/.
“Official Legal Text.” General Data Protection Regulation (GDPR), September 27, 2022. https://gdpr-info.eu/.
“Privacy by Design and by Default.” European Data Protection Board, 2022. https://edpb.europa.eu/our-work-tools/our-documents/topic/privacy-design-and-default_en.
Staff in the Bureau of Competition & Office of Technology. “FTC to Ramp Up Enforcement against Illegal Dark Patterns That Trick or Trap Consumers into Subscriptions.” Federal Trade Commission. https://www.ftc.gov/news-events/press-releases/2021/10/ftc-ramp-enforcement-against-illegal-dark-patterns-trick-or-trap.
Lomas, Natasha. “The Web Foundation Is Working to Counter Deceptive Design.” TechCrunch, March 23, 2022. https://techcrunch.com/2022/03/23/deceptive-design-patterns-project/.
Vashist, Titiksha, Shyam Krishnakumar, and Dhanyashri Kamalakannan. Design Beyond Deception, 2023. https://www.design.pranavainstitute.com/.
ABOUT THE AUTHOR
Titiksha Vashist
Co-Founder and Lead Researcher at The Pranava Institute
SUGGESTED CITATION:
Titiksha Vashist. (2023). Unpacking Deceptive Design: Centering Trust and Consumer Safety in Digital Interactions. The Pranava Institute <https://www.design.pranavainstitute.com/post/unpacking-deceptive-design:-centering-trust-and-consumer-safety-in-digital-interactions>