Dark patterns, also called deceptive patterns, are intentionally crafted UX designs aimed at manipulating users into actions they didn’t originally intend. Whether it’s making unintended purchases, sharing personal data, or subscribing to services, these tactics exploit users' cognitive biases for the benefit of businesses.
While they may yield short-term results like increased sales or engagement, dark design patterns erode trust and harm users in the long run. Studies reveal that their prevalence is growing, making it crucial for designers to understand and avoid these manipulative practices. At Eleken, we believe that the foundation of effective UX design lies in creating transparent, user-friendly interfaces that respect users’ autonomy.
In this article, we’ll explore what dark patterns are, examine their prevalence, and break down 21 types of dark patterns with real-world examples. We’ll also discuss their impact on users and businesses, and offer actionable solutions for creating ethical, user-friendly designs.
What Are Dark Patterns?
Dark patterns, also known as deceptive patterns, are unethical design strategies used in user interfaces to trick or manipulate users into unintended actions. Coined by UX designer Harry Brignull in 2010, the term refers to a range of techniques that exploit users’ cognitive biases or lack of attention to achieve business objectives at the expense of user experience.
For example, a user might unknowingly subscribe to a service due to vague language or preselected options, or be forced to share personal data to access basic functionality. These dark UX patterns deliberately obscure choices, create confusion, or use emotional triggers to nudge users toward decisions they wouldn’t otherwise make.
Examples of dark patterns include:
- Misdirection: Highlighting irrelevant features to obscure critical details.
- Roach Motel: Making it easy to sign up but almost impossible to cancel.
- Sneak into Basket: Adding items to a cart without user consent.
By prioritizing short-term gains, companies that use dark patterns risk damaging trust, inciting user frustration, and attracting regulatory penalties. Designers, therefore, must understand psychology in UX and critically evaluate their work to avoid such manipulative practices.
How Prevalent Are Dark Patterns?
Dark patterns in UX are alarmingly widespread, infiltrating a significant portion of digital platforms. A 2019 study conducted by researchers at Princeton University and the University of Chicago examined 11,000 popular e-commerce websites. The findings revealed that 10% of these sites employed deceptive practices, ranging from hidden fees to sneaky preselected options that manipulated users into making unintended choices.
Further research by the University of Zurich focused on 240 widely used apps from the Google Play store. This study uncovered even more troubling results, with 95% of the analyzed apps using dark pattern designs. These ranged from forcing unnecessary data sharing to automatically enrolling users in subscriptions without clear consent.
In 2022, a report by the European Commission revealed that the prevalence of dark patterns had grown even further, with 97% of popular apps used by EU consumers displaying dark UX patterns. This upward trend highlights the increasing reliance on these tactics across industries, driven by a desire to maximize conversions and outpace competitors.
The persistence of dark patterns can largely be attributed to two factors.
First, many businesses prioritize short-term gains, such as boosting sign-ups or purchases, over ethical design. A/B testing often highlights flows with dark design patterns as more “effective,” encouraging their adoption.
Second, the competitive landscape fosters copycat behavior, as companies mimic their rivals’ strategies, including dark patterns, to remain relevant.
This growing trend underscores the need for designers to recognize the risks of dark patterns and advocate for transparent, user-first approaches, as well as simplicity in design.
How Do Dark Patterns Affect Users?
Dark patterns have profound and far-reaching effects on users, often eroding trust, causing frustration, and disproportionately harming vulnerable groups.
When users encounter deceptive interfaces, their trust in the brand diminishes. Manipulative tactics, such as hidden fees or forced actions, leave individuals feeling betrayed, making it unlikely they will return to the product or recommend it to others. Trust, once broken, is exceedingly difficult to rebuild and can have long-term consequences for a company’s reputation.
Additionally, dark patterns create stress and frustration by complicating user interactions. Confusing language, repeated prompts, or misleading designs force users to spend extra time navigating a platform, often leading to feelings of irritation and dissatisfaction. These negative experiences can leave users with lasting resentment toward the product and its creators.
Vulnerable groups are disproportionately affected by dark pattern UI elements. People with low digital literacy, cognitive impairments, or disabilities often struggle to recognize manipulative designs. For instance, visually impaired users may miss low-contrast disclaimers, and elderly users may inadvertently share personal data because of unclear prompts. These patterns exploit those who are least equipped to navigate them, amplifying their harmful impact.
By prioritizing clarity and fairness and avoiding these bad UX practices, designers can prevent such outcomes and create experiences that foster trust, satisfaction, and inclusivity. The ethical responsibility lies with those who design interfaces to protect users from manipulative tactics.
21 Dark Patterns Examples
Dark patterns come in many forms, each designed to manipulate users into actions they might not otherwise take. While there are various ways to classify and identify these deceptive tactics, this guide draws directly from an internal lecture by Eleken’s UI/UX designer Anastasiia Soroka. Based on her extensive experience and real-world examples, Anastasiia outlined 21 dark patterns and provided her colleagues with actionable insights on how to spot, understand, and avoid these manipulative practices.
1. Roach Motel
This pattern makes it easy for users to sign up for a service but excessively difficult to cancel. It’s designed to trap users, exhausting their patience to keep them subscribed.
Examples: Amazon Prime’s cancellation process is a well-known instance. Users must navigate multiple pages and steps to terminate their subscription, often facing unclear language and unnecessary friction. Similarly, some gym memberships require in-person visits or mailed requests to cancel.
Solutions: Ethical design would prioritize a straightforward and transparent cancellation process, offering users the ability to unsubscribe with minimal effort.
As Anastasiia says:
The more we keep the user from leaving, the more they want to leave. Instead of hindering them, we should focus on solving the root of the problem.
2. Bait and Switch
Bait and switch occurs when users are promised one outcome but receive something entirely different.
Examples: The Windows 10 upgrade prompt is a classic case. For a time, clicking the “X” to dismiss the prompt was treated as consent to schedule the upgrade, triggering the installation instead of cancelling it. This frustrated users and damaged Microsoft’s credibility.
Solutions: Honor user decisions and ensure all prompts and actions align clearly with user expectations to avoid confusion and betrayal of trust.
3. Disguised Ads
Disguised ads blend advertisements with genuine interface elements, tricking users into engaging with them.
Examples: Fake “Download” buttons on software websites are a notorious example. Users click these buttons thinking they’ll get their intended file but are instead redirected to advertisements or malware.
Solutions: Clearly separate advertisements from interface elements by using distinct colors, labels, and styling, ensuring users can easily distinguish between the two.
4. Forced Continuity
This pattern involves automatically transitioning users from a free trial to a paid subscription without adequate notification.
Examples: Many subscription services, including fitness apps and streaming platforms, fail to notify users when a free trial ends. For instance, some users of Adobe Creative Cloud have reported being charged for annual plans they didn’t intend to continue.
Solutions: Provide users with timely reminders before renewals and design an accessible and simple cancellation process.
5. Tricky Wording
This pattern uses vague or misleading language, often with double negatives, to confuse users into unintended actions. For example, subscription forms might say, “Check this box to opt out,” leading users to believe they are opting in.
Examples: Anastasiia highlighted an example from Sky where users had to click a checkbox to refuse a package: “You need to click to refuse. Of course, most users will not read it all and will think this is the subscription for the package.”
Solutions: Designers should aim for clarity by using simple, straightforward copy that leaves no room for misinterpretation.
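To make the difference concrete, here is a minimal sketch of hypothetical form-handling logic (not taken from Sky or any real product) contrasting a double-negative opt-out checkbox with a plain opt-in:

```python
# Hypothetical form-handling logic illustrating tricky wording.

# Dark pattern label: "Check this box if you do NOT want our offers."
# Leaving the box unchecked silently subscribes the user.
def is_subscribed_dark(box_checked: bool) -> bool:
    return not box_checked

# Clear alternative label: "Yes, send me occasional offers."
# The default (unchecked) means no subscription.
def is_subscribed_clear(box_checked: bool) -> bool:
    return box_checked

# A user who skims past the checkbox leaves it unchecked:
print(is_subscribed_dark(False))   # True  -- subscribed without realizing
print(is_subscribed_clear(False))  # False -- nothing happens by default
```

With the clear wording, inaction leaves the user exactly where they expect to be; with the double negative, inaction is silently converted into consent.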
6. Misdirection
This tactic draws users’ attention to one element to obscure another critical detail.
Examples: Airlines frequently highlight low fares but hide taxes and additional fees in fine print, revealed only during checkout. Similarly, some online stores use bold visuals to promote discounts while minimizing information about restrictions.
Solutions: Present all relevant information equally and ensure users can easily see associated costs and conditions.
7. Sneak into Basket
This pattern adds items to a user’s shopping cart without their explicit consent.
Examples: Sports Direct, a UK retailer, has been criticized for automatically adding magazines to customer carts at checkout. Users often don’t notice these additions until they reach the payment stage.
Anastasiia says: “The user chose some clothes and was also given a magazine in the basket. It’s fine if it’s a free gift, but if it costs money, even a small amount, it’s not a good practice.”
Solutions: Ensure that add-ons are presented as optional and require explicit user action to include them.
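As a simple illustration (hypothetical cart logic; the item names and prices are made up), the pattern amounts to checkout code appending items the user never asked for:

```python
# Hypothetical cart logic; item names and prices are invented.

def add_item_dark(cart: list, item: dict) -> None:
    cart.append(item)
    # Sneak into basket: a paid magazine is appended without consent.
    cart.append({"name": "magazine", "price_cents": 199})

def add_item_ethical(cart: list, item: dict, extras: tuple = ()) -> None:
    cart.append(item)
    # Extras are added only if the user explicitly selected them.
    cart.extend(extras)

cart = []
add_item_dark(cart, {"name": "t-shirt", "price_cents": 1500})
print(len(cart))  # 2 -- the user chose one item but will pay for two
```

In the ethical version, the cart contains exactly what the user put there unless they deliberately chose an add-on.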
8. Confirmshaming
Confirmshaming uses guilt-inducing language to pressure users into making a particular decision.
Examples: The most prominent case Anastasiia mentions is MyMedic, a health products website that uses pop-ups with options like “No, I’d rather bleed to death” to discourage users from declining their services. This manipulative tactic preys on users’ emotions, often causing discomfort.
Solutions: Use neutral or positive language that respects user autonomy and avoids unnecessary emotional manipulation.
9. Nagging
Nagging involves persistent prompts that repeatedly interrupt the user experience.
Examples: Instagram’s 2018 notification pop-ups didn’t offer a permanent dismissal option, frustrating users who wanted to avoid enabling notifications.
Solutions: Allow users to permanently dismiss or disable prompts to ensure a smoother and more respectful experience.
10. Hidden Costs
Hidden costs reveal unexpected fees or charges at the checkout stage, undermining transparency.
Examples: Ticketing platforms like Ticketmaster often add substantial service fees after users have proceeded to the final payment screen. These surprise charges create frustration and erode trust.
Solutions: Display all costs upfront, ensuring users have a clear understanding of the total price from the beginning.
11. Privacy Zuckering
This pattern manipulates users into sharing personal data without clear consent.
Examples: Facebook requested users’ phone numbers for two-factor authentication, only to later use them for friend suggestions and targeted ads. This misuse sparked widespread criticism and eroded user trust.
Solutions: Clearly explain why data is being collected and ensure users provide informed, explicit consent.
12. Fake Urgency
Fake urgency leverages artificial deadlines or scarcity to push users into making impulsive decisions.
Examples: E-commerce sites often display countdown timers that reset upon page refresh, creating a false sense of urgency. Booking platforms use messages like “Only 1 room left!” to pressure users into immediate action.
Anastasiia also recounted experiences with online booking platforms:
I remember looking at a course with a timer saying, ‘Only today!’ Then the next day, the timer had renewed. It’s fake urgency, and it drives unnecessary pressure.
Solutions: Communicate genuine availability and deadlines, avoiding exaggerated or dishonest claims.
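The resetting countdown Anastasiia describes boils down to recomputing the “deadline” on every page load. A minimal sketch of hypothetical timer logic shows the difference:

```python
# Hypothetical countdown logic; `now` is seconds since some epoch.

DAY = 24 * 3600

# Fake urgency: the deadline is recomputed on every page load,
# so the timer always shows a full day remaining and never expires.
def fake_seconds_left(now: float) -> float:
    deadline = now + DAY  # the "deadline" moves with the visitor
    return deadline - now

# Honest urgency: the deadline is a fixed timestamp, set once.
def real_seconds_left(now: float, deadline: float) -> float:
    return max(0.0, deadline - now)

print(fake_seconds_left(0))             # 86400 -- "24h left!"
print(fake_seconds_left(100_000))       # 86400 -- a day later, still "24h left!"
print(real_seconds_left(100_000, DAY))  # 0.0   -- a real offer actually ends
```

A genuine deadline is stored once and eventually reaches zero; a fake one is a function of the visit time and can never expire.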
13. Friend Spam
Friend spam accesses a user’s contact list to send unsolicited messages, often without explicit consent.
Examples: LinkedIn faced backlash in 2015 for sending automatic invitations to users’ email contacts, which many users didn’t realize they had authorized. This resulted in a $13 million fine.
Solutions: Request explicit permission before accessing contact lists and explain exactly how the data will be used.
14. Preselection
Preselection means that an option, often the most expensive or least favorable one, is selected by default without user input.
Examples: Many travel booking sites preselect travel insurance, adding it to the total unless users actively opt out.
Solutions: Avoid preselecting options or default to the most neutral or cost-effective choice for users.
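In code terms, the entire difference is the default value of the option. A sketch of a hypothetical booking checkout (prices invented, in cents to keep the arithmetic exact):

```python
# Hypothetical booking checkout; prices are invented, in cents.

BASE_FARE = 12000   # $120.00
INSURANCE = 1499    # $14.99

# Dark pattern: insurance defaults to selected, so doing nothing costs extra.
def total_dark(insurance_selected: bool = True) -> int:
    return BASE_FARE + (INSURANCE if insurance_selected else 0)

# Ethical default: nothing is added unless the user opts in.
def total_ethical(insurance_selected: bool = False) -> int:
    return BASE_FARE + (INSURANCE if insurance_selected else 0)

print(total_dark())     # 13499 -- the inattentive user pays for insurance
print(total_ethical())  # 12000 -- the default is the base fare only
```

The functions are identical except for the default, which is exactly why preselection is so easy to slip into a design and so easy to fix.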
15. Interface Interference
Interface interference uses poor design elements, such as low contrast or obscure placement, to hide critical information.
Examples: Tesla displayed disclaimers about autopilot upgrades in faint, hard-to-read text, leaving users unaware of key terms.
Solutions: Follow accessibility standards to ensure all important details are clear, visible, and easy to understand.
16. Price Comparison Prevention
This pattern obscures pricing information, making comparisons difficult.
Examples: Mobile carriers sometimes display data limits in inconsistent formats (e.g., per day versus per month), confusing users and complicating comparisons.
Solutions: Standardize pricing formats and ensure clarity to support user decision-making.
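One concrete counter-measure is to normalize all prices to a single unit before displaying them. A sketch with hypothetical plan data (names and prices invented, assuming a 30-day month):

```python
# Hypothetical mobile plans quoted in inconsistent units.

plans = [
    {"name": "Plan A", "price_cents": 100, "per": "day"},    # "only $1/day!"
    {"name": "Plan B", "price_cents": 2500, "per": "month"},
]

# Normalize everything to an approximate monthly price (30-day month)
# so plans can be compared side by side.
def monthly_price(plan: dict) -> int:
    if plan["per"] == "day":
        return plan["price_cents"] * 30
    return plan["price_cents"]

for p in plans:
    print(p["name"], monthly_price(p))
# Plan A 3000 -- the "$1/day" plan is actually the pricier one
# Plan B 2500
```

Once the units match, the comparison the per-day framing was obscuring becomes obvious.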
17. Emotional Manipulation
Emotional manipulation exploits feelings like fear or guilt to influence user behavior.
Examples: Websites frequently use messages like “Hurry, only 1 left in stock!” to create unnecessary panic and drive impulsive purchases.
Solutions: Communicate honestly and respectfully, avoiding manipulative tactics that exploit user emotions.
18. Misleading Subscription
This pattern fails to disclose recurring payments clearly, surprising users with ongoing charges.
Examples: Figma added fees for additional team members without adequately notifying users upfront. This left users frustrated and caught off guard.
Solutions: Clearly communicate subscription terms, renewal dates, and any potential additional charges.
19. Unclear Language
Unclear language uses vague or ambiguous phrasing to confuse users about their choices.
Examples: “Click here to opt out” is less clear than a direct “No, thank you” option. Some software installation prompts also use vague language to upsell additional features.
Solutions: Use precise and simple language to ensure users understand their actions.
20. Forced Action
Forced action requires users to complete unrelated tasks to proceed.
Examples: Some websites bundle newsletter sign-ups with terms and conditions acceptance, giving users no option to proceed otherwise.
Solutions: Separate unrelated actions and provide users with independent options for each task.
21. Obstruction
Obstruction adds unnecessary friction to prevent users from completing certain actions, such as cancellations.
Examples: Epic Games required users to navigate complex workflows to cancel purchases or subscriptions, leading to a $245 million fine from the Federal Trade Commission.
Solutions: Streamline workflows to ensure users can complete actions like cancellations without unnecessary barriers.
Conclusion
Dark design patterns are more than just questionable design choices—they represent a betrayal of user trust and a short-sighted approach to business. While they may yield temporary gains in revenue or engagement, the long-term consequences, including user dissatisfaction, reputational damage, and even legal penalties, far outweigh any perceived benefits.
To avoid these pitfalls, designers and businesses must adopt a more ethical and user-centered approach:
- Balance business and user needs: While achieving business goals is important, it should never come at the cost of manipulating or misleading users. Striking a balance ensures sustainable success and fosters goodwill among your user base.
- Prioritize ethical design: Transparent, user-centered designs build trust, encourage loyalty, and create lasting relationships. Ethical design practices demonstrate respect for users and position your brand as a trusted leader in the market.
- Educate clients: Many clients may not fully understand the risks of employing dark patterns, including regulatory fines and brand damage. Designers have the responsibility to advocate for ethical choices, presenting evidence and insights that align user satisfaction with business objectives. Tracking UX design KPIs can help designers prove their point and help businesses measure success without resorting to manipulative practices.
By adopting these principles, designers can contribute to a digital landscape that prioritizes honesty, empowerment, and respect for the user.
Let Eleken Help You Design with Integrity
At Eleken, we specialize in creating intuitive, user-friendly and human-centered designs that balance business goals with ethical practices. Whether you’re building a new product or refining an existing one, our expert team is here to help you craft experiences that foster trust and loyalty.
Contact us today to learn how we can help elevate your design strategy.