Dark patterns manipulate users into unwanted actions. Companies like Amazon and LinkedIn have paid millions in fines for using them, yet they spread through third-party tools and growth marketing programmes that disguise manipulation as optimisation. Simply avoiding dark patterns isn't enough. When competitors hide costs and create false urgency, ethical businesses appear less competitive. This pressures everyone towards manipulation. Active resistance means transparent pricing, symmetrical user journeys, and removing artificial pressure. As regulation tightens and consumers wise up, businesses that respect user autonomy gain genuine competitive advantage.
Dark patterns are deliberately misleading design choices that manipulate users into actions they wouldn't otherwise take. Rather than respecting user autonomy, these techniques exploit cognitive biases, create false urgency, obscure information, or make undesirable options deliberately difficult to execute.
The term "dark patterns", coined by UX designer Harry Brignull in 2010, encompasses a wide range of manipulative practices: hidden costs that appear at checkout, pre-ticked boxes for unwanted subscriptions, confusing unsubscribe processes, countdown timers creating artificial scarcity, and interfaces that make cancellation vastly more difficult than sign-up.
Several high-profile companies have faced significant consequences for deploying dark patterns. Amazon paid $30.8 million in 2023 to settle Federal Trade Commission charges over its Alexa voice assistant and Ring doorbell practices, including making it unnecessarily difficult for users to delete their data. The FTC separately sued Amazon over its Prime cancellation process, which required users to navigate multiple pages deliberately designed to discourage cancellation.
LinkedIn paid $13 million in 2015 for using dark patterns in its growth tactics, repeatedly accessing user email accounts and sending invitations that appeared to come from the user without clear consent. Dating platform Match.com faced FTC action in 2019 for sending notifications about messages from accounts the company knew were likely fraudulent, encouraging users to subscribe to read these fake messages.
These aren't isolated incidents but symptoms of a broader industry approach that prioritises short-term conversion metrics over user trust and long-term relationships.
Many businesses implement dark patterns unintentionally, often through third-party tools and services marketed as conversion optimisation. Personalisation platforms come pre-configured with countdown timers and stock level warnings, frequently displaying false information. A/B testing tools suggest 'best practices' that include hidden opt-outs and confusing consent flows. Email service providers default to pre-checked marketing consent boxes. Cookie consent management platforms are specifically designed to make accepting all cookies the easiest option whilst burying granular controls.
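The alternative to those manipulative defaults is straightforward to express in code. A minimal sketch, with illustrative names rather than any particular platform's API, of a sign-up form where every consent flag starts off and only becomes true through an explicit user action:

```python
from dataclasses import dataclass


@dataclass
class SignupForm:
    """Sketch of a sign-up form whose consent options default to off.

    Unlike the pre-checked boxes many email platforms ship with,
    every consent flag starts False and only becomes True through
    an explicit user action.
    """
    email: str
    marketing_consent: bool = False   # user must actively tick this
    analytics_consent: bool = False   # granular, not bundled into "accept all"


def consents_granted(form: SignupForm) -> list[str]:
    """Return only the consents the user explicitly opted into."""
    return [name for name, given in [
        ("marketing", form.marketing_consent),
        ("analytics", form.analytics_consent),
    ] if given]
```

A form submitted without touching the checkboxes yields an empty consent list, which is exactly the point: silence is not consent.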
Growth marketing courses and certification programmes teach these techniques as standard practice, rarely acknowledging the ethical implications. Agencies position dark patterns as sophisticated conversion rate optimisation, wrapping manipulation in the language of user experience improvement and adding the theatre of A/B tests to imply it's what users want, when in fact A/B tests can only tell us what users did.
The pressure to demonstrate quick wins from experimentation programmes creates an environment where ethics become secondary to statistical significance. When leadership demands to see conversion rate improvements within a quarter, it becomes tempting to implement the tactics that reliably move numbers upward, even when those tactics undermine user autonomy.
Simply choosing not to implement dark patterns is insufficient. Ethical businesses must actively oppose them because dark patterns create market conditions that disadvantage honest competitors.
When one ecommerce site hides the total cost until checkout whilst yours displays it transparently, you appear more expensive in comparison. When competitors create false urgency to drive immediate purchases whilst you provide accurate information, your conversion rate suffers. When others make cancellation deliberately difficult to inflate retention metrics whilst you offer clear, simple processes, your churn appears worse.
This creates a race to the bottom where businesses feel compelled to adopt manipulative practices simply to remain competitive. Breaking this cycle requires active resistance, not passive non-participation.
Opposing dark patterns means more than avoiding specific techniques. It requires committing to design principles that respect user autonomy:
- Make information clear and accessible at the point users need it. If there are costs, show them. If there are limitations, explain them. If something will auto-renew, say so explicitly.
- Design processes to be symmetrical. If users can subscribe in three clicks, they should be able to unsubscribe in three clicks. Don't make opting in easier than opting out.
- Remove artificial pressure. Real scarcity is fine to communicate, but invented countdown timers, gimmicky wheel-of-fortune games and fabricated stock warnings erode trust once users recognise the manipulation.
- Test user interfaces with people who aren't trying to complete your desired action. If users struggle to find the opt-out, the decline button, or the cancellation link, redesign until these options are genuinely easy to locate.
- Measure success beyond conversion rates. Track trust indicators, repeat purchase rates, customer lifetime value, and organic referrals rather than optimising solely for immediate conversions.
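That last principle, measuring beyond conversion, can be operationalised from ordinary order data. A minimal sketch (field names are illustrative assumptions, not a standard schema) computing repeat purchase rate and average customer lifetime value:

```python
from collections import defaultdict


def retention_metrics(orders: list[dict]) -> dict:
    """Compute trust-oriented metrics from raw order records.

    Each order is a dict with 'customer_id' and 'amount'.
    Returns repeat purchase rate and average customer lifetime value,
    rather than a single conversion figure.
    """
    spend_per_customer: dict[str, float] = defaultdict(float)
    orders_per_customer: dict[str, int] = defaultdict(int)
    for order in orders:
        spend_per_customer[order["customer_id"]] += order["amount"]
        orders_per_customer[order["customer_id"]] += 1

    customers = len(spend_per_customer)
    if customers == 0:
        return {"repeat_purchase_rate": 0.0, "avg_lifetime_value": 0.0}

    repeat_customers = sum(1 for n in orders_per_customer.values() if n > 1)
    return {
        "repeat_purchase_rate": repeat_customers / customers,
        "avg_lifetime_value": sum(spend_per_customer.values()) / customers,
    }
```

Tracking these figures alongside conversion rate makes the trade-off visible: a dark pattern that lifts conversions while depressing repeat purchases shows up as a loss, not a win.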
Research increasingly demonstrates that dark patterns damage long-term business performance. Users who feel manipulated are less likely to return, less likely to recommend your business, and more likely to leave negative reviews. The short-term conversion gains rarely offset the long-term reputation costs.
More importantly, regulatory pressure is mounting. The EU's Digital Services Act explicitly bans many dark patterns. The UK's Competition and Markets Authority has published guidance on online choice architecture that effectively prohibits manipulative design. California's privacy laws restrict several common practices. Businesses building on dark patterns face growing legal and regulatory risk.
Beyond compliance, there's a competitive opportunity. As consumers become more aware of manipulative design, businesses that demonstrably respect user autonomy can differentiate themselves meaningfully. Trust becomes a genuine competitive advantage rather than a marketing platitude.
Standing against dark patterns means auditing your existing user journeys for manipulative elements, even if they were implemented with good intentions. It means questioning vendor recommendations when they rely on artificial urgency or hidden complexity. It means having difficult conversations with stakeholders about why certain 'best practices' are actually unethical.
It also means advocating for change beyond your own organisation: calling out dark patterns when you encounter them, educating clients and colleagues about their long-term costs, and supporting regulatory efforts to establish clearer boundaries around acceptable persuasion.
Another web is possible, but only if ethical businesses actively work to create it rather than passively accepting the manipulative status quo. The question isn't whether we can afford to take this stand, but whether we can afford not to.