You’ve almost certainly already fallen into the Dark Pattern trap yourself. Ever signed up for a service that you didn’t really want or bought something “accidentally”? These are both typical examples of the dark side of user experience design (UX design for short). Developers and site owners often deliberately make their apps and websites “user-unfriendly” to further their own interests. And major companies like Booking.com, Apple, and Amazon are no exception. They all use Dark Patterns to influence users.

Let’s take a closer look at how UX designers use Dark Patterns and what they hope to achieve.

What are Dark Patterns in UX design?

The term “Dark Patterns” was coined by London-based UX designer Harry Brignull in 2010. He defined it as follows:

Quote

“Dark Patterns are tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.” – Harry Brignull, Source: https://www.darkpatterns.org/

In other words, Dark Patterns are designed to trick people into acting against their own interests. They’re especially popular in the field of neuromarketing, where, combined with knowledge of human behavior and perception, they’re used to steer consumers toward a particular goal. The first thing developers of Dark Patterns exploit is the fact that people have a limited capacity to take in new information. Most of us only skim-read long texts, meaning we easily overlook or misinterpret misleading wordings or deceptive representations.

The following video uses concrete examples to illustrate what Dark Patterns are and how they work:

What are the different types of Dark Patterns and how are they used?

Companies and website owners use different types of Dark Patterns depending on what they want to achieve. In fact, they often use several Dark Patterns at the same time to reinforce the overall effect and hide their true intentions. The list below describes some of the different types of Dark Patterns used by UX designers.

  • Roach Motel: The idea behind this type of Dark Pattern is to lead users quickly into a particular situation and then make it really hard for them to get back out. Companies often use this trick to get people to sign up for premium subscriptions. The sign-up process is fast and simple, but the cancellation options are usually hidden in some part of the website where the user wouldn’t naturally think to look.
  • Bait and Switch: This type of Dark Pattern is a kind of decoy tactic. The user thinks they’re doing one thing but ends up doing something completely different.
  • Trick Questions: This kind of ambiguous question is often found on forms, the idea being to trick people into giving an answer they didn’t really mean to give. This type of Dark Pattern relies on the fact that most users will simply scan a text rather than going over it with a fine-tooth comb.
  • Sneak into Basket: Here, sites sneakily add items to a customer’s basket by incorporating preselected checkboxes or confusing opt-out choices in the checkout process.
  • Disguised Ads: Disguised ads are adverts that are designed to look like something else and thus trick the user into clicking on them. For example, they’re often made to look like part of the page content or the navigation pane.
  • Privacy Zuckering: This term was introduced by the Electronic Frontier Foundation (EFF) and named after Facebook CEO Mark Zuckerberg. Privacy Zuckering involves persuading users to disclose more than they really want to. For example, Facebook was known for making its privacy settings deliberately confusing so as to get as much data from users as possible. The General Data Protection Regulation has since made it harder for companies to obtain data through fraudulent practices. For example, you now have to actively consent to the processing of your personal data.
  • Hidden Costs: Have you noticed that online shops often don’t tell you about taxes, delivery costs or other additional charges until you get to the very last page? These sites are relying on the fact that most users who have got this far will complete their order anyway.
  • Price Comparison Prevention: To make it hard for consumers to compare prices, online retailers often hide the individual prices of products, such as by selling bundles of goods or services without indicating the corresponding unit price. Mobile phone providers were well known for using this type of Dark Pattern as far back as the early 2000s.
  • Misdirection: The purpose of this Dark Pattern is to focus the user’s attention on one part of the page in order to distract them from another.
  • Forced Continuity: Lots of companies ask people to provide their payment details to activate free trial subscriptions. After the trial has ended, the subscription automatically switches to a paid version without any reminder being sent to the customer. What’s more, the companies usually make the cancellation process very confusing and time-consuming in the hope that the customer will just give up and let the subscription continue.
  • Friend Spam: Here, an app or product will ask a user to give their email address or social media details on the pretext of checking for friends on their behalf. However, once the user has approved the request, their email address is used to send all of their contacts a spam message advertising the company with the aim of attracting new users.
  • Confirm-shaming: This type of Dark Pattern aims to make the user feel bad about a particular decision. For example, on a message prompting you to sign up for a newsletter to get a 15% discount on your purchase, the “decline” button might be labeled “No thanks, I don’t want to save any money”.
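To make the “Sneak into Basket” mechanics concrete, here is a minimal, hypothetical sketch in Python. The add-on, its price, and the field names are invented for illustration and aren’t taken from any real shop; the point is simply that a checkbox preselected by the site, rather than ticked by the user, quietly inflates the total.

```python
# Hypothetical "Sneak into Basket" sketch. Prices are in cents to avoid
# floating-point rounding; the add-on and field names are invented.

GIFT_WRAP_CENTS = 499  # cost of the preselected add-on

def checkout_total(item_cents: int, form_fields: dict) -> int:
    """Total the order from the submitted checkout form (in cents).

    The shop renders the 'gift_wrapping' checkbox already ticked, so a
    user who skims past it and submits the defaults pays for the add-on
    without ever actively opting in.
    """
    total = item_cents
    if form_fields.get("gift_wrapping", False):
        total += GIFT_WRAP_CENTS
    return total

# The form exactly as the shop pre-fills it: the add-on is already ticked.
preselected_defaults = {"gift_wrapping": True}

print(checkout_total(2000, preselected_defaults))      # skimming user: 2499
print(checkout_total(2000, {"gift_wrapping": False}))  # user who unticks: 2000
```

An honest design would invert the default, so only an explicit opt-in adds the charge; the code change is one boolean, which is what makes this pattern so cheap to deploy.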

Real examples of Dark Patterns used online

Many Dark Patterns fall into a rather gray area as far as the law is concerned, and some are actually illegal. Either way, they leave something of a bad taste behind once you know about them. We’ve rounded up a few real-world examples to illustrate the tricks to watch out for.

Booking.com

Booking.com attempts to play on customers’ emotions by displaying messages like “Only 1 room left!” and including fully booked options in the search results. This type of notification is displayed even if it’s just the rooms allocated to Booking.com that are all taken, rather than every room in the hotel. Other similar alerts like “2 other people looked for your dates in the last 10 minutes” also impress a sense of urgency on potential bookers, making them think they might be about to miss out. The European Commission has stepped in and ordered Booking.com to strip its site of all such manipulative techniques by June 2020 at the latest.

Quote

“As a market leader, it is vital that companies like Booking.com meet their responsibilities in this area ...” – Didier Reynders, European Commissioner for Justice and Consumers. Source: https://ec.europa.eu/commission/presscorner/detail/en/ip_19_6812

LinkedIn

LinkedIn was behind what has become the best known example of Friend Spam. During the registration process, LinkedIn asked users to grant access to their email accounts, claiming that doing so would boost the new user’s career prospects by creating a “strong network”. In actuality, LinkedIn then used the new user’s email address to send invitation emails to all of their contacts. This behavior led to a class-action lawsuit, which LinkedIn settled for $13 million in 2015. Given the number of people on LinkedIn at the time, that worked out as around $10 per member.

Ryanair

In late 2010, Ryanair tried to use Dark Patterns to sell more travel insurance. At one stage of the booking process, users were presented with a field labeled “Buy AXA travel insurance”. Instead of a simple “Yes” or “No” choice, users had to pick their answer from a drop-down list of countries. At first glance, it therefore looked like you had to buy the insurance, but on closer inspection, hidden among the endless list of countries there was a “No Travel Insurance Required” option.
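The mechanics of that drop-down can be sketched in a few lines. This is a hypothetical reconstruction, not Ryanair’s actual form: the country list and function names are invented. The trick is that the only opt-out choice is alphabetized into the middle of the country list, and every other selection counts as a purchase.

```python
# Hypothetical reconstruction of a Ryanair-style insurance drop-down.
# The country list and helper names are invented for illustration.

OPT_OUT = "No Travel Insurance Required"

def build_insurance_dropdown(countries: list) -> list:
    """Return the drop-down options: an alphabetized country list with
    the single opt-out entry sorted somewhere into the middle of it."""
    return sorted(countries + [OPT_OUT])

def wants_insurance(selection: str) -> bool:
    """Every choice except the buried opt-out counts as buying insurance."""
    return selection != OPT_OUT

options = build_insurance_dropdown(
    ["Austria", "Belgium", "Denmark", "Norway", "Poland", "Spain", "Sweden"]
)

# The opt-out sorts between "Denmark" and "Norway": neither first nor
# last, and easy to miss in what looks like a plain list of countries.
print(options.index(OPT_OUT))  # 3

print(wants_insurance("Austria"))  # True: picking any country buys insurance
```

A neutral design would ask “Yes” or “No” in a control dedicated to that question; mixing the answer into an unrelated country list is exactly what turns an ordinary form widget into a Dark Pattern.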

Microsoft

When Microsoft launched Windows 10, they used the “Bait and Switch” type of Dark Pattern to encourage users to upgrade their operating system. In the Update Center, they presented Windows 10 as a “required update”, which wasn’t true at all. This triggered anger and outcry among users and earned the scandal the name “Upgradegate”.

For some more in­ter­est­ing examples of Dark Patterns, check out Harry Brignull’s Hall of Shame on Twitter.
