The article “The Dark (Patterns) Side of UX Design” by Colin Gray et al. provides a compendium of techniques designed to induce online consumers to make purchases, commit to recurring subscription payments, or pay more than the initial price; to prevent consumers from considering competitors’ offerings; and to enable platforms to use consumers’ private information or monitor their online activity. Through studies of online platform designs, Gray et al. identify a range of tactics. Here’s a summary.
A website that invites me to “accept all cookies” usually wants to share my data with third parties. Were I to stop and review every “cookie settings” panel, or to read each set of conditions of service on offer, rapid browsing across multiple websites would become impossible. Nagging is Gray et al.’s term for the popup dialogues that websites and apps present and that I must click to dismiss. Often the only way to get rid of these persistent reminders is to select a button indicating I agree to some condition that may be unfavourable to me: less privacy, more advertising, oversharing, and so on.
The usual tactic, which Gray et al. call obstruction, is to make it very easy for the consumer to commit to a course of action but much more difficult to get out of it: starting a subscription takes a click or two, while cancelling it is far harder. I notice that reputable and considerate vendors boast the converse, turning easy returns into a unique selling point, if you can trust that claim.
I know some vendors try to pressure buyers into paying for extended warranties, but I never accept that additional cost. More common “sneaking” practices include those that require you to divulge credit card details before starting a free one-month subscription. It is easy to forget the time limit: beyond the month the customer starts paying, and by then I might have lost track of how to unsubscribe. In contrast to that practice, I have found it very easy to terminate subscriptions on patreon.com.
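The free-trial mechanism described above can be sketched as a toy model. The class, its field names, and the price are hypothetical illustrations, not anything from Gray et al.: the point is simply that billing begins automatically once the trial window lapses, with no re-confirmation from the customer.

```python
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class TrialSubscription:
    """Hypothetical sketch of a 'free trial' that converts to paid billing.

    The sneaky part: card details are captured up front, and charges
    start silently once the trial window lapses.
    """
    start: date
    monthly_price: float = 9.99  # illustrative figure
    trial_days: int = 30

    def amount_due(self, today: date) -> float:
        # Zero during the trial; quietly becomes a charge afterwards.
        if today <= self.start + timedelta(days=self.trial_days):
            return 0.0
        return self.monthly_price


sub = TrialSubscription(start=date(2024, 1, 1))
print(sub.amount_due(date(2024, 1, 15)))  # → 0.0  (still "free")
print(sub.amount_due(date(2024, 2, 15)))  # → 9.99 (billing has begun)
```

Nothing in the interface marks the boundary between the two calls, which is exactly why the time limit is so easy to forget.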
Gray et al. describe interface interference as “Manipulation of the user interface that privileges certain actions over others” (5). They see it as a diverse set of ways to confuse the user with hidden options, such as discreet checkboxes with pre-selected choices. I think there are many instances where interactions are unclear and confusing: interface designs and standards keep changing, and interfaces look different depending on the device you are using. Even enthusiasts who monitor online practices can succumb to confusion. I suspect that many platforms exploit the fact that not all consumers know what they are looking at.
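The pre-selected checkbox trick can be sketched in a few lines. The option names and the `submit_consent` helper are hypothetical, but they show how defaults privilege one action over another: a user who clicks “Accept” without opening the settings panel is opted into everything.

```python
# Hypothetical model of a consent dialog with pre-selected choices.
DEFAULT_CONSENT = {
    "essential_cookies": True,    # genuinely required
    "share_with_partners": True,  # pre-selected: the dark pattern
    "personalised_ads": True,     # pre-selected: the dark pattern
}


def submit_consent(overrides=None):
    """Return what the dialog records on 'Accept'.

    Only explicit overrides deviate from the pre-selected defaults,
    so inaction equals agreement.
    """
    consent = dict(DEFAULT_CONSENT)
    consent.update(overrides or {})
    return consent


submit_consent()                                # everything opted in
submit_consent({"share_with_partners": False})  # requires digging into settings
```

Opting out demands the extra effort of finding and unticking each box, which is precisely the friction the design relies on.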
Forced action is “Requiring the user to perform a certain action to access (or continue to access) certain functionality” (5). The example Gray et al. provide comes from gaming: in Candy Crush, the player is induced to aim for high scores, but to meet that aim the player needs to pay to boost their capabilities.
Gray et al. focus on the ethical responsibility of designers who face pressure to create systems that attempt to deceive consumers.
“Given that many persuasive strategies—or even dark patterns— can be used for good or ill, we must attend to how the selection of such strategies relates to UX designers’ ethical responsibility as practitioners, and how this exemplifies their design character” (9).
- Gray, Colin M., Yubo Kou, Bryan Battles, Joseph Hoggatt, and Austin L. Toombs. 2018. “The Dark (Patterns) Side of UX Design.” In CHI ’18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1–14. Montréal, Canada: ACM.