FTC study finds ‘dark patterns’ used by most subscription apps and websites

The U.S. Federal Trade Commission, in collaboration with two international consumer protection networks, on Thursday announced the results of a study on the use of “dark patterns,” or manipulative design techniques that can threaten users’ privacy or coerce them into buying products or services, or into taking other actions they wouldn’t otherwise take. In an analysis of 642 websites and apps that offer subscription services, the study found that most (about 76%) used at least one dark pattern and about 67% used more than one.

Dark patterns refer to a range of design techniques that can subtly encourage users to take some kind of action or put their privacy at risk. They are particularly popular among subscription websites and apps and have been an area of focus for the FTC in recent years. For example, the FTC sued dating app giant Match, alleging fraud that included using dark patterns to make it difficult to cancel subscriptions.

The release of the new report may indicate that the FTC plans to pay closer attention to this kind of consumer harm. The report comes at a time when the U.S. Justice Department is suing Apple over its alleged monopoly on the App Store, a marketplace that generates billions of dollars in billings and sales of digital goods and services, including those that come through subscription apps.

The new report, published on Thursday, discusses several types of dark patterns, including sneaking, obstruction, nagging, forced action, social proof, and interface interference.

The study found that sneaking was one of the most common dark patterns; in this context, it refers to the inability to turn off auto-renewal of a subscription during the sign-up and purchase process. Eighty-one percent of the sites and apps studied used this technique to ensure that their subscriptions automatically renewed. In 70% of cases, subscription providers did not provide information on how to cancel a subscription, and 67% failed to provide the date by which the consumer must cancel in order to avoid being charged again.

Obstruction is another common problem found in subscription apps; it makes a given action more difficult or tedious to perform, such as canceling a subscription or declining a free-trial offer whose “X” to close it is grayed out and partly hidden from view.

Nagging involves repeatedly asking a consumer to do something the business wants them to do. (Although it’s not a subscription app, TikTok offers an example of nagging: it repeatedly prompts users to upload their contacts to the app, even after they have declined.)

Forced action involves requiring the consumer to take some step in order to access specific functionality, such as entering payment details to take part in a free trial, something required by 66.4% of the websites and apps included in the study.

Social proof, meanwhile, uses the power of the crowd to influence a consumer to make a purchase, usually by displaying metrics related to some sort of activity. This is particularly popular in the e-commerce industry, where a company will display how many other people are browsing the same product or adding it to their cart. For subscription apps, social proof can be used to motivate users to enroll in a subscription by showing how many other people are doing the same.

The study found that 21.5% of the websites and apps it examined used notifications and other forms of social proof to encourage consumers to subscribe.

Sites may also try to create a sense of urgency to motivate consumers to make a purchase. This tactic is common on Amazon and other e-commerce sites, where shoppers are alerted to low stock and prompted to check out quickly, but it is less commonly used to sell subscriptions.

Interface interference is a broad category covering the ways an app or website is designed to push a consumer toward a decision that favors the business. This can include pre-selecting items, such as longer or more expensive subscriptions, as 22.5% of those studied did, or using a “false hierarchy” that presents the options more favorable to the business more prominently; the latter was used by 38.3% of the businesses in the study.

Interface interference can also include what the study refers to as “confirmshaming,” which uses emotionally loaded language to manipulate a consumer’s decision-making, such as a button that reads “I don’t want to miss out, subscribe me!”

The study was conducted from January 29 to February 2 as part of the International Consumer Protection and Enforcement Network’s (ICPEN) annual review and included 642 websites and apps that offer subscriptions. The FTC reported that it is assuming the chairmanship role at ICPEN for 2024-2025. In total, 27 officials from 26 countries participated in the study, which used dark pattern descriptions established by the Organization for Economic Cooperation and Development. However, their scope did not include determining whether any of the practices were unlawful in the participating countries; that is up to individual governments to decide.

The FTC participated in ICPEN’s review, which was also coordinated with the Global Privacy Enforcement Network, a network of more than 80 privacy enforcement authorities.

This is not the first time the FTC has investigated the use of dark patterns. In 2022, it published a report detailing a range of dark patterns that was not limited to subscription websites and apps. That earlier report looked at dark patterns across a variety of industries, including e-commerce and children’s apps, and covered other types of dark patterns, such as those used in cookie consent banners.


