NCC publishes a report on tech companies’ use of “dark patterns”
Today, the Norwegian Consumer Council (NCC), a consumer group active in the field of digital rights, published a report on how default settings and “dark patterns” are used by tech companies such as Facebook, Google and Microsoft to nudge users towards privacy-intrusive options.
The term “dark patterns” refers to practices used to deliberately mislead users through exploitative nudging. The NCC describes them as “features of interface design crafted to trick users into doing things that they might not want to do, but which benefit the business in question, or in short, nudges that may be against the user’s own interest”.
The General Data Protection Regulation (GDPR) requires services to be developed according to the principles of data protection by design and data protection by default, and obliges companies to make lawful use of their users’ data. With the entry into application of the GDPR last May, the three companies had to update the conditions of use of their services, which they did using a wide variety of “dark patterns”. The report focuses on five of these, which overlap with each other and together form the big picture of how companies mislead users into “choosing” invasive options instead of data protection-friendly ones. This is done by putting in place the following mechanisms:
1. Default settings
Facebook and Google hide and obscure the privacy settings, making the most intrusive options much easier to see and accept.
2. Taking users by the hand to mislead them
Usually, the services push users to accept unnecessary data collection through a combination of positioning and visual cues. Facebook and Google go a step further by requiring a much larger number of steps to limit data collection, in order to discourage users from protecting themselves.
3. Invasive options go first
All three companies presented the settings that maximise data collection as the positive option, creating doubt in users and even ethical dilemmas. The companies do not explain the full consequences of users’ choices, but frame their messages around the theoretical benefits of allowing wider data collection, such as an improved user experience.
4. Rewards and punishments
A typical nudging strategy is to use incentives to reward the “right” choice, and punish choices that the service provider deems undesirable. The reward is often described as “extra functionality” or a “better service” (without making clear what this means in practice), while the punishment might be the loss of functionality or even the deletion of the account if users decline, which has been the strategy of Facebook and Google.
5. Time pressure
When it came to completing the settings review, all three services pressured users to complete it at a time determined by the service provider. This was done without a clear option for the user to postpone the review, and without making clear whether the user could continue using the service in the meantime.
The report concludes that these service providers merely give users the “illusion of control” while nudging them towards the options most desirable for the companies.
Read more:
DECEIVED BY DESIGN: How tech companies use dark patterns to discourage us from exercising our rights to privacy (27.06.2018)
https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf
GDPR: noyb.eu filed four complaints over “forced consent” against Google, Instagram, WhatsApp and Facebook (25.05.2018)
https://noyb.eu/wp-content/uploads/2018/05/pa_forcedconsent_en.pdf
GDPR explained
https://gdprexplained.eu/
(Contribution by Maria Roson, EDRi intern)