On 27 November 2018, seven members of the European Consumer Organisation (BEUC) launched complaints with their national Data Protection Authorities (DPAs) about Google potentially infringing the General Data Protection Regulation (GDPR). The complaints address an abusive business practice that is unfortunately widespread on the internet: designing websites and interfaces in a way that makes turning off privacy-intrusive settings much harder than turning them on.
In the case of Google’s processing of location data, BEUC has, for instance, identified the hiding of rejection options in remote corners of the settings, and the way that the click flow guides customers through the setup of a Google product, as infringements of the GDPR’s “informed consent” principle. The design of so-called “choice architectures”, meaning the way in which different choices are presented to an individual, is in fact not endemic to the digital world: Originating in behavioural science, the so-called “nudging” of people towards certain (ideally benign) behaviours is, on the contrary, frequently applied in very different spheres, ranging from architecture to social policies. But while the ethics of nudging are already heatedly debated when it comes to practices that supposedly benefit individuals, choice architectures that are designed to lure citizens into agreeing to terms they are not fully informed about seem to be nothing less than intentional deception.
What makes this all worse in the online context is that the surreptitious guiding of individuals towards accepting invasive privacy settings, without properly informing them of how to object, is only the first nudge of many: From shopping online to political debates, a vast set of economic and political actors has a deep interest in subtly pushing what citizens buy, do, see and say online. Being misled in our privacy choice architectures is therefore only the beginning of being misled by more and more nudges that are often, without our agreement, personalised to us.
Cases such as Google’s practices with regard to location data emphasise that the invasiveness and micro-targeting of the online tracking industry begin with the lack of transparency and the asymmetry of the information and choices presented to individuals on the internet. Citizens’ anxieties and fears of manipulation, deriving from malicious practices, will ultimately drag responsible and irresponsible businesses down alike: In recent Eurobarometer surveys, 67% of internet users were concerned that the personal data people leave on the internet is used to target the political messages they see, and 40% of citizens avoid certain websites because they are worried about their activities being monitored.
If trust is not restored in the Digital Single Market through fair and respectful business models that rely on informed and meaningful consent, the European data economy will stay below its economic potential. It is therefore high time, and of crucial importance, that the Data Protection Authorities that received complaints from BEUC now send a clear signal that, with the introduction of the GDPR, manipulation is not, and can never be, informed consent.
NCC publishes a report on tech companies’ use of “dark patterns” (27.06.2018)
My Data Done Right launched: check your data! (07.11.2018)
The GDPR Today – Stats, news and tools to make data protection a reality (25.10.2018)
(Contribution by Yannic Blaschke, EDRi intern)