Unpacking digital fairness: What Europe must do now to end the tech industry’s most nefarious tactics

The EU plans to propose a Digital Fairness Act to better protect consumers from deceptive design practices, social media addiction, and pervasive online tracking. We unpack what this means and what the European Commission should do to end Big Tech’s most nefarious tactics.

By EDRi · October 24, 2024

What is the Digital Fairness Fitness Check report and how will it affect the upcoming Digital Fairness Act?

On 3 October, the European Commission unveiled its highly anticipated Digital Fairness Fitness Check report. The report is part of an ongoing effort to evaluate whether current EU legislative measures align with consumer protection policy goals. It assesses the impact of three Directives that form the foundation of EU consumer protection laws:

  1. the Unfair Commercial Practices Directive,
  2. the Consumer Rights Directive, and
  3. the Unfair Contract Terms Directive.

The report is likely to have a major influence on the Commission’s planned Digital Fairness Act proposal, which will have considerable impact on our digital rights.

Okay, what does the Fitness Check report say?

The report stops short of recommending concrete actions but still paves the way for the planned Digital Fairness Act. It is expected that this Act will attempt to tackle harmful commercial practices, such as deceptive and addictive interface designs, influencer marketing, and online profiling through which tech corporations exploit people’s vulnerabilities for profit. This development also sounds the death knell for the ePrivacy Regulation proposal, which will probably be divided into three separate legislative frameworks: one on web cookies, one on digital advertising, and one on data retention.

While dividing up these topics might help break the deadlock that the ePrivacy Regulation faced in the Council, this fragmentation threatens to obscure the substantial overlap between state and corporate surveillance and the need to address both sectors’ exploitation of people’s personal data.

The Fitness Check report argues that the effectiveness of existing consumer laws is compromised by issues such as legal uncertainty and fragmentation. These challenges are exacerbated by the speed with which tech companies are throwing new products and services onto the market without properly assessing their impact on society and, in particular, fundamental rights and freedoms.

The report also highlights a growing boldness in sophisticated non-compliance by tech firms, something EDRi has been denouncing for years. Many private actors, notably Big Tech companies and ad tech intermediaries, have become increasingly adept at creating the illusion of compliance with the law while sidestepping real accountability in the name of ‘innovation’ or ‘competitiveness’. A common example of this compliance illusion is the so-called cookie banner, an invention by the ad tech industry that gives people the appearance of a choice while their personal data continues to be exploited regardless of whether or not they consent.

Big Tech’s nefarious practices have become widespread

The Fitness Check report identifies four critical categories of problematic practices often found in consumer-facing apps and online platforms that require targeted attention and reflect the growing power imbalance between tech firms and users:

  1. Deceptive design: Manipulative interface designs and functionalities that mislead users into making choices they might not otherwise have made, often undermining informed consent. These practices frequently leverage behavioural data to guide users toward decisions that benefit companies first.
  2. Addictive design: Strategies that exploit human psychology, encouraging compulsive use and diminishing users’ autonomy over their digital habits. Tracking users’ behavioural patterns enables companies to optimise these designs and further entrench users in cycles of addiction.
  3. Personalisation: Intrusive techniques that exploit intimate knowledge about individual users in order to target them with specific content and advertisements. They give companies a distinct advantage in manipulating users, exploiting their vulnerabilities and discriminating against them. Personalisation raises concerns around privacy, autonomy, and fairness, especially when profiling occurs without clear consent or transparent information about how personal data is being used.
  4. Social media commerce: Influencer marketing and other commercial practices that often lack transparency and make it difficult for consumers to discern genuine content from promotional material, with a particularly harmful impact on minors and mental health.

The above practices are widespread and underscore the urgent need for stronger regulation to ensure that digital rights are upheld, including privacy, data protection, accountability, transparency, and user empowerment in online spaces. Consider, for instance, the ‘Pay or Okay’ model, where individuals must either pay for their right to privacy or accept inferior privacy in exchange for nominally free services. Pay or Okay reinforces inequalities and undermines privacy as a basic entitlement for everyone, regardless of their financial means. The model disregards the established frameworks for data protection, competition, and consumer law.

Consumers are people with rights beyond consumer law

The EU’s digital fairness initiative is welcome and should be used to develop a clear and enforceable framework that ensures strong consumer protection, in line with the EU’s digital fairness goals. EDRi fully supports stronger consumer protection standards, and commends BEUC’s efforts in advocating for a fairer ecosystem for consumers and pushing for a comprehensive legislative framework (PDF). However, we also recognise that people are more than just consumers; these practices have far-reaching impacts on individual and collective rights, as well as on the overall well-being of society and democracy.

That is why this initiative should directly tackle the massive commercial surveillance industry and its exploitative market practices, which tech corporations have so successfully and deeply embedded into our everyday lives. Take, for example, the Cookie Pledge, an initiative by the European Commission aimed at promoting positive change in commercial surveillance by encouraging companies to reduce intrusive tracking. Although the initiative had many positive aspects in protecting users’ privacy and empowering them with greater control over their data, the Pledge ultimately failed to achieve its goals, mostly because its voluntary nature depended on companies’ willingness to comply, and many showed little interest in doing so. This is why new digital fairness legislation must include all the parts of the ePrivacy Regulation proposal that are necessary to effectively regulate commercial surveillance.

Here are the principles that should guide this effort, in line with EDRi’s demands regarding the ePrivacy Regulation proposal and our past work on tracking advertising:

  • Exploitative practices like tracking ads and tracking walls should be phased out, and other forms of user tracking should be robustly regulated.
  • Processing of electronic communications data, i.e. information relating to messages and interactions sent or received through digital devices and platforms, should be limited to strictly defined, legitimate, and proportionate purposes, ensuring that users have clear knowledge of how their data will be used and the ability to withdraw their consent at any time without detriment.
  • Deceptive design practices across all digital platforms and services should be prohibited in favour of design standards that respect people’s rights, foster trust, and promote a user-centred digital ecosystem.
  • Manipulative design features that foster addiction, particularly those targeting vulnerable groups like children and adolescents, should be prohibited.
  • Personalisation practices should be transparent, user-centric and rooted in meaningful consent. This requires clear disclosures about how personal data is being used to tailor content, services and advertisements, as well as genuine user control over the extent and nature of the personalisation. Users should have easy access to opt-out mechanisms without facing any form of coercion or manipulation. Any personalisation must be non-discriminatory and not exploit vulnerabilities or reinforce biases.
  • In order to increase legal certainty, any regulatory framework should include clear and precise definitions that uphold the integrity of the Court of Justice’s case law and prevent loopholes that can be used for non-compliance.
  • Simplifying existing rules should never mean lowering the bar with regard to fundamental rights. Rather, it should focus on clarifying and streamlining regulations to ensure they are precise in their definitions, accessible to all, and free of loopholes that weaken their enforcement.

The European Commission’s Digital Fairness Fitness Check report serves as a critical reminder that the digital landscape demands our unwavering attention. Issues such as deceptive and addictive interface designs and invasive tracking practices are not just technicalities; they directly impact the enjoyment of fundamental rights in the daily digital lives of billions of people.

That is why EDRi advocates for robust regulations that address the complex web of commercial surveillance, protect individuals from data-driven exploitation, and foster a fair and just digital space for everyone.

Itxaso Domínguez de Olazábal

Policy Advisor

Jan Penfrat

Senior Policy Advisor