Breaking the extractive digital business model: a rights-based Digital Fairness Act

EDRi’s new position paper addresses the growing threat of manipulative and unfair platform design in the EU’s digital environment. It examines how deceptive interfaces, exploitative personalisation, and addictive design practices are embedded in today’s digital economy and why existing laws fail to tackle their structural roots. Grounded in a rights-based analysis, the paper argues that the Digital Fairness Act must deliver systemic change by embedding fairness by design and by default into the digital infrastructure that shapes our lives.

By EDRi · February 23, 2026

Data extraction and behavioural profiling as the engine of digital power

Digital environments today are driven by pervasive data extraction and behavioural profiling, designed to maximise engagement, capture attention, and exploit any vulnerability for commercial gain.

These practices are not incidental features of online services; they are central to their economic model and they shape the architecture of digital spaces. Interfaces are engineered to trigger psychological responses, often described as addictive design. Users are nudged, steered, or misled into decisions they would not otherwise make, practices widely known as deceptive design. At the same time, personalisation systems target content and services in ways that exploit users’ specific vulnerabilities, a form of unfair personalisation.

Despite growing awareness and mounting evidence, these practices remain largely unregulated in the EU. The result is a digital ecosystem where users are manipulated by default, fairness is treated as optional, and the power imbalance between platforms and people continues to deepen.

This is why EDRi is releasing a new position paper on the Digital Fairness Act (DFA), a vital opportunity to structurally address manipulative and unfair design practices online. Drawing from a fundamental rights-based analysis, the paper shows how today’s digital systems structurally exploit users and offers a clear path forward to embed fairness by design and by default in EU law.

Read the position paper

From nudging to engineering behaviour

Digital services are intentionally designed around behavioural prediction and manipulation. As a result, all users are structurally exposed to harm, and vulnerability is built into the system itself.

When recommender systems and personalisation engines operate as opaque gatekeepers to visibility, opportunity, and information, meaningful participation is compromised. Digital systems are essential infrastructures that shape how people learn, work, communicate, and participate in society. Users can be unknowingly steered toward polarising or distressing content, pressured into revealing sensitive information, or systematically excluded from access to opportunities.

At a broader level, these systems reshape our social contract, reproduce bias, deepen social divides, and distort the information ecosystems on which democracy depends, often invisibly and with limited accountability.

The boundaries between user and citizen, consumer and worker, are steadily eroding. For this reason, regulation cannot be confined to data protection or consumer rights alone. The stakes are democratic, economic, and social.

The case for new legislation

In this paper, EDRi makes the case for a Digital Fairness Act as a structural intervention that rebalances power in the digital environment and establishes fairness by design and by default as a legal requirement.

Existing legislative frameworks address some elements of these harms but only partially and reactively. They tend to focus on visible surface-level unfairness – misleading terms, flawed consent mechanisms, insufficient transparency – while leaving largely untouched the underlying infrastructure that enables and sustains these harms.

Asymmetries of knowledge between platforms and users, behavioural targeting as a default business model, opaque algorithmic systems that structure visibility and opportunity, and interface designs that undermine meaningful consent are not peripheral issues. These are central features of platform power, and were identified by the European Commission itself in the Consumer Law Fitness Check as urgent areas requiring regulatory intervention.

The Digital Fairness Act must address the underlying architecture of manipulation and exploitation in digital platforms and establish a new baseline for justice, accountability, and user autonomy. It must act as a structural intervention, establishing fairness by design and by default, not as an optional feature, a compliance exercise, or a marketing claim.

We are calling on the European Commission to deliver a bold and future-proof Digital Fairness Act, one that reins in manipulative design, reverses power asymmetries, and affirms that fairness is not negotiable in the digital age. If digital systems function as essential social infrastructure, fairness must be built into their foundations rather than retrofitted after harm has already occurred.