A fair digital future at risk: EDRi’s contribution to the Digital Fairness Act
The European Commission closed its Call for Evidence for the upcoming Digital Fairness Act (DFA) on 24 October 2025. EDRi urged the Commission to tackle deeply harmful forms of manipulation: addictive design, deceptive design, and unfair personalisation, which undermine people’s fundamental rights to privacy, data protection, autonomy and equality. EDRi calls for strong, binding rules that embed fairness-by-design, ban exploitative features, and reinforce Europe’s digital rulebook against growing deregulatory pressure.
What is the Digital Fairness Act (DFA)?
The Digital Fairness Act (DFA) is the EU’s forthcoming law to modernise consumer protection for the digital age. It follows the Digital Fairness Fitness Check, an evaluation by the European Commission which found that current EU rules, such as the Unfair Commercial Practices Directive and Consumer Rights Directive, cannot adequately address the techniques companies now use to shape people’s behaviour online.
Today’s online environments are filled with systems that continuously learn how to keep users engaged, spending, and sharing data. This often happens through design choices that manipulate attention, emotion, or decision-making rather than supporting free and informed choices. The DFA aims to close these gaps, ensuring that digital products and services respect people’s rights rather than exploit their vulnerabilities.
What counts as manipulative design?
Manipulative design refers to a range of interface tricks or features intentionally built to steer people toward decisions that benefit the service provider rather than the person using it. These include:
- Addictive design exploits psychological triggers to maximise engagement – for example, infinite scroll, variable rewards, or streaks that pressure users to return.
- Deceptive design hides or complicates key information, makes refusal difficult, or misleads through visual tricks, pre-ticked boxes, or confusing toggles.
- Unfair personalisation relies on profiling to target or exclude people, shaping what they see, pay, or can access. It can exploit emotional, financial, or situational vulnerabilities, often without meaningful consent or transparency.
These tactics are not minor nuisances. They distort choice, reinforce inequality, and undermine people’s ability to act freely and confidently online.
Why this matters for EDRi
For EDRi, fairness in the digital environment is inseparable from fundamental rights. Manipulative and exploitative design normalises control and surveillance. It affects not just how people shop, but how they interact, work, and participate in public life.
The DFA is essential to make fairness a structural duty rather than a voluntary promise. It can bridge gaps between existing laws and ensure that design and personalisation respect autonomy, accessibility, and equality by default.
What EDRi is calling for
In its contribution, EDRi urged the Commission to adopt a strong, enforceable, and coherent Regulation.
Key recommendations include:
- Bans and grey lists for manipulative design: The Commission must update the Unfair Commercial Practices Directive’s Annex I to explicitly prohibit the most harmful practices, and it must create a grey list of borderline tactics presumed unfair unless the trader proves otherwise.
- Fair defaults: Explicit consent must be required for attention- and data-intensive features such as tracking, personalised pricing, or algorithmic engagement. People should never have to ‘opt out’ of exploitation.
- Build fairness into design: Mandate fairness-by-design and fairness-by-default obligations, requiring companies to prevent manipulation and ensure accessible, symmetrical, and genuinely free choices throughout design.
- Reverse the burden of proof: When manipulation is plausible, traders must show compliance, given the structural information asymmetry between people and platforms.
- Modernise definitions: Update and use precise definitions of core concepts that reflect today’s digital realities. Outdated notions like the ‘average consumer’ no longer capture how people actually interact with adaptive systems, nor how situational or structural vulnerabilisation affects autonomy and consent.
- Require Behavioural Design Impact Assessments (BDIAs): Traders must document and demonstrate that their designs are not manipulative, and share that evidence with regulators.
- Coordinate enforcement across regulators: Allow joint investigations and consistent enforcement across consumer, data-protection, and competition authorities, and potentially others.
- Ensure legal coherence: The DFA must reinforce, rather than weaken, existing rights under the GDPR, ePrivacy, the DSA, the DMA, and the AI Act.
Together, these and other measures would make fairness measurable, auditable, and enforceable, transforming it from a principle into a lived reality.
The deregulatory risk
The DFA comes at a deregulatory political moment. Across the EU, ‘simplification’ is increasingly being used as shorthand for lowering protections and weakening accountability. Proposals to reopen or soften flagship laws reveal how the language of flexibility often hides an agenda of deregulation. The same pressure could easily affect the DFA.
The Digital Fairness Fitness Check estimated that unfair practices cause at least EUR 7.9 billion in consumer harm each year, excluding time loss and mental distress. Yet the true impact is far greater than financial loss. Manipulative and exploitative design directly infringes people’s rights, for instance to privacy, data protection, autonomy, equality, freedom of thought, and participation in public life. It shapes how people act and decide, reproduces structural discrimination, and entrenches dependency on platforms that profit from vulnerability.
Simplification must never mean dilution. The DFA must strengthen, not trade away, the protection of people’s rights, ensuring that fairness in the digital environment is treated as a matter of justice, not convenience.
A chance to make fairness structural
If ambitious, the DFA can bridge the gap between consumer, data-protection, and platform regulation. It can make fairness a structural feature of Europe’s digital landscape, ensuring that design serves people, not the other way around.
By embedding fairness-by-design duties, enforcing bans and opt-ins for manipulative features, and requiring transparent, auditable accountability, the DFA could rebuild trust and strengthen Europe’s digital rulebook as a whole.
EDRi and its members will keep advocating for a robust, rights-based, and ambitious DFA that makes fairness an everyday reality for everyone.
