EDRi welcomes EU preliminary findings on TikTok’s addictive platform design

The European Commission preliminarily found that TikTok was in breach of the Digital Services Act (DSA) due to the addictive design of its platform. EDRi welcomes this decision and urges TikTok to swiftly mitigate the risks to which its users are exposed.

By EDRi · February 9, 2026

The EU’s warning shot on addictive platform design

On Friday, the European Commission preliminarily found TikTok in breach of the Digital Services Act (DSA) due to the addictive design of its platform. EDRi welcomes this decision: the DSA was designed precisely to address systemic risks of this kind, and this case has the potential to push platforms to rethink fundamental design choices rather than rely on easy but ineffective quick fixes.

The Commission’s findings highlight a long-standing concern: engagement-optimised design comes at the expense of users’ physical and mental well-being and can pose grave risks to democracy and public debate.

According to the Commission, the risks posed by features such as infinite scroll, autoplay, push notifications, and highly personalised recommender systems were not adequately assessed by TikTok. In addition, TikTok's existing risk mitigation measures were found to be insufficient, prompting the Commission to call for a rethinking of core design mechanics. It is essential that the Commission now quickly reaches a final decision and ensures that TikTok swiftly mitigates these risks.

This decision marks a significant step in applying the DSA to platform design itself, not only to content moderation. It sends a clear first signal that engagement-driven design can constitute a systemic risk, an issue we look forward to seeing addressed under the upcoming Digital Fairness Act (DFA).

As a necessary complement to the DSA, the DFA must extend protection across the digital environment as a whole, address manipulation by design as a default business practice, and ensure that fairness is built into digital services from the outset.

As EDRi has previously stressed, fairness must be embedded by default in all digital products and services, ensuring protection for society as a whole. This includes banning the most harmful manipulative design practices, requiring explicit opt-in for attention- and data-intensive features, mandating fairness-by-design obligations, and reversing the burden of proof so that companies must demonstrate that their designs do not exploit users. An ambitious DFA should make fairness measurable, enforceable, and structural across the digital environment, complementing the DSA and closing the gaps that systemic design risks currently leave unaddressed.