Five lessons from three years of risk assessments under the Digital Services Act
Under the Digital Services Act (DSA), Big Tech platforms are required to annually assess systemic risks tied to their services and implement measures to mitigate them. EDRi member, the European Center for Not-for-Profit Law (ECNL), analysed the first three rounds of these risk assessments, spanning from 2023 to 2025, and identified five major gaps. In their current form, these assessments are unlikely to provide meaningful transparency or accountability for decisions affecting millions of internet users, raising fundamental questions about their usefulness and future direction.
Why do DSA risk assessments matter?
The DSA obliges very large online platforms (VLOPs) and very large search engines (VLOSEs) – such as Instagram, TikTok, YouTube and Google Search – to conduct annual evaluations of systemic risks linked to their service design, functioning and usage, as well as measures taken to mitigate the identified risks. As part of this practice, platforms must, for example, assess risks including:
- dissemination of illegal content;
- negative effects on fundamental rights;
- negative effects on civic discourse and electoral processes;
- gender-based violence;
- protection of minors; and
- physical and mental well-being of users.
Summaries of these assessments are audited and published on the respective platforms’ websites, offering civil society, regulators and the public insight into how platforms operate, which risks they foresee, and the measures they take to safeguard users.
As of 2026, many platforms have completed three rounds of risk assessments. Civil society scrutiny of the first round already revealed several shortcomings, including an insufficient analysis of how platform design features, like recommendation algorithms, shape risks, and a lack of robust data on mitigation measures.
To deepen the understanding of whether risk assessments can be useful to external stakeholders, ECNL reviewed risk assessments published between 2023 and 2025 by five platforms – Facebook, Instagram, TikTok, YouTube and X – focusing on their evaluation of risks to fundamental rights. This analysis drew on the DSA requirements as well as established standards for human rights impact assessments, which had informed recommendations by ECNL and Access Now, published in 2023.
Three years, five critical gaps
While the latest assessments show some progress, major gaps still remain. Despite often exceeding 100 pages, reports still lack the necessary detail to fully understand the decisions of platforms and their potential impact on people’s fundamental rights.
ECNL has identified five overarching issues:
- Vague risk statements: reports often rely on general declarations rather than detailed assessments of identified risks.
- Insufficient attention to fundamental rights: platforms devote less attention to assessing risks to rights compared to other systemic risks.
- Irrelevant or missing data: metrics used are at times irrelevant, and platforms’ claims about risks or mitigation measures are often not supported by evidence.
- Limited stakeholder engagement: it remains unclear how consultation with civil society or affected communities has influenced risk identification or mitigation strategies.
- Neglect of EU diversity: risk assessments often fail to reflect the EU’s geographic and linguistic diversity.
These gaps raise key questions: How can risk assessments be strengthened? Do they truly serve the European Commission and civil society as meaningful oversight tools, or are they simply “tick-box exercises” filled with unverified claims?
Clear guidance and evidence are needed
Substantive improvements are essential for risk assessments to be genuinely useful. ECNL recommends that the European Commission provide clear guidance on the data that platforms must disclose and establish standards and expectations for reporting. Without this clarity, future assessments risk repeating the same shortcomings.
Platforms themselves must also take responsibility by providing evidence-backed insights into risks and the effectiveness of mitigation measures. While the DSA grants civil society researchers some access to platform data, this alone will not resolve the gaps. Practical barriers remain: some platforms may restrict access (as we have seen, for example, in the case of X), and data analysis requires technical expertise that many organisations lack.
Will risk assessments deliver on their promise?
The effectiveness of DSA risk assessments depends on addressing these shortcomings. If they are addressed, risk assessments can become powerful tools for transparency and accountability, helping civil society and regulators to understand and mitigate online risks. Otherwise, risk assessments may remain superficial compliance exercises, offering little insight into the real-world impact of platforms on fundamental rights.
Contribution by: EDRi member, European Center for Not-for-Profit Law (ECNL)
