Digital Services Act: EU Parliament’s key committee rejects a filternet but concerns remain
The European Union's Digital Services Act (DSA) is the most significant reform of Europe's internet platform legislation in twenty years. In this week's vote, the European Parliament's key committee on internal market affairs (IMCO) rejected mandatory upload filters and took a stance for the protection of fundamental rights — but several concerns remain.
Fix what is broken vs break what works: oscillating between policy choices
The European Union’s Digital Services Act (DSA) is a big deal. It’s the most significant reform of Europe’s internet platform legislation in twenty years, and the EU Commission has proposed multiple new rules to address the challenges brought by the increased use of services online. While the draft proposal got many things right — for example by setting standards on transparency, proposing limits on content removal, and allowing users to challenge censorship decisions — signals from the EU Parliament showed that we were right to worry about where the DSA might be headed. Leading politicians suggested a dystopian set of rules that would promote the widespread use of error-prone upload filters. Proponents of rules that would make any “active platform” (which one isn’t?) potentially liable for the communications of its users showed that not everyone in Parliament had learned from the EU’s controversial Copyright Directive, which turns online platforms into the internet police, with a special license to scan and filter users’ content.

Together with our allies, we called on the EU Parliament to reject the idea of a filternet made in Europe and to refrain from undermining pillars of the e-Commerce Directive that are crucial in a free and democratic society. Because any step in the wrong direction could reverberate around the world, we also added international voices to the debate, such as the Digital Services Act Human Rights Alliance, which stands for transparency, accountability, and human rights-centered lawmaking.
Committee vote: Parliament listened to civil society voices
In this week’s vote, EU members of Parliament (MEPs) showed that they listened to civil society voices: Even though the key committee on internal market affairs (IMCO) did not follow in the footsteps of the ambitious DSA reports from last year, MEPs took a stance for the protection of fundamental rights and agreed to:
- Preserve the liability exemptions for internet companies: Online intermediaries will continue to benefit from the “safe harbor” rules, which ensure that they cannot be held liable for content provided by users unless they know it is illegal and fail to act against it (Art 5);
- Uphold and strengthen the ban on mandated monitoring: Under current EU internet rules, general monitoring of information transmitted or stored by intermediary service providers is banned, guaranteeing users’ freedom of expression and their right to the protection of personal data, as enshrined in the EU Charter of Fundamental Rights. MEPs preserved this key principle and clarified that monitoring obligations should be imposed neither by law nor de facto, through automated or non-automated means (Art 7(1));
- Abstain from introducing short deadlines for content removals: The Committee recognised that strict and short time frames for content removals, as imposed under dangerous internet legislation like Germany’s NetzDG or the EU’s controversial Copyright Directive, lead to the removal of legitimate speech and opinion, thus impinging on the right to freedom of expression;
- Not interfere with private communication: The Committee acknowledged the importance of privacy online and rejected measures that would force companies to indiscriminately analyse and monitor users’ communication on private messaging services like WhatsApp or Signal. Even though Parliamentarians were not ambitious enough to call for a general right to anonymity, they agreed that Member States should neither prevent providers of intermediary services from offering end-to-end encrypted services nor impose monitoring measures that could limit the anonymous use of internet services (Art 7(1b)(1c)).
We welcome these agreements and appreciate the committee’s commitment to approving important procedural justice rights for users as recommended by EFF, such as the reinstatement of content and accounts that have been removed by mistake (Art 17(3)), and the important territorial limitation of take-down orders by courts or authorities (Art 8(2b)), which makes clear that one country’s government shouldn’t dictate what residents of other countries can say, see, or share online.
We also applaud the Committee’s strong focus on transparency (platforms must explain how content moderation works and report the number of content moderators allocated for each official language) and its strengthening of risk assessment mechanisms (platforms must take into account language- and region-specific risks when assessing the systemic risk resulting from their service). Lastly, we commend the Committee’s inclusion of an ambitious prohibition on dark patterns: platforms are banned from using misleading design or functionality choices that impair users’ ability to control and protect their internet experience.
Concerns remain: enforcement overreach, trust issues, and walled gardens
However, we continue to be worried that the DSA could lead to enforcement overreach and assign trust to entities that shouldn’t necessarily be trusted. If the DSA becomes law, online platforms would be required to hand over sensitive user information to non-judicial authorities on request. While we acknowledge the introduction of procedural safeguards — platforms would be granted the right to an effective remedy — essential human rights guarantees are still missing. Other sections of the bill, even though they come with a number of positive changes compared to the EU Commission’s original proposal, still favour the powerful. The text still allows the status of “trusted flagger” to be awarded to law enforcement agencies or profit-seeking industry organisations, whose notices must be given priority over those submitted by users. Even though the conditions for becoming a trusted flagger were tightened and accompanied by comprehensive reporting obligations, further improvements are necessary.
Parliamentarians also failed to follow the lead of their colleagues, who recently took a first step towards a fair and interoperable market in their vote on the Digital Markets Act (DMA). Whereas DMA amendments called for functional interaction between messaging services and social networks, MEPs did not back key provisions that would ensure interoperability of services and instead settled for a non-binding political declaration of intent in the DSA.
Only incremental improvements were made to the limits on surveillance advertising, and the requirement that platforms appoint in-country legal representatives remains unaffordable for many small non-EU providers. We also agree with the criticism that centralising enforcement power in the hands of the EU Commission comes with democratic deficits and could lead to corporate capture. Further aspects of the committee’s position require re-working, such as the last-minute approval of mandatory cell phone registration for creators of pornographic content, which threatens digital privacy and risks exposing sensitive information about vulnerable content creators to data leaks.
We will analyse the details of the committee position in the coming weeks and work to ensure that EU lawmakers agree on a text that preserves what works and fixes what is broken.
This article was first published by EDRi member EFF.
(Contribution by: Christoph Schmon, International Policy Director at EDRi member EFF)