DSA should promote an open and fair digital environment, not undermine the rule of law

The Rapporteur of the European Parliament Committee leading one of the most important legal initiatives to regulate platforms has spoken. The Internal Market and Consumer Protection (IMCO) Committee's draft report on the Digital Services Act (DSA) turns online platforms into judge, jury and executioner when it comes to removing online content. This follows the same logic as the Copyright Guidelines that were presented last week. It also gives vast powers to the European Commission and national governments to suppress opposing voices.

The draft report proposes new provisions that would order online platforms to make legality assessments for thousands or even tens of thousands of pieces of content every day, without any public scrutiny or transparency. The report suggests that these decisions should be taken in less than 24 hours if the content could harm “public policy”, among other categories. The term “public policy” is left undefined, which raises serious concerns about possible abuse, especially by state authorities. This is particularly worrying for civil society, human rights defenders, journalists and political opponents.

Experience with similar provisions has shown that when online platforms face the threat of legal liability themselves, they over-remove content with little regard for fundamental rights. That is why such provisions have been declared unconstitutional in France and other countries.

“Content removal within 24 hours for ‘harm to public policy’ and similar vague concepts would allow governments to suppress any opposing content they want – without public scrutiny or transparency”, says Jan Penfrat, Senior Policy Advisor at European Digital Rights (EDRi).

The draft report proposes special protections for politicians and other “public interest accounts”. Before suspending these, platforms would first be obliged to obtain approval from a judicial authority. It is unclear what would count as a “public interest account”. The report thus recognises that Big Tech platforms cannot be trusted to reliably make this kind of take-down decision without judicial oversight, yet it subjects the rest of the population to exactly that.

“Politicians spreading disinformation or causing harm should not get special protection from arbitrary platform decisions,” argues Jan Penfrat.

As EDRi has advocated, instead of putting forward over-restrictive measures that risk harming the rule of law and fundamental rights, the Digital Services Act should establish clear rules and procedures to promote a healthy, open and fair digital environment. The law should contribute to reducing the power that Big Tech has over our lives, not bolster it.

This press release is a first reaction to the IMCO report. EDRi will publish a longer analysis shortly.

Contribution by:

Jan Penfrat

Senior Policy Advisor

Twitter: @ilumium