Warning: the EU’s Digital Services Act could repeat TERREG’s mistakes
On 30 September, the Committee on Legal Affairs (JURI) in the European Parliament approved its draft report on a Single Market For Digital Services (Digital Services Act). We had expressed our concerns about the negative fundamental rights implications of some of the measures proposed in the report to the Members of the European Parliament. However, this did not stop the JURI Committee from giving them its green light.
The regulatory race between Member States of the European Union (EU) over who could propose the fastest fix to online hate speech or terrorist content has produced a number of legal responses that comply neither with the EU Charter of Fundamental Rights nor with their very own national constitutions. The EU itself did not remain immune to such developments. Significant political pressure from a few national governments eventually bore fruit, and in 2018 the European Commission introduced the first text of the regulation on terrorist content online, commonly referred to as ‘TERREG’ and officially known as the Terrorist Content Online (TCO) Regulation. The controversial proposal, full of short-sighted “solutions” to deeply complex societal issues such as terrorism and online radicalisation, drew sharp criticism from international human rights bodies and civil society organisations. While the joint fight against the dangerous measures this Regulation entails achieved a lot of damage control over the last two years (see the timeline), in spring 2021 the European Parliament adopted Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online without a final vote in plenary.
With the launch of the proposed Digital Services Act, a new hope was sparked for digital rights advocates in the EU and beyond. In contrast to the TCO Regulation, the DSA focuses on regulating the systems and processes deployed by platforms rather than targeting specific categories of illegal content online. This novel approach is reflected, for instance, in newly established due diligence safeguards and strong criteria for meaningful transparency. However, the ongoing negotiations in the European Parliament have so far demonstrated how EU co-legislators refuse to learn from their past mistakes. Once again, we saw measures such as unduly short time frames for content removals or weakened judicial oversight appear in the draft reports of the main parliamentary committees as well as in proposed compromise amendments. The purpose of this blog post is to repeat what has been said many times before: silver-bullet solutions to stop the spread of illegal content online will harm the fundamental rights of individuals and ultimately miss their mark. The DSA gets this, and so should the European Parliament before it dismantles a proposal that could be a turning point in content governance, with the potential to shape the regulatory response to the same issues globally. The following analysis offers Members of the European Parliament examples of what not to do, and why.
Same old but different – when TERREG meets DSA
1. Unjustified and disproportionate time frames for content removals
In the last two years, there have been heated debates in the European Parliament on the TCO Regulation’s content removal deadline, and on whether a one-hour time frame is feasible for providers required to remove or block terrorist content from the internet. It was clear that the one-hour time frame is disproportionate and unworkable, especially for smaller platforms and European start-ups, which do not have the resources to act so expeditiously. The one-hour time frame would also have prevented platforms from seeking a prior decision from a court or any independent authority, especially because it was combined with severe sanctions for failing to comply with removal orders. The strict one-hour time frame in its original form was finally withdrawn from the text, only to pop up again in a DSA IMCO amendment, which requires platforms to proactively remove illegal content within 30 minutes when that content pertains to the broadcast of a live sports entertainment event. In other cases, the amendment proposed by the DSA rapporteurs gives providers 24 hours to disable access to illegal content, such as content that could harm public policy. A vague definition of public policy could seriously harm the work of investigative journalists or whistleblowers. The DSA rests on the same flawed assumption as the TCO Regulation: it leaves platforms, and not the independent judiciary, as the arbiters of online speech. The legality of content should not depend on the judgement of platforms and their moderators.
It is time to realise that requiring the removal of allegedly illegal content within such short time frames, without the possibility of turning to a court, will ultimately hand platforms extra power without proper oversight. Decisions about content removal have significant implications for society and for individuals: they have a censoring effect and limit users’ freedom of speech. The minimum requirement is proper oversight and preconditions that safeguard fundamental rights. The removal of content should never be left entirely to companies.
2. Enforcement of companies’ private rules instead of the law
The TCO Regulation is one instrument among many others in the field of content regulation that further encourages and coerces intermediaries to police people’s online content. It introduces obligations (called “specific measures”) for hosting service providers to “voluntarily” restrict online speech on the basis of their terms of service – outside the legal framework. From a fundamental rights perspective, it is highly problematic that restrictions on rights and freedoms are left to the discretion of private companies and are no longer based on the law. Voluntary measures are arbitrary, non-accountable, difficult to contest, and often lead to the over-removal of perfectly legal content. On the most common online platforms, they also imply the use of automated tools to monitor and filter all content all the time, thus increasing the number of wrongful takedowns.
Proposals to put private platforms in charge of policing online content are very much part of the debate on the DSA. However, contrary to the limited scope of the TCO Regulation, service providers would be expected under the DSA to search for and delete any type of content potentially illegal under EU or national law. Given the plurality of, and divergences among, national laws regulating freedom of expression, companies can be expected to play it safe and ban a wider range of content than would be strictly necessary and proportionate. This undemocratic system of corporate censorship must be prevented in the future legislation.
3. Lack of independent judicial oversight
Proper safeguards are needed whenever a decision limits individuals’ fundamental rights. Judicial oversight is one of these safeguards. While the legislator missed the opportunity in the TCO Regulation to require independent judicial oversight of removals of users’ content, the very same problem emerges from the DSA amendments submitted by the responsible IMCO committee. Competent authorities can and should support platforms when needed, but these authorities do not protect freedom of expression to the same extent as the judiciary. There is an additional problem with competent authorities, namely their lack of independence in several EU Member States. The government-captured media authority in Hungary is a telling example of biased decision-making. That is why judicial independence and legal certainty are basic requirements for content removal.
However, the DSA goes even further and requires platforms to proactively inform law enforcement if a “serious criminal offence is likely to take place”, even though platforms and moderators are neither trained lawyers nor able to assist victims of crime. Such a requirement will push platforms to censor user content at the slightest hint of illegality in order to avoid any liability.
Hold your breath! The TERREG story ain’t over in the EU
The ongoing DSA negotiations uncannily resemble the TCO Regulation debates. We urge the EU co-legislators to learn from their past mistakes and to act in full compliance with the EU Charter of Fundamental Rights and the constitutional traditions of the EU Member States.
While not perfect, the DSA proposal symbolises a shift away from the old rhetoric of content governance, which focused on the symptoms rather than the causes of societal problems and relied on non-transparent actions by online platforms to make decisions that heavily impact human rights and democratic discourse. The European Parliament has a unique opportunity to set the course in the right direction and to dismantle the walled gardens built by too-powerful players that hold too much control over individuals’ personal agency. We will continue to monitor and support the European Parliament in strengthening a human rights approach to platform governance in which users come first.
(Contribution by: Eliska Pirkova, Europe Policy Analyst and Global Freedom of Expression Lead at Access Now, Eva Simon, Senior Advocacy Officer at Civil Liberties Union for Europe and Chloé Berthélémy, Policy Advisor at EDRi)