
Terrorist Content Online: Is this the end?

On 10 December, the European Parliament and the German Presidency acting on behalf of the Council reached a provisional agreement on the Regulation addressing the dissemination of terrorist content online.

By EDRi · December 16, 2020

Designed by: Claggett Wilson/ Symphony of Terror

Two years after the release of the Commission’s proposal, the most problematic parts of the text have been successfully curbed thanks to a large coalition of civil society groups. As a result of this collective action by digital rights groups, journalists, free speech and rule of law organisations, we have been able to improve the worrying initial proposal. However, the agreed text is still only a softened version of that proposal, and it remains doubtful whether it would pass the legality test of the Court of Justice of the European Union.

Saving what could still be saved

Civil society groups and pro-fundamental rights MEPs in the Parliament rallied together during the last rounds of negotiations and succeeded in ensuring:

  • A clause that now excludes material disseminated for educational, journalistic, artistic or research purposes from the scope of the Regulation. Moreover, material disseminated for the purpose of preventing or countering terrorism shall not be considered terrorist content, nor shall content which represents an expression of polemic or controversial views in the course of public debate;

  • Hosting service providers are still obliged to delete content within one hour after receiving a removal order. However, some exemptions have been granted in extenuating circumstances. Smaller or non-profit platforms can invoke “technical and operational reasons” if they cannot comply with the order on time (e.g. no staff working on weekends);

  • The definitions in the Regulation finally match those of the EU Directive on combating terrorism, which is an improvement on the original, overly broad definition. Unfortunately, however, this means they inherit all the problematic terms of the Directive that allow for excessively wide criminalisation of speech.

The darker picture

Nonetheless, the Terrorist Content Online Regulation remains an unacceptable piece of legislation for people’s fundamental rights.

Voluntary upload filters

Although authorities can no longer impose the use of automated content filtering tools, platforms are highly encouraged throughout the text to make all efforts to delete terrorist content, including by relying on their terms of service. Given current content moderation practices, this would in effect involve content filtering. The problems posed by these context-blind, ill-suited technologies (see https://www.euractiv.com/section/digital/opinion/misguided-solution-to-terrorist-content-will-have-bad-consequences-for-our-rights/) remain unanswered and unaccounted for.

Few requirements for competent authorities

National authorities, designated by Member States and empowered to make decisions under the Regulation, won’t necessarily be independent judicial ones. In its recent statement, the International Commission of Jurists warned of the risks resulting from this lack of independence, “leading to excessive, arbitrary or discriminatory interference with the freedoms of expression, religion, assembly and association online as well as with rights to privacy and data protection of persons residing or present in EU Member States.” Likewise, there is a substantial risk that the exemption granted to journalists, artists and researchers would become meaningless in practice.

The text does specify that these authorities may neither seek nor take instructions from any other body while performing their tasks. How this will work in practice remains to be seen, as each national government and jurisdiction in the EU will probably differ in its approach.

Cross-border removal orders authorised

Any competent authority will have the power to order the deletion of content anywhere in the EU within one hour. This means it can extend its enforcement jurisdiction beyond its own territory without prior judicial approval of the lawfulness of the order or of its impact on rights in the affected jurisdictions. In light of the serious threats to the rule of law in certain Member States, the mutual trust that underpins European judicial cooperation might be seriously undermined.

Yet the text foresees a specific procedure whereby the competent authority of the Member State where the hosting service provider is established may, within 72 hours of receiving a copy of the removal order, check whether or not it “seriously or manifestly” breaches fundamental rights. The decision is binding, nullifying the order in case of breach and leading to the reinstatement of the content. It is uncertain whether the requirement that a breach be serious or manifest would cover disagreements over what constitutes terrorism, irony, or journalistic reporting. The concerned service provider and the user whose content is targeted may also explicitly request the Member State of establishment to carry out the check. This suggests that, in reality, the Member State of establishment is unlikely to scrutinise foreign removal orders on its own initiative, which is presumably why legislators expect service or content providers to trigger the check themselves.

There is also no protection against a platform’s own decision to remove the content on the basis of its terms of service, even after it has been reinstated. After all, if one competent authority believes the content is likely illegal, why would the platform risk leaving it online?

Open to be challenged in court?

With such far-reaching censorship powers given to authorities, the absence of strong oversight, and the existing national and EU-level jurisprudence on freedom of expression, it is hard to see how the future Regulation could stand in court and not be overturned. Notably, the power to censor content online within an hour, without prior judicial authorisation, might not be in line with the principles of necessity and proportionality enshrined in the Charter of Fundamental Rights. In the context of the Digital Services Act, a legislative package which aims at updating the EU rules governing online service providers’ responsibilities, the fate of the Terrorist Content Online Regulation will offer crucial lessons.

Next steps

The agreed text will now be finalised and remains to be adopted by the two co-legislators, the Council and the Parliament. The vote in the Parliament’s plenary will most likely take place in January.

(Contribution by Chloé Berthélémy, EDRi)