
Online content moderation: Where does the Commission stand?

By EDRi · December 18, 2019

The informal negotiations (trilogues) between the European Parliament, the Council of the European Union and the European Commission on the Terrorist Content Regulation (TCO, aka “TERREG”) are progressing. While users’ safeguards and rights-protective measures remain the Parliament’s red lines, the Commission is pressing the co-legislators to adopt what was a pre-elections public relations exercise rather than an urgently needed piece of legislation. Meanwhile, the very same European Commission has just delivered a detailed opinion to France criticising its currently debated hate speech law (the “Avia law”). The contrast between the Commission’s positions supporting certain measures in the Terrorist Content Regulation and opposing similar ones in the French Avia law is so striking that it is difficult to believe they come from the same institution.



Scope of targeted internet companies

In its letter to the French government, the Commission mentions that “it is not certain that all online platforms in the scope of the notified project […] pose a serious and grave risk” in light of the objective of fighting hate speech online. The Commission also notes that the proportionality of the envisaged measures is doubtful and that a clear impact assessment is missing, especially regarding small and medium-sized enterprises (SMEs) established in other EU Member States.

These considerations for proportionate and targeted legislative measures have completely vanished in the context of the Terrorist Content Regulation. The definition of service providers in scope set out in the Commission’s draft Regulation is too broad and covers an extremely large, diverse and unpredictable range of entities, notably including even small communications platforms with a very limited number of users. The Commission asserts that terrorist content is currently being disseminated over smaller sites, and the Regulation therefore obliges them “to take responsibility in this area”.

What justifies these two very different approaches to a similar problem? That is not clear. On the one hand, the Commission denounces the lack of any evaluation of the impact that an obligation to adopt measures preventing the redistribution of illegal content (“re-upload filters”) in the Avia law would have on European SMEs. On the other hand, its own impact assessment for the Terrorist Content Regulation contains no analysis of the costs of setting up hash databases for the automated removal of content, and the Commission still pushes for such “re-upload filters” in the trilogues.

Expected reaction time frame for companies

The European Commission criticises the 24-hour deadline the French proposal introduces for companies to react to illegal content notifications. The Commission holds that “any time limit set during which online platforms are required to act following notification of the presence of illegal content must also allow for flexibility in certain justified cases, for example where the nature of the content requires a more substantial assessment of its context that could not reasonably be made within the time limit set”. Considering the high fines in cases of non-compliance, the Commission believes that such a deadline could place a disproportionate burden on companies and lead to excessive deletion of content, thus undermining freedom of expression.

A year ago, in the original TCO proposal, the Commission strongly supported the deletion of terrorist content online within one hour of receipt of a removal order. No exception was foreseen for small companies, despite their limited resources to react within such a short time frame, leaving them with no other choice than to pay the fines or, if they have the means, to apply automated processing. Although removal orders do not technically require the platform to review the notified content within one hour, the Commission’s proposal allows any competent authority to issue such orders, even if that authority is not independent.

Terrorist content is as context-sensitive as hate speech

In the letter sent to the French government on the Avia law, the Commission argues that the French proposal could lead to a breach of Article 15(1) of the E-Commerce Directive, as it would risk forcing online platforms to engage in an active search of the content they host in order to comply with the obligation to prevent the re-upload of already identified illegal hate speech. Again, the Commission regrets that the French authorities did not provide sufficient evidence that this measure is proportionate and necessary in view of its impact on fundamental rights, including the rights to privacy and data protection.

At the same time, in the TCO Regulation, the Commission (and the Council) seemed uncompromising on the obligation for platforms to use “proactive measures” (aka upload filters). As in the discussions on the Copyright Directive, EDRi maintains strong reservations against the mandatory use of upload filters, since they are error-prone, invasive and likely to produce “false positives”, which means nothing less than a profound danger for freedom of expression. For example, filters currently used voluntarily by big platforms have taken down documentation of human rights violations and material raising awareness against radicalisation.

The Commission’s change of position on online content in its opinion on the Avia law sets a positive precedent for upcoming legislation, including the Digital Services Act (DSA). We hope that the brand new Commission will keep a similarly sensible approach in future proposals.

Recommendations for the European Parliament’s Draft Report on the Regulation on preventing the dissemination of terrorist content online (December 2018)
https://edri.org/files/counterterrorism/20190108_EDRipositionpaper_TERREG.pdf

Trilogues on terrorist content: Upload or re-upload filters? Eachy peachy. (17.10.2019)
https://edri.org/trilogues-on-terrorist-content-upload-or-re-upload-filters-eachy-peachy/

EU’s flawed arguments on terrorist content give big tech more power (24.10.2018)
https://edri.org/eus-flawed-arguments-on-terrorist-content-give-big-tech-more-power/

How security policy hijacks the Digital Single Market (02.10.2019)
https://edri.org/how-security-policy-hijacks-the-digital-single-market/

(Contribution by Chloé Berthélémy, EDRi)