Today, on 8 April 2019, the European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE) adopted its Report on the proposed Regulation for moderation of terrorist content online.
Released by the European Commission in September 2018, the proposal was warmly welcomed in the Council of Member States, which rapidly concluded a political agreement a few months later. Stronger reservations were, however, expressed in the different Committees in charge of the file in the European Parliament, which led to substantial changes to the Commission’s original proposal.
The most critical points of the proposed Regulation for the protection of fundamental rights were taken on board by the LIBE Committee in its Report:
- The definitions of “terrorist content” and “hosting service providers” are clarified and brought in line with the counter-terrorism acquis. Exceptions are provided for educational, journalistic or research material, and LIBE has limited the scope of the Regulation to cover only hosting service providers that make content available to the public at the application layer, leaving out infrastructure providers, as well as cloud and messaging services.
- Amendments to the first instrument, removal orders, require that a single judicial or functionally independent administrative competent authority be appointed. Unfortunately, the one-hour time frame to respond to removal orders, which is simply not feasible for smaller service providers with limited capacities, was not changed by LIBE, despite the blatant lack of evidence supporting this deadline.
- The possibility for national authorities to refer content to service providers for deletion on the basis of their terms and conditions is now removed from the text. This is a major step forward because this instrument would amount to increased online policing by platforms and a circumvention of legal safeguards attached to removal orders in order to tackle content that is not illegal.
- The LIBE Committee also deleted the proactive measures obligation, which would have involved the use of automated tools such as upload filters. The Parliament is clearly reasserting the prohibition on obliging platforms to generally monitor the user-generated content they host on their services (Article 15 of the e-Commerce Directive).
- Lastly, the principles of the rule of law and the protection of fundamental rights are substantiated with additional transparency requirements falling on competent authorities and stronger redress mechanisms for both hosting service providers and content providers.
Overall, EDRi appreciates that the criticism from the European Parliament Committees on the Internal Market and Consumer Protection and on Culture and Education, three United Nations Special Rapporteurs, and a large number of civil society groups was heard by the LIBE Committee.
After the European Parliament elections in May 2019, and once a new EU Commission has been set up, the text will be subject to several rounds of trilogue negotiations between the Parliament, the Council and the Commission. These closed-door meetings aim to find a middle ground between the diverging positions of the three negotiators. Considering that the Council position did not depart much from the Commission’s proposal, there is a significant risk that the “damage control” conducted by the Parliament will be partly rolled back in the next phase of the policy-making process.
Terrorist Content Regulation: Document pool
FRA and EDPS: Terrorist Content Regulation requires improvement for fundamental rights (20.02.2019)
Terrorist Content Regulation – prior authorisation of all uploads? (21.11.2018)
EU’s flawed arguments on terrorist content give big tech more power (24.10.2018)
Press Release: EU Terrorism Regulation – an EU election tactic (12.09.2018)
(Contribution by Chloé Berthélémy, EDRi)