The next important battle for our rights and freedoms in the digital sphere is looming on the horizon. While the public debate has recently focused on upload filters for alleged copyright infringement and online “terrorist” content, a planned legislative review will look more broadly at the rules for all types of illegal and “harmful” content.
This review aims to update the rules on how online services, such as social media platforms, should or should not delete or block illegal and “harmful” content. A reform might also bring changes to how online services could be held liable when such content is not taken down. The big question is: will the review of the E-Commerce Directive (ECD) open Pandora’s box and become one of this decade’s biggest threats to citizens’ rights and freedoms online – or will it be a chance to clarify and improve the current situation?
Christchurch, copyright and election manipulation
The recently adopted Copyright Directive and the draft European rules for the removal of terrorist content online initiated the creation of sector-specific rules for content removals.
Events like the Christchurch tragedy, potential disinformation threats during the European elections and hateful comments from increasingly radicalised right-wing extremists after the murder of a German pro-migrant politician contributed further to the debate surrounding illegal and “harmful” online content.
These events led to mounting calls on online services to “do more” and to “take more responsibility” for what is being uploaded to their servers. Several countries have started discussions about adopting national rules. For instance, following the German example, France has just introduced a law against online hate, and the UK has published a controversial Online Harms White Paper.
E-Commerce Directive: what it is and why its reform is unavoidable
Adopted nearly 20 years ago, the E-Commerce Directive sets up liability exemptions for hosting companies for content that users share on their networks. Until very recently, these rules applied horizontally to all sorts of illegal content, including copyright infringements, hate speech and child abuse material. The current rules for take-downs and removals are therefore (indirectly) defined by the ECD.
While the Directive is not perfect and has created a few issues, mainly due to a lack of clarity, its safe harbour provisions encouraged the protection of users’ fundamental rights, in particular the freedoms of expression and information.
Since the adoption of the ECD, however, the landscape of services that might or might not fall under the liability exemptions has drastically changed. Notably, cloud services and social media platforms have become very important players, and some have gained significant market power. Currently, a small number of dominant platforms have a significant impact on individuals’ rights and freedoms, on our societies and on our democracies.
The nature of the internet has also vastly changed in the past 20 years, becoming increasingly participatory. As a result, the amount of user-generated content has grown exponentially too. At the same time, we are witnessing growing government pressure on companies to implement voluntary mechanisms against allegedly illegal or “harmful” content online. These two parallel developments have resulted in an increasing number of wrongful removals and blockings of legitimate speech.
In the past months, the Directorate-General for Communications Networks, Content and Technology (DG Connect) of the European Commission has already started exploring policy options for content moderation that will be presented to the incoming College of Commissioners. A reform of the ECD that attempts to harmonise liability exemptions and content moderation rules seems to have become unavoidable.
The upcoming reform can therefore be both a chance and a potential trap for policy-makers. On the one hand, it offers the opportunity to create legal certainty and introduce safeguards that will enable users to enjoy their rights and freedoms. On the other, the reform can be a trap if policy-makers embrace blunt one-size-fits-all measures that sidestep real solutions to societal issues and instead lead to massive collateral damage.
This article is part of a new series of blog posts that look at essential points to be taken into consideration in the upcoming discussions around content moderation and intermediary liability:
- #2 E-Commerce review: The need for evidence-based policy-making
- #3 E-Commerce review: Counter-productive effects and collateral damage
- #4 E-Commerce review: Safeguards for human rights