More responsibility to online platforms – but at what cost?
In an internal note published by Netzpolitik.org on 16 July 2019, the European Commission presents current problems around the regulation of digital services and proposes a revision of the current E-Commerce Directive. Such a revision would have a huge impact on fundamental rights and freedoms. This is why it’s crucial for the EU to get it right this time.
From a fundamental rights perspective, the internal note contains a few good proposals, a number of bad ones, and one pretty ugly one.
In its note, the Commission maintains that no online platform should be forced to actively monitor all user-uploaded content. As the Commission rightly says, this prohibition of a general monitoring obligation is a “foundational cornerstone” of global internet regulation. It has allowed the internet to become a place for everyone to enjoy the freedom of expression and communicate globally without having to go through online gatekeepers.
Unfortunately, the note is somewhat weak with regard to upload filters: the Commission merely says that transparency and accountability should be “considered” when algorithmic filters are used. It’s no secret though that filtering algorithms make too many mistakes – they do not understand context, political activism, or satire. Creating more transparency around the logic and data behind algorithmic decisions of big online platforms is certainly a good start. However, it isn’t enough to prevent fundamental rights violations and discrimination.
The Commission note recognises the need to re-assess whether and how different platform companies should be regulated differently. However, the Commission should bear in mind that not all so-called “hosting intermediaries” covered by its note are platforms similar to Facebook, Google, or Twitter. There are successful hosting intermediaries across Europe – such as the file sharing provider Tresorit or the hosting company Gandi.net – which host their customers’ content in a largely “content-agnostic” way.
Lastly, the Commission acknowledges that the internet has changed considerably since the adoption of the current E-Commerce Directive: a small number of US-based online platforms have developed into businesses with unprecedented market power. The Commission therefore proposes to examine “options to define a category of services on the basis of a large or significant market status (…) in order to impose supplementary conditions”. When doing so, the Commission must be careful to clearly define which services would fall into which category, in order to avoid collateral damage to other types of services, including those that have not yet been invented.
To guide its future policy initiatives, the Commission says it wants to analyse policy options for both illegal and potentially “harmful” but legal content. While the definition of what is illegal is decided as part of the democratic process in our societies, it is unclear which content should be considered “harmful” and who makes that call. Moreover, the term “harmful” lacks a legal definition, is vague and its meaning often varies depending on the context, time, and people involved. The term should therefore not form the basis for lawful restrictions on freedom of expression under European human rights law.
The Commission acknowledges that when platform companies are pushed to take measures against potentially illegal and harmful content, their balancing of interests incentivises them to over-block legal speech and monitor people’s communications in order to avoid legal liability for user content. At the same time, the note proposes that harmful content should best be dealt with through voluntary codes of conduct, which shifts the censorship burden onto the platform companies. Companies’ terms of service are a convenient vehicle for removing legal content: they are vague, and their redress mechanisms are often ineffective.
Judging by the experience of the EU’s Code of Conduct on Hate Speech and the Code of Practice on Disinformation, this approach pushes platform companies to measure their success only by the number of deleted accounts or removed pieces of content, and by how speedily those deletions are carried out. It does not, however, improve legal certainty for users, nor does it provide for proper review and counter-notice mechanisms, or allow for investigations into whether the removed material was even illegal.
The leaked Commission note claims that recent sector-specific content regulation laws, such as the disastrous Copyright Directive or the proposed Terrorist Content Regulation, have left “most of” the current E-Commerce Directive unaffected. This is euphemistic at the very least. Under these pieces of legislation, all online platforms are required to pro-actively monitor and search for certain types of content to prevent their upload, which makes them “active” under current case law and should flush their liability exemption down the toilet. This is not changed by the Copyright Directive’s claim on paper that it shall not affect the E-Commerce Directive’s liability rules.
But the EU Commission turning a blind eye to this obvious legal inconsistency isn’t the only ugly thing in there. The question that remains unanswered is: how can the Commission save the current liability exemption for the sake of internet users and their fundamental rights, while also making it compatible with the hair-raising provisions of the Copyright Directive? It looks almost as if everybody secretly hopes that by the time the new Digital Services Act comes into force, sectoral laws such as the Copyright Directive will have been declared invalid by the European Court of Justice.
While such a turn of events would certainly be welcome, in the meantime the Commission should approach this issue transparently, and discuss with civil society and other stakeholders how the liability exemption can be salvaged and the negative impact of the sectoral laws contained.
How to move ahead with an upcoming review of the E-Commerce Directive? Here are our recommendations (that are also explained in more detail in our blog post series on liability and content moderation):
- Before reviewing the E-Commerce Directive, policymakers should answer the following questions: What are the problems that the Digital Services Act should address? Is there a clear understanding of the nature, size, and evolution of those problems? And what does scientific evidence tell us about which solutions could help us solve those problems?
- The Commission should analyse and mitigate any unwanted negative side effects of the proposals in the planned Digital Services Act, to avoid treating problems only superficially while doing immense damage to fundamental rights such as the freedom of expression of millions of people.
- The Commission should strictly limit the scope of the Digital Services Act to illegal content. It would be wise not to venture into the slippery territory of potentially harmful but legal content. Instead, the Commission should follow its own 2016 Communication on platforms.
- Policymakers should seize this unique opportunity to put in place fundamental rights safeguards, due process guarantees, as well as a binding notice-and-action regime. That way, the EU could take the global lead by setting the right standards for moderating online content while protecting fundamental rights.
Leaked document: EU Commission mulls new law to regulate online platforms (16.07.2019)
EU Commission’s leaked internal note on revision of the current E-Commerce Directive (16.07.2019)
E-Commerce review: Opening Pandora’s box? (20.06.2019)
E-Commerce review: Technology is the solution. What is the problem? (11.07.2019)