E-Commerce review: Safeguarding human rights when moderating online content

By EDRi · September 4, 2019

This is the fourth and last blog post in our series on Europe’s future rules for intermediary liability and content moderation. You can read the introduction here.

In our previous blog posts on the upcoming E-Commerce review, we discussed examples of what can go wrong with online content regulation. But let’s imagine for a moment that all goes well: the new legislation is based on actual evidence, a workable liability exemption for hosting companies is maintained, and potential negative side effects are mitigated.

Even then, policymakers will need to put sufficient safeguards in place to avoid the wrongful removal of legitimate speech, art and knowledge.

A workable notice-and-action system

The current E-Commerce Directive and the absence of workable notice-and-action rules have created a Wild West of opaque content moderation and removal practices. Notice-and-action rules would establish a coherent system for people to flag illegal content or activities on platforms like Facebook, YouTube and Twitter (the “notice”), and an obligation for the platform companies to respond (the “action”). Which action to take should of course depend on the type of allegedly illegal content or activity concerned, as the sketch below illustrates. Compliance with those rules would need to be mandatory for platform companies.
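
To make the mechanics more concrete, here is a minimal sketch, in Python, of the dispatch step of such a system. The type names, offence categories and returned actions are all hypothetical illustrations for the sake of the example, not a description of any platform’s actual process or of any proposed legal text.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum, auto


class AllegedOffence(Enum):
    """Hypothetical categories of allegedly illegal content."""
    DEFAMATION = auto()
    COPYRIGHT_INFRINGEMENT = auto()
    CHILD_ABUSE_MATERIAL = auto()


@dataclass
class Notice:
    """A user's flag against a piece of hosted content (the 'notice')."""
    content_id: str
    alleged_offence: AllegedOffence
    reason: str
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


def dispatch(notice: Notice) -> str:
    """Pick an 'action' appropriate to the category of the allegation."""
    if notice.alleged_offence is AllegedOffence.CHILD_ABUSE_MATERIAL:
        # Very serious criminal offences go straight to law enforcement.
        return "report_to_law_enforcement"
    # For civil-law claims, the uploader is notified before any action
    # is taken against the content.
    return "notify_uploader_and_open_counter_notice_window"
```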

Although many academics and NGOs have written extensively about the safeguards that should be built into any notice-and-action system, there are currently no EU-wide minimum procedural requirements that hosting companies are obliged to follow. This is not good for reputable businesses, and certainly not good for people.

Some examples of human rights safeguards

  • There should be an obligation for hosting companies to notify users when their content has been flagged by someone, ideally before any action against the content is taken, with exceptions where law enforcement needs to investigate criminal offences.
  • Hosting companies should be obliged to report certain very serious criminal offences (such as the distribution of child abuse material) to law enforcement.
  • Users whose content has been flagged should be able to send a counter-notice to defend themselves against wrongful removals.
  • Users need a general right to appeal content moderation decisions taken by hosting companies. They should also be informed about their right to effective judicial redress if their appeal has been unsuccessful.
  • Hosting companies should be bound to clearly defined procedural time frames for reacting to notices and counter-notices.
  • There should be a minimum standard defining what a valid notice must look like and what it needs to contain; abusive notices should be discouraged by effective administrative fines.
  • Wherever possible, temporary suspension of allegedly illegal online content or activities should take precedence over definitive removal.
  • Transparency reports about removals, wrongful take-downs, hosting companies’ policies, processes and other practices that impact user rights should be required.
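
Purely as an illustration of how these safeguards could interlock, the Python sketch below models the lifecycle of a single notice as a small state machine. All names, states, required fields and time frames here are hypothetical assumptions for the sake of the example, not terms drawn from any existing or proposed legal text.

```python
from datetime import timedelta
from enum import Enum, auto

# Hypothetical procedural time frames; a real statute would fix these.
PLATFORM_REACTION_DEADLINE = timedelta(days=7)
COUNTER_NOTICE_WINDOW = timedelta(days=14)

# Minimum standard for a valid notice (illustrative field names).
REQUIRED_NOTICE_FIELDS = {"content_id", "legal_ground", "notifier_identity"}


class NoticeState(Enum):
    RECEIVED = auto()
    UPLOADER_NOTIFIED = auto()      # uploader is told their content was flagged
    TEMPORARILY_SUSPENDED = auto()  # suspension preferred over definitive removal
    COUNTER_NOTICE_FILED = auto()   # uploader contests the allegation
    REMOVED = auto()
    REINSTATED = auto()
    UNDER_APPEAL = auto()           # internal appeal, before judicial redress


# The allowed transitions encode the safeguards: no definitive removal
# before the uploader has been notified and had a chance to respond
# (law-enforcement investigation cases excepted).
TRANSITIONS = {
    NoticeState.RECEIVED: {NoticeState.UPLOADER_NOTIFIED},
    NoticeState.UPLOADER_NOTIFIED: {NoticeState.TEMPORARILY_SUSPENDED,
                                    NoticeState.COUNTER_NOTICE_FILED},
    NoticeState.TEMPORARILY_SUSPENDED: {NoticeState.COUNTER_NOTICE_FILED,
                                        NoticeState.REMOVED},
    NoticeState.COUNTER_NOTICE_FILED: {NoticeState.REINSTATED,
                                       NoticeState.REMOVED},
    NoticeState.REMOVED: {NoticeState.UNDER_APPEAL},
    NoticeState.UNDER_APPEAL: {NoticeState.REINSTATED, NoticeState.REMOVED},
}


def is_valid_notice(notice: dict) -> bool:
    """Reject notices missing the minimum required fields; abusive or
    incomplete notices should not trigger any action at all."""
    return REQUIRED_NOTICE_FIELDS.issubset(notice)
```

In such a model, a definitive removal is only reachable after the uploader has been notified and given the chance to file a counter-notice, which is exactly the ordering the safeguards above call for.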

While the list above is not exhaustive, it provides a baseline for a human rights-respecting notice-and-action system that should be implemented as part of an E-Commerce Directive review. Designing such a system is far from simple, and the opposing commercial and political interests involved will push hard to have it their way.

However, just as with data protection and the General Data Protection Regulation (GDPR), getting the content moderation conundrum right gives Europe a unique opportunity to set global human rights standards for a thriving online space – a space where everybody can feel safe, express themselves freely, and benefit from unhindered open access to the vast amount of knowledge and opportunity that is the internet.

E-Commerce Directive
https://ec.europa.eu/digital-single-market/en/e-commerce-directive

Safeguards for Freedom of Expression in the Era of Online Gatekeeping (11.09.2018)
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247682

Notice-and-Fair-Balance: How to Reach a Compromise between Fundamental Rights in European Intermediary Liability
https://www.ivir.nl/publicaties/download/Notice_and_Fair_Balance.pdf

Manila Principles on Intermediary Liability
https://www.manilaprinciples.org/

The Santa Clara Principles On Transparency and Accountability in Content Moderation
https://santaclaraprinciples.org/

This article is the fourth and last post in our blog post series on Europe’s future rules for intermediary liability and content moderation. Following the introductory post, the series presents the three main points that should be taken into account in an update of the E-Commerce Directive:

  1. E-Commerce review: Opening Pandora’s box?
  2. Technology is the solution. What is the problem?
  3. Mitigating collateral damage and counter-productive effects
  4. Safeguarding human rights when moderating online content