
Smashing the law without breaking it: A Commission guide

By EDRi · February 7, 2018

How to create a general monitoring obligation without creating a general monitoring obligation? That is the question the European Commission has been trying to answer with Article 13 of its “Proposal for a Directive on Copyright in the Digital Single Market”. The Article aims to solve the issue of a so-called “value gap”: the revenue that user-uploaded content allegedly brings to online platforms, which supposedly gives them a dominant position in negotiations with creators.


Together with 56 other civil society organisations, EDRi sent a letter to the European Commission in October 2017. We have just received a response – hilariously, we were told that the reply had been ready for a long time but was “stuck in the machine”. Less hilarious is the legal sophistry the response contains.

The (il)legality of general monitoring

A general monitoring obligation would be contrary to the E-Commerce Directive, European Court of Justice case law, and any reasonable reading of the Charter of Fundamental Rights of the European Union.
The E-Commerce Directive provides for conditional exemptions from liability for online platforms. In practice, online platforms cannot be held responsible for content uploaded by their users, as long as they act swiftly (“expeditiously”) when copyright-infringing content (or any other form of illegal material) is flagged by rightsholders.

The illegality of general monitoring has been established by the Court of Justice of the European Union (CJEU) in the Netlog/Sabam and Scarlet/Sabam cases. In the first case, between a collecting society and a social network, the Court ruled that hosting service providers cannot be placed under a legal obligation to install and operate, at their own expense and for an unlimited period, systems that filter content uploaded by users in an indiscriminate and preventative manner. In the second case, between the same collecting society and an internet access provider (ISP), the Court ruled that ISPs cannot be obliged to use filtering systems capable of identifying works subject to copyright. Last but not least, a general monitoring obligation would be contrary to the fundamental rights to the protection of personal data and to freedom of expression and information.

Making it mandatory but not obligatory

To circumvent the abovementioned legal constraints, the Commission seems to have chosen the option of coercing online platforms into “voluntarily” operating “censorship machines”. Thanks to careful wording in Article 13 and its accompanying Recitals 38 and 39, there is no explicit general monitoring obligation, but rather a strong suggestion that all online platforms could (as the only option suggested) choose to use technologies already deployed by the biggest platforms, “such as content recognition technologies”.

Article 13 has come under heavy criticism from a wide range of stakeholders – civil society, startups, the internet industry, and academics. Some Member States have questioned its legality and asked the Council’s legal service for clarification. Two of the four Committees of the European Parliament voted against it in their Opinions.

The Commission’s reply to the civil society letter mentioned above is consistent with its line that upload filtering is mandatory but not obligatory. It stresses that the proposal respects the Charter of Fundamental Rights and that “the Commission committed itself to maintaining a balanced and predictable liability regime for online platforms”, and it denies establishing a general monitoring obligation. Yet it also reiterates the “obligation” for online platforms to “take measures, such as content recognition technologies, aimed at preventing the upload of infringing content”.

General monitoring is neither “general” nor “monitoring”

The European Commission considers, in line with the US National Security Agency (NSA) argumentation that it opposes in other contexts, that searching through everything that people upload to the internet is not “monitoring”. Why? Because it is only a computer programme looking for copyrighted files. It will only delete your communications and breach your rights to parody, quotation, access to knowledge and freedom of expression – that’s all.

And it is not general monitoring because, after copyright owners have given service providers millions of file identifiers, those providers will merely be doing millions of “specific” searches.
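To see why those “specific” searches still amount to scanning everything, here is a minimal, purely illustrative Python sketch of an upload filter. The hash-based matching is our own simplifying assumption – real content recognition systems use perceptual fingerprinting rather than exact hashes – but the structure is the same: every single upload must be inspected before any “specific” comparison can take place.

import hashlib

# "Millions of file identifiers" supplied by rightsholders (here: just one,
# the SHA-256 of the bytes b"test", purely for illustration).
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    # Compute an identifier for an uploaded file (plain SHA-256 for simplicity).
    return hashlib.sha256(data).hexdigest()

def filter_upload(data: bytes) -> bool:
    # Return True if the upload may be published, False if it is blocked.
    # Note that EVERY upload has to be fingerprinted and compared against the
    # identifier list - the "specific" searches cannot run without first
    # looking at all user content.
    return fingerprint(data) not in BLOCKLIST

print(filter_upload(b"test"))          # False: matches a flagged identifier
print(filter_upload(b"holiday.jpg"))   # True: not on the blocklist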

So, when the European Parliament, the European Commission and the Council agreed in 2000, after two years of discussion, to prohibit “general monitoring” of the internet, they were only referring to cases where the entity doing the monitoring did not know what it was looking for. Because the NSA doesn’t do monitoring, and neither will the censorship machine. Right?

Reply by the Commission (01.02.2018)
https://edri.org/files/copyright/20180201-EC_ReplyOpenLetter-Art13.pdf

Civil society calls for the deletion of the #censorshipmachine (16.10.2017)
https://edri.org/civil-society-calls-for-the-deletion-of-the-censorshipmachine/

Copyright reform: document pool (12.12.2016)
https://edri.org/copyright-reform-document-pool/

Commission claims that general monitoring is not general monitoring (10.01.2018)
https://edri.org/commission-claims-that-general-monitoring-is-not-general-monitoring/

(Contribution by Anne-Morgane Devriendt, EDRi intern)

