New copyright compromise text: Filter or be filtered
Discussions on the censorship machine continue at the level of the Council of the European Union.
After the “compromise” text that Member of the European Parliament (MEP) Axel Voss presented to the European Parliament Committee on Legal Affairs (JURI), the Bulgarian Presidency of the Council held a meeting after which they published new text on Article 13 of the copyright Directive.
The new Presidency text follows the same path towards a censorship machine as the previous “compromise” text. First, it states that an online content sharing service provider “is capable” of performing acts of “communication to the public” or “making available” − two specific copyright concepts meaning, in essence, publishing copyrighted works − and therefore requires authorisation for this activity. This definition encompasses any service that gives access to copyright protected works “or other subject matter” uploaded by its users. These services would be liable for unlawfully “communicating or making available to the public” when they do not “prevent the availability” of unauthorised works, which technically translates into an obligation to install upload filters. They could also be held liable for failing to remove unauthorised content once they are notified of its existence on their platform, which is not so far removed from the current legal framework and brings nothing new to the discussion. If you are one of the many online platforms that would fall under that definition of liable companies, the slogan for Article 13 is still “Filter or be filtered”.
The Presidency proposal includes, once again, a mention of appropriateness and proportionality in light of the different characteristics of the services. However, this provision will be ineffective and unenforceable in practice. The Copyright Directive will be implemented by Member States, which are subject to EU law, including the Charter of Fundamental Rights as primary law of the European Union. Yet the proposed Directive still leaves it to companies to decide to what extent and how to “prevent the availability” of unauthorised content. Companies are not bound by the Charter, and therefore it will be up to them to decide which measures to put in place. Furthermore, it is also theoretically possible that, despite the evidence against the efficiency of filters and against their necessity and proportionality, Member State law implementing the Copyright Directive may deem the use of filters necessary and proportionate. In that case, it will be up to citizens to challenge their national laws in the courts, provided they have the means to do so and the years it will take for the Court of Justice to reach a final decision on the definitive meaning of this chaotic text.
Paragraph 6 of the Presidency text addresses the question of how individuals can enjoy the right to use copyrighted works in ways that do not harm the normal exploitation of the work: these are the exceptions and limitations to the general rules on copyright, which are both optional and different in every EU Member State. In order to ensure that these freedoms can still be enjoyed in a post-filter copyright scenario, a complaint mechanism is proposed. The efficiency of this mechanism will be limited, or it will simply not be available. Every time an uploaded meme or home video is covered by a (dis-harmonised) exception in the uploader's country, it will be left to the individual uploading the content to challenge the filter, and only if the internet company chooses not to categorise the deletion as a terms of service violation. Good luck with that!
Recital 37a of the Presidency text aims to clarify which services will not be covered. It excludes internet access providers, cloud services such as cyberlockers, online marketplaces, and scientific or educational repositories. In practice, this is meant to leave services such as GitHub, Amazon and Wikimedia (among others) outside the scope of the Directive. It is less clear how this will work. For example, Wikimedia is both a company and a foundation, so it will be difficult for it to assert its not-for-profit status in order to avoid having to implement upload filters on Wikipedia. Furthermore, recital 37b states that the final decision about which services fall under these obligations to monitor content requires a case-by-case analysis (it is not clear by whom), taking into consideration the number of protected files available on the service. Hardly a level of legal certainty that is in line with the notion of “better regulation”.
Having set up a framework in which all sorts of online content sharing services will be covered, the text obliges them to engage in mandatory licensing − mandatory, that is, whenever a service is de facto the same as a service that is already licensed. However, it is less clear how this will avoid affecting other lawful online content sharing services that could end up within the same scope, or how many new services will simply never be created in Europe because of the uncertainty generated by this Article 13. Why would anyone choose to set up a service in Europe, faced with such a chaotic legal framework, rather than seeking refuge in a part of the world where the laws actually make some form of sense?
General monitoring of communications to block “undesirable” content (21.02.2018)
European internet filter will destroy your freedom of expression: Stop it now! (27.02.2018)
Final Copyright “compromise”: Upload filters for everyone but Google & Co (23.02.2018)
In the making: The largest internet filter Europe has ever seen (22.02.2018)
(Contribution by Diego Naranjo, EDRi)