New Estonian Presidency “compromise” creates copyright chaos
Following the launch of the controversial proposed Copyright Directive in September 2016, the European Parliament and the Member States (gathered in the Council of the European Union) are now developing their positions.
The Council is working under its Estonian Presidency, which has produced a new “compromise” proposal. After the Estonian Presidency of the Council proposed massive internet filtering as a “compromise” among the different views of the Member States a few weeks ago, the Estonians proposed yet another “compromise” on 30 October 2017. Regarding Article 13, the Estonian Presidency seems to have rewritten the extreme position of France, Spain and Portugal and simply called this a “compromise”.
This time their “compromise” is even more extreme than the previous one. For example, Article 13.1 of the compromise reads as follows (for context, it is worth reading the entire tortuous text proposed by the Estonians: Articles 13.1, 13.1a, 13.1aa, 13.1ab, 13.1b, 13.1c, 13.2, 13.2a, 13.3, 13.4 and 13.5, and recitals 37a to 37f):
“Member States shall provide that an information society service provider whose main or one of the main purposes (sic) is to store and give access to the public to copyright protected works or other protected subject matter uploaded by its users is performing an act of communication to the public or an act making available to the public within the meaning of Article 3(1) and (2) of Directive 2001/29/EC when it intervenes in full knowledge of the consequences of its action to give the public access to those copyright protected works or other protected subject matter by organising these works or other subject matter with the aim of obtaining profit from their use.”
How many main purposes can a service have?
Virtually all content that is uploaded to the internet can be copyright-protected.
This means that the provider is directly using the content from the perspective of copyright law – even when it has no knowledge of it. Providers would therefore have significantly more liability for copyright-protected works than for terrorist or child abuse content.
What might “intervene” mean? “Full knowledge of” what “consequences”, and for whom? None of the 115 words of this sentence mentions copyright infringements or “copyright-infringing work”.
What does “organising these works” mean? The explanatory recital says that making information “easily findable” would be covered. Bearing in mind we are talking about the internet, what might “easily” or “findable” mean for any of the 27 or 28 judicial systems that will have to interpret this?
To understand how the Estonian “compromise” on Article 13 and the associated recitals (i.e. legally-binding explanatory notes) attacks fundamental rights and European businesses, we divide it into three issues: service provider liability (1), upload filtering (2) and possible redress (3).
1) Service provider liability & responsibility
In essence, building on the strategy of the European Commission, the Estonian text undermines the legal certainty of internet companies, practically forcing them to filter, block and delete any content that might create a risk.
According to the Estonian text, a company provides a “communication to the public” if it is hosting copyrighted works (i.e. virtually everything that can be uploaded to the internet) and if it intervenes in the content, for example, by presenting it in a certain manner, categorising it or making it “findable”.
The “compromise” would leave all legislation on service provider (non-)liability in force, but reinterprets it in a way that makes it virtually impossible to apply it in practice, as a whole industry would be subjected to contradictory primary (Copyright Directive) and secondary (E-Commerce Directive) responsibility regimes. For instance, the compromise invents the notion of hosting providers that build services based on the copyright status of the content that is uploaded (“whose main or one of the main purposes is to provide access to copyright protected content uploaded by their users are engaging into acts of communication to the public and making available to the public”). This can only mean that the proposal covers either a very small minority of hosting providers (those designed around the copyright status of the uploaded content) or almost all companies that provide a hosting service online (those that allow copyrightable content to be uploaded).
Furthermore, the Estonians add another criterion to establish if a service is “communicating to the public”, namely, if it intervenes in the full knowledge of unspecified “consequences” for unspecified third parties. As an example, this would cover companies that either index content, present it “in a certain manner”, categorise content or make it findable.
Fundamental rights? Forget them!
Having created huge incentives to block, delete and filter content, the text says that the defensive measures to be implemented by service providers should respect the fundamental rights to freedom of expression and information. The only minor problem is that this obligation is purely hypothetical – companies have the right to manage their platforms how they wish. The EU Charter and national constitutions are binding on states, not on companies. Worried that this might give citizens too many rights, the proposal is clear that it only covers companies described in Article 13.1a, namely those that provide “access to the public to a significant amount of works or other protected subject-matter uploaded by their users who do not hold the relevant rights in the content uploaded”. Companies covered by the all-encompassing Article 13.1 (quoted above) are explicitly not covered by this illusory safeguard.
In an effort to create at least some form of balance, licensing arrangements paid for by the service providers would cover uploads by users (assuming that it was possible for European companies to survive after investing in multiple filters and paying multiple licence fees to multiple rightsholder groups). However, this part of the text would not (for no obvious reason) cover uploads done by individuals “in their professional capacity”. So, uploading a copyrighted picture on Sunday evening as a private individual might be okay, but uploading the same picture on a professional blog would not be – even though the hosting provider would have paid for it to be communicated to the public. How would the hosting provider know the difference between two identical uploads? The text does not say what would/could/should happen in cases like this.
In addition, in an effort to appear balanced, the Estonians suggest that services on which content is “mainly” uploaded by the rightsholders themselves would not be covered by these obligations. The only small problem is that the service provider has no way of knowing whether this is the case, or what “mainly” might mean – even with every corporate surveillance mechanism currently available. Nor can it know whether a service that was covered by this description yesterday is still covered today.
2) Upload filtering
Having destroyed legal certainty for even the smallest provider in Article 13.1, the Estonian proposal then moves on to demand, in Article 13.1b, that service providers filter uploads. Companies would need to put in place a censorship machine if they give access to a “significant amount of copyright protected content”. They fall outside the all-encompassing definitions of Article 13.1 because (apparently regardless of the content being uploaded) such services “thereby compete in the online content services market”. It would come as a surprise to many that there is a single online content services market, rather than numerous audio, image, graphic, audiovisual and text-based markets. However, to be fair to the Estonians, they have, in recital 38f, publicly admitted for the first time that multiple different filters would need to be acquired, with multiple and changing levels of effectiveness and, therefore, different levels of proportionality.
The Estonian Presidency proposes that service providers outside the EU, even if they are covered by liability protections, should invest in upload filtering. Services like GitHub, Imgur and 9GAG would have three options: a) laugh energetically, b) pay for multiple upload filters because the EU said so, even though it has no way of enforcing this outside its jurisdiction, or c) block access for people connecting from the EU. In keeping with the clarity of the rest of the text, it is not clear what is meant by EU citizens “using” such services – in particular, whether this covers browsing the services or just uploading to them.
The fact that this provision was included – presumably to deflect the legitimate criticism that the proposal is an extraordinary act of self-harm against European businesses – says a lot. The fact that this totally (legally) unenforceable provision was included shows just how divorced from reality this debate has become. It is glaringly obvious that no foreign service provider would dream of respecting such an absurd foreign obligation – an obligation that is, in any case, in breach of the case law of the Court of Justice of the European Union (CJEU).
One imagines that, in line with the Commission’s recent “Communication on Illegal Content Online”, the 27 EU Member States would seek to impose the obligations by using “voluntary” attacks on those platforms by advertising networks, internet access providers and other online companies.
More worryingly, upload filtering will lead to huge amounts of legal content being restricted – which is a direct contravention of the Charter of Fundamental Rights of the EU. The Estonian hope appears to be that, as the restriction is imposed through coercion and not explicitly written in law, it is out of reach of the CJEU and Member States’ Constitutional courts.
3) Inadequate redress for incoherent law
Finally, hosting services, fearful of liability and pushed into deleting legal content (including content eligible to benefit from copyright exceptions and limitations), will have to organise a redress mechanism. This could be good news, except that nothing prevents hosting companies from removing content on the basis of their terms of service rather than the law, in order to avoid this obligation. Hosting providers would implement this redress mechanism even though the decision on what to remove would be based on what they are told by rightsholders, not on their own assessment nor on an assessment of the law. Furthermore, they are supposed to implement this redress mechanism even though the Council legal service assumes that no personal information will be processed by the filter. If the legal service is correct, then the hosting company will have no record of the filtering and could not implement the redress mechanism.
It is expected that the Estonian Presidency will make a major push to adopt something resembling this text as the official position of the Council of the European Union between now and the end of 2017.
This will be followed by a vote in the European Parliament in January, to adopt the Parliament’s negotiating position.
There will then follow some months of negotiation between the Parliament and Council, before final votes in both institutions, probably in July or September 2018.
Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market (30.10.2017)
Leak: Three EU countries join forces for restrictions & copyright chaos (26.10.2017)
Estonia loves digital – why is it supporting the #censorshipmachine? (07.09.2017)
Leaked document: EU Presidency calls for massive internet filtering (06.09.2017)
Deconstructing the Article 13 of the Copyright proposal of the European Commission, Revision 2