By Joe McNamee

In September 2017, the European Commission adopted its widely criticised “Communication on Illegal Online Content”. Now, due to political pressure and internal competition between various European Commission services, a new Commission proposal for a Recommendation on the same subject is already close to being shared with the Council for unanimous support, albeit not being legally binding. A leaked draft can be found here.

On the basis of no new analyses, no new data and no new pressing issues to be addressed, the leaked draft Recommendation seeks to fully privatise the task of deciding what is and is not acceptable online. The only protection for user rights such as freedom of expression is an unenforceable hope that certain “adequate safeguards” will be put in place voluntarily by the companies. The draft reminds readers – twice – that the providers have “contractual freedom”, meaning that any such safeguards will be purely optional.

One of the hopes is that the providers will be transparent about the amount of “illegal content and other content” that they delete. The Commission does not even suggest that its aspirational safeguards should be applied to the legal “other content” it mentions. There is literally nothing in law or practice that would require either EU Member States or the companies themselves to implement a single one of the safeguards listed.

The draft Recommendation highlights one type of content, “terrorist” material, to justify the chaotic proposals. Even though we already have a recent Terrorism Directive and a Europol Regulation dealing with the subject, the Commission seeks to defend its attack on freedom of expression, privacy and the rule of law by invoking the threat of terrorism. In reality, the repeated references to measures proposed to address copyright and “intellectual property rights” infringements give an indication of the real driving force behind such far-reaching measures.

Indeed, in relation to “terrorist content” (undefined, of course), the Commission explains that its proposals could be “complemented” “by certain recommendations”, which are not explained – although this may refer to the setting up of national internet referral units, whose value has yet to be demonstrated. This, one imagines, is motivated by the need to defend the Commission’s aggressive stance on copyright (mandatory upload filters) compared with its more relaxed approach to alleged terrorist content. Indeed, the probability that the removals will disproportionately target legal content is demonstrated by the numerous references to content being removed on the basis of the companies’ terms of service.

This shows how much the Commission prioritises (in line with the demands of the copyright lobby) removing the availability of content over investigating the content that is removed. In reality, Member States are very keen on NOT receiving reports of the content being deleted – as proven by the fact that, according to the Commission itself, no statistics are kept about any investigations resulting from reports generated by Europol’s “Internet Referral Unit” (IRU).

What is worse, the Commission’s draft includes only general references to respect for the Charter of Fundamental Rights of the European Union in relation to all of these “voluntary” measures (cf. Preamble, Paragraphs 14, 38 and 39, and Chapter 1, paragraph 1). On top of being very vague and unenforceable in practice, as the European Commission knows very well, the Charter only applies to measures implemented by the European Union and to Member States applying EU law, NOT to measures that are imposed “voluntarily” by private companies. Even then, the draft fails to mention one of the most relevant articles of the Charter in this context, Article 52.

The draft Recommendation makes limp references to safeguards, such as counter-notices (to which the provider should give “due consideration”, whatever this may mean). Even here, the Commission only suggests counter-notice procedures for content that is deleted on the basis of illegality, not for content removed under terms of service. In addition, alternative dispute settlement is given preference over court proceedings with little explanation. Surprisingly and positively, there is a provision on evaluating the implementation of the measures. However, this is limited to:

  • raw data on the amount and speed of content removal (with no consideration given to the difference between content removed on the basis of terms of service rather than the law);
  • the amount of content removed by upload filters (with no consideration given to whether the removals were justified or not);
  • safeguards implemented by either service providers or Member States (legal redress, transparency regarding removals of legal content, review processes for implemented measures, etc.).

The Commission proposes no measures to gather any data on the usefulness or possible counter-productive effects of any of these measures for the fight against illegal activity.

For what it is worth, here is Article 52.1 of the Charter of Fundamental Rights of the European Union, with the parts that are not respected by the Commission’s Recommendation, either in spirit or due to lack of data, annotated in brackets:

Any limitation on the exercise of the rights and freedoms recognised by this Charter must be provided for by law [the Recommendation is based entirely on “voluntary” measures] and respect the essence of those rights and freedoms [the Recommendation includes references to deletion of content that is legal and has no review processes to assess its impact on rights and freedoms]. Subject to the principle of proportionality, limitations may be made only if they are necessary [despite the existence of multiple such projects on EU and national levels, the Commission has diligently avoided collecting data that indicates, let alone proves, necessity] and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others [despite the existence of various initiatives like the EU Internet Forum, no data has been collected to indicate necessity and proportionality].
