Three steps to end freedom of expression

By EDRi · July 12, 2016

Our right to freedom of expression is laid down in law by the EU Treaties. To ensure democracy and accountability, this fundamental human right may not be restricted unless the restriction is necessary, genuinely meets an objective of general interest and is provided for by law.

This guarantee of our fundamental rights is explicitly protected by the EU Treaties (the Charter of Fundamental Rights of the European Union) and by the European Convention on Human Rights. This right is now being actively removed by the European Commission.

Step 1 – Put companies in charge

The European Commission pushed US online companies (Google, Twitter, Facebook and Microsoft) to agree to a “Code of Conduct”. In the Code of Conduct, it is agreed that:

  • The companies would take “the lead” in fighting illegal hate speech online;
  • Their terms of service would ban illegal hate speech; and
  • They would review complaints of illegal hate speech first against their terms of service, and only “where necessary” against the law.

In short – companies will ban what is already illegal, adding this ban to whatever is already prohibited by their terms of service. The law becomes irrelevant. The companies avoid responsibility, as they never have to accuse any content of being illegal, only of breaching their terms of service. And because nobody has been accused of doing anything illegal, the police will not have to worry about receiving reports that they would have to investigate.

Step 2 – Make sure the law enforcement authorities won’t be responsible for restricting free speech

The legal instrument that establishes the tasks of Europol, the European law enforcement agency, now includes a provision allowing it to request the deletion of online content that is explicitly NOT illegal.

One of Europol’s roles is to refer reports to online service providers, so that they can “voluntarily” check whether the online content is compatible with their terms and conditions. Europol avoids liability for deletions of content, because its reports are submitted for “voluntary consideration” by the companies. The companies rely on their vague terms of service to delete content, to avoid liability or public accusations of being unhelpful. As Europol Deputy Director Wilhelm van Gemert explained in an interview: “violent content is not banned by legal authorities but by the companies themselves”.

Step 3 – Give states the job of writing the terms of service of companies

In the draft Directive on audiovisual media services, the European Commission takes another huge and surprising step. It proposes giving EU Member States the job of ensuring that the “concepts” of “incitement to violence and hatred” and of content that is harmful to children are adequately defined and applied in the terms and conditions of video-sharing platform providers. This would mean that lawmakers could define the rules not only in law, but also in private companies’ terms of service. These rules can then be policed arbitrarily by the companies, in line with the ad hoc arrangements already agreed with Europol and in the hate speech code of conduct.

To add to the already impossibly unclear mishmash of law and terms of service, the draft Directive even explains that the regulation of “hate speech” should only be aligned with the law “to the appropriate extent” – an extent which is never explained or defined. Of course, this is so far “only” for video-sharing websites, and so far (in relation to video-sharing websites) EU Member States would not be allowed to go further than this, so there is nothing to worry about.

Step 4 is already in the early stages of development. Member States in the EU Council are having discussions about “voluntary” cooperation with internet companies to decrypt encrypted communications.

Conclusion:

It is quite clear that removal of material online is a restriction on fundamental rights. It is equally clear that the safeguards in the Charter of Fundamental Rights of the EU are being wilfully ignored:

EU Charter: Article 52.1:

Any limitation on the exercise of the rights and freedoms recognised by this Charter must be provided for by law and respect the essence of those rights and freedoms. Subject to the principle of proportionality, limitations may be made only if they are necessary and genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others.

“Code of Conduct”
http://ec.europa.eu/justice/fundamental-rights/files/hate_speech_code_of_conduct_en.pdf

(Contribution by Joe McNamee, EDRi)
