The worst examples of bigotry, ignorance, and hatred have become more visible in our public discourse in recent months and years. All reasonable people are appalled by willful ignorance and visceral hatred. We need to take the necessary steps to fight ignorance and hatred.
But we need to do so in a way that is effective and not counter-productive. Exacerbating the problem through blunt censorship that makes martyrs of bigots helps nobody.
We also need to defend our society. We need to defend our democracy. We need to defend the rule of law. In order to achieve any public policy goal, we need to know what it is we are trying to achieve and whether the means we are using are appropriate. We need to protect the core pillars of our society.
And we need to remember that all regions of the world agreed to include free speech in the Universal Declaration of Human Rights in 1948. Europe chose to anchor free speech in the European Convention on Human Rights in 1950. Crucially, the EU reaffirmed these rights in the Charter of Fundamental Rights, which became legally binding in 2009. Now, there are two crucial questions to be answered.
Does the Audiovisual Media Services Directive (AVMSD) approach respect the Charter of Fundamental Rights and the European Convention on Human Rights?
We need to look at the substance and the law. We need to recognise that when we say we support free speech, we don’t mean that we only support uncontroversial speech. Free speech also covers expression that may be, in the words of the United Nations (UN) Human Rights Committee, “deeply offensive”. In the words of the European Court of Human Rights, freedom of expression “is applicable not only to ‘information’ or ‘ideas’ that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the State or any sector of the population”.
When there are commercial interests at stake in industry codes of conduct, on what basis can we believe – on what basis could we even hope – that such information or ideas will remain protected? On what basis can we believe that terms of service are better suited than democratically drafted laws to judge speech?
In the absence of meaningful reporting obligations, and bearing in mind the commercial and political pressures on companies, on what basis do we think that the restrictions being proposed are “necessary”, in the words of Article 52 of the Charter? We don’t know, and can’t know, how much legal content will be deleted under the proposed private enforcement of a non-harmonised hate speech law by private companies – and we know that legal content is being affected. On what basis do we think that the restrictions will genuinely achieve objectives of general interest, when, for example, the Commission’s code of conduct has absolutely no mechanism for establishing whether the measures are proportionate or, indeed, might be counterproductive? These are the standards that the law needs to uphold in practice.
Are the proposed rules in the AVMSD good enough to respect the “prescribed by law” requirement of the European Convention on Human Rights (ECHR) or the “provided for by law” requirement of the Charter?
Firstly, the notions of “self-” and “co-regulation” are not clear. Most of the Directive is about traditional broadcasters, where “self-regulation” means broadcasters regulating themselves. However, “self-regulation” in the online world generally involves internet companies regulating their users. This is a fundamentally different thing and creates a first layer of ambiguity.
Underlying the measures on “hate speech” we have the Framework Decision, whose definition is already very broad and not in line with international standards. This is made more unpredictable by very varied transpositions in Member States. It is further undermined by a failure to enforce it in some Member States. This adds a second layer of ambiguity.
The Commission’s AVMSD proposal builds on this weak background. It mixes potentially harmful material with material that would be illegal under the Framework Decision.
It creates confusion by saying, in Recital 8, that the definitions of “incitement to hatred and violence” should only be aligned “to the appropriate extent” with the law. So the transposing law restricting hate speech in the AVMS context would not necessarily have to be in line with the existing law on hate speech.
Recital 30 and Article 28a(5) say that, with respect to these restrictions on content, Member States shall not be precluded from imposing stricter measures on “illegal content”… which implies that legal content is being restricted. This adds a third layer of ambiguity.
So, we have a Racism and Xenophobia Framework Decision which is patchily transposed and patchily implemented. We have a draft AVMS Directive which requires action to be taken against material that incites hatred or violence, where the definition of such content should be aligned, “to the appropriate extent”, with the law.
We have private companies enforcing the law on the basis of so-called “self-regulation” or “co-regulation”, and Member States that can implement stricter rules than those applied to… content that is supposed to be illegal. Illegal according to whom? To the Member States that are unable to agree on what is illegal? To big companies whose terms of service do not necessarily follow the law? Do we want companies to decide what should be restricted? When did we decide to abandon the Charter of Fundamental Rights and hand over the regulation of incitement to hatred to private companies, driven by commercial interests?
Separately, we have to consider basic principles of equality before the law. Do we expect big US companies – say, Facebook, Google or Twitter – to treat all users equally? Are they under any obligation to do so? The business priorities of private companies simply do not allow them to treat celebrities or politicians in the same way as other citizens.
Nine months have passed since the adoption of the EU Code of Conduct against hate speech. We haven’t seen any other company join this code, and we don’t see meaningful results from it. Is this the model we want to replicate in law?
In summary, the EU appears not to have the political will to adopt laws based on global human rights standards. The stop-gap solution is to adopt measures in law and codes of conduct which fail to respect basic rules on the quality of law. The danger of arbitrary enforcement is real. The danger of counterproductive effects is real.
The speech was delivered by Joe McNamee at an expert discussion on the proposal for an updated Audiovisual Media Services (AVMS) directive, organised by Hanse-Office, the joint representation of the Free and Hanseatic City of Hamburg and the State of Schleswig-Holstein to the EU, on 2 March 2017.