Fighting defamation online – AG Opinion forgets that context matters

By EDRi · June 19, 2019

On 4 June 2019, Advocate General (AG) Maciej Szpunar of the Court of Justice of the European Union (CJEU) delivered his Opinion in the Glawischnig-Piesczek v Facebook Ireland case, which concerns injunctions obliging a service provider to stop the dissemination of a defamatory comment. The Opinion merits careful attention, as the final ruling of the CJEU usually follows the lines of the AG’s Opinion.

The case involves Ms Glawischnig-Piesczek, an Austrian politician, who was the target of a defamatory comment shared publicly on Facebook. As Facebook did not react to her first request to delete that comment, Ms Glawischnig-Piesczek asked the Austrian courts to issue an order obliging Facebook to remove the publication and prevent its dissemination, including exact copies of the original comment as well as “equivalent content”. After the first court injunction, Facebook disabled access in Austria to the content initially published. Ultimately, the Supreme Court of Austria, before which the case was brought, referred to the CJEU several questions relating to the geographical scope of such an injunction, as well as its application to statements with identical wording or equivalent meaning. As Facebook is not necessarily aware of all identical or equivalent content, the upcoming judgment of the CJEU will be essential for the interpretation of the E-Commerce Directive, notably its Articles 14 and 15.

In his Opinion, the AG states that a hosting provider such as Facebook can be ordered to seek and identify, among all the information disseminated by users of that platform, content identical to the content that a court has characterised as illegal. Moreover, the hosting provider may be required to search for equivalent content, but only among the content disseminated by the user who generated the illegal information in the first place.

The Opinion is interesting for two reasons: first, it reflects on how to distinguish between general and specific monitoring of content by hosting providers; second, it tries to draw a line between “identical” and “equivalent” content.

AG Szpunar starts by expressing great concern that an obligation on an intermediary to filter all content would make it aware of illegal content, thus causing the loss of its liability exemption under Article 14 of the E-Commerce Directive. In the present case, the referring court has established that Facebook falls under Article 14, so the active-passive host distinction is not further explored in the Opinion. The upcoming CJEU case on the liability of YouTube for user uploads (C-682/18) will undoubtedly revisit this question. However, the AG does not preclude the possibility of imposing “active” monitoring under the provisions of Article 15 of the same Directive. He recalls the conclusions of the L’Oréal v eBay case (C-324/09), which limit the preventive obligation (i.e. “filtering”) to “infringements of the same nature by the same recipient of the same rights, in that particular case trade mark rights” (point 45). For a monitoring obligation to be specific and sufficiently targeted, the AG mentions the criterion of duration, but also information relating to the nature of the infringements, their author and their subject. This raises the question of how the monitoring can be limited in time and stopped once a specific case is declared to be over.

Applying these principles to the present case, the AG believes that a monitoring obligation for “identical content” among information generated by all users would ensure a fair balance between the fundamental rights involved. His argument is found in points 61 and 63, where he speculates that seeking and identifying identical content can be done with passive “software tools” (i.e. upload filters), which does not represent “an extraordinary burden” for the intermediary.
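
To illustrate what such passive “software tools” might look like, here is a minimal, purely hypothetical sketch in Python of exact-match filtering based on a hash of normalised text. Nothing in it is drawn from the Opinion or from Facebook’s actual systems; the normalisation rules, function names and example strings are our own assumptions. It shows both why this kind of matching is cheap to automate and why it is blind to who posts the text and why.

```python
import hashlib
import re

def normalise(text: str) -> str:
    """Lower-case and strip punctuation and extra whitespace, so that
    trivial variations of the same comment still match."""
    text = re.sub(r"[^\w\s]", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def fingerprint(text: str) -> str:
    """Hash of the normalised text: posts with the same fingerprint
    are treated as "identical" content."""
    return hashlib.sha256(normalise(text).encode("utf-8")).hexdigest()

# Fingerprint of the court-assessed comment (placeholder wording).
BLOCKED = {fingerprint("Politician X is corrupt")}

def must_remove(post: str) -> bool:
    """Flag any post whose fingerprint matches a blocked comment."""
    return fingerprint(post) in BLOCKED

print(must_remove("politician x is corrupt!"))     # True: verbatim copy
print(must_remove("Politician X is corrupt"))      # True: even when quoted in reporting
print(must_remove("Politician X is not corrupt"))  # False: a one-word change escapes
```

The converse failure, where a minor rewording defeats the exact match, is what pushes the Opinion towards the notion of “equivalent” content.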

This is where the distinction with “equivalent” content is drawn: equivalent content would require more “active non-automatic filtering” by the intermediary of all the information disseminated via its platform. What is meant by non-automatic filtering is not entirely clear. The distinction in the mind of the AG could be between automatic filtering that never requires manual intervention to ensure a fair balance with other fundamental rights (freedom of expression and the right to information, in particular), and non-automatic filtering that does require such intervention in order to avoid situations similar to the Netlog case (C-360/10), in which the CJEU ruled that a preventive filtering system applied indiscriminately to all users was incompatible with the Charter of Fundamental Rights of the European Union.

Unfortunately, a distinction along these lines seems ill-suited to the case at hand, which concerns defamation. Specific words that are defamatory in the present case could be used in other contexts without being defamatory. Obvious examples would be counterspeech, irony among friends, or even news reporting. The situation is really the same for content defined as identical or equivalent: context matters, and automated algorithms will not be able to make the fine-grained decisions about when the use of certain words (whether copied verbatim, that is identical content, or with changes, meaning equivalent content) is legal or illegal. A filtering obligation for identical content will have the same negative effect on freedom of expression and the right to information as a filtering obligation for equivalent content.
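
The point can be made concrete with a toy “equivalent content” matcher. The sketch below is again entirely hypothetical: the similarity measure, the 0.8 threshold and the example posts are invented for illustration. Counterspeech that negates the statement scores over 0.9 in surface similarity to the original and would be removed, precisely because the algorithm sees words, not context.

```python
from difflib import SequenceMatcher

# Placeholder for the comment a court has found defamatory.
ORIGINAL = "politician x is corrupt"

def similarity(a: str, b: str) -> float:
    """Character-level surface similarity in [0, 1]; meaning plays no role."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.8  # arbitrary cut-off such a filter might apply

posts = [
    "Politician X is corrupt",         # the original defamatory comment
    "Politician X is not corrupt",     # counterspeech refuting it
    "The weather in Vienna is lovely", # unrelated speech
]

for post in posts:
    score = similarity(ORIGINAL, post)
    verdict = "REMOVE" if score >= THRESHOLD else "keep"
    print(f"{score:.2f}  {verdict:6}  {post!r}")
```

Whatever threshold is chosen, the matcher removes the negation along with the original, which is precisely the loss of context described above.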

The present case will be particularly important for defining the distinction between specific monitoring and general monitoring, where there is presently very little case law. Since Article 15(1) of the E-Commerce Directive prohibits general monitoring, specific monitoring is, by implication, any monitoring that is compatible with the E-Commerce Directive, interpreted in the light of the Charter of Fundamental Rights. Only the L’Oréal v eBay case (C-324/09) has dealt with this issue. Compared to that earlier case, the AG proposes an expanded definition of specific monitoring, which has the notable disadvantage of being rather unworkable, since it relies on a flawed dichotomy between identical and equivalent content. This dichotomy is disconnected from the legal reality that specific monitoring must comply with the Charter of Fundamental Rights and prevent the risk of censorship resulting from a filtering obligation. Hopefully, the judgment in the case will present a more workable definition of specific monitoring that is reconcilable with both Articles 14 and 15 of the E-Commerce Directive.

Case C-18/18: Eva Glawischnig-Piesczek v Facebook Ireland Limited
https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-06/cp190069en.pdf

Legal victory for trademark litigants over intermediary liability (13.07.2011)
https://edri.org/edrigramnumber9-14ebay-loreal-case-ecj/

SABAM vs Netlog – another important ruling for fundamental rights (16.02.2012)
https://edri.org/sabam_netlog_win/

(Contribution by Chloé Berthélémy, EDRi, and Jesper Lund, EDRi member IT-Pol, Denmark)