20 Oct 2017

See which MEPs voted in favour of e-Privacy – and which ones against it


On 19 October, the European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE) voted on the proposed e-Privacy Regulation. With 31 votes the Committee voted in favour of measures defending privacy, security and competition for phone and internet services.

The 31 MEPs in favour of the e-Privacy Regulation belong to the Alliance of Liberals and Democrats for Europe group (ALDE), the Europe of Freedom and Direct Democracy (EFDD), the European United Left-Nordic Green Left (GUE/NGL), the Non-Inscrits (NI), the Progressive Alliance of Socialists and Democrats (S&D) and the Greens–European Free Alliance (Verts/ALE):

Name EU political group Country National party
Gérard Deprez ALDE Belgium Mouvement Réformateur (MR)
Nathalie Griesbeck ALDE France Mouvement démocrate (MoDem)
Sophia in ‘t Veld ALDE Netherlands Politieke Partij Democraten 66 (D66)
Kaja Kallas ALDE Estonia Eesti Reformierakond
Angelika Mlinar ALDE Austria NEOS – Das Neue Österreich und Liberales Forum
Ignazio Corrao EFDD Italy Movimento 5 Stelle (M5S)
Laura Ferrara EFDD Italy Movimento 5 Stelle (M5S)
Xabier Benito Ziluaga GUE/NGL Spain Podemos
Cornelia Ernst GUE/NGL Germany Die LINKE
Josu Juaristi Abaunz GUE/NGL Spain Euskal Herria Bildu (EH Bildu)
Dennis de Jong GUE/NGL Netherlands Socialistische Partij (SP)
Martin Sonneborn NI Germany Die PARTEI
Caterina Chinnici S&D Italy Partito Democratico (PD)
Ana Gomes S&D Portugal Partido Socialista (PS)
Sylvie Guillaume S&D France Parti socialiste (PS)
Sylvia-Yvonne Kaufmann S&D Germany Sozialdemokratische Partei Deutschlands (SPD)
Cécile Kashetu Kyenge S&D Italy Partito Democratico (PD)
Dietmar Köster S&D Germany Sozialdemokratische Partei Deutschlands (SPD)
Marju Lauristin S&D Estonia Sotsiaaldemokraatlik Erakond (SDE)
Juan Fernando López Aguilar S&D Spain Partido Socialista Obrero Español (PSOE)
Andrejs Mamikins S&D Latvia Sociāldemokrātiskā Partija “Saskaņa” (SDPS)
Claude Moraes S&D United Kingdom Labour Party
Péter Niedermüller S&D Hungary Demokratikus Koalíció (DK)
Kati Piri S&D Netherlands Partij van de Arbeid (PvdA)
Soraya Post S&D Sweden Feministiskt initiativ (FI)
Birgit Sippel S&D Germany Sozialdemokratische Partei Deutschlands (SPD)
Janusz Zemke S&D Poland Sojusz Lewicy Demokratycznej (SLD)
Jan Philipp Albrecht Verts/ALE Germany Bündnis 90/Die Grünen
Eva Joly Verts/ALE France Europe Écologie Les Verts (EELV)
Judith Sargentini Verts/ALE Netherlands GroenLinks
Bodil Valero Verts/ALE Sweden Miljöpartiet de gröna (MP)


The 24 MEPs against the e-Privacy Regulation belong to the European Conservatives and Reformists (ECR), the Europe of Freedom and Direct Democracy (EFDD), the Europe of Nations and Freedom (ENF) and the European People’s Party (PPE):

Name EU political group Country National party
Daniel Dalton ECR United Kingdom Conservative Party
Jussi Halla-aho ECR Finland Perussuomalaiset (PS)
Monica Macovei ECR Romania Independent
Helga Stevens ECR Belgium Nieuw-Vlaamse Alliantie (N-VA)
Gerard Batten EFDD United Kingdom UK Independence Party
Raymond Finch EFDD United Kingdom UK Independence Party
Harald Vilimsky ENF Austria Freiheitliche Partei Österreichs (FPÖ)
Auke Zijlstra ENF Netherlands Partij voor de Vrijheid (PVV)
Asim Ahmedov Ademov PPE Bulgaria ГЕРБ, Граждани за европейско развитие на България
Heinz K. Becker PPE Austria Österreichische Volkspartei (ÖVP)
Michał Boni PPE Poland Platforma Obywatelska (PO)
Anna Maria Corazza Bildt PPE Sweden Moderata samlingspartiet (M)
Rachida Dati PPE France Les Républicains (LR)
Monika Hohlmeier PPE Germany Christlich-Soziale Union in Bayern (CSU)
Lívia Járóka PPE Hungary Fidesz
Barbara Kudrycka PPE Poland Platforma Obywatelska (PO)
Roberta Metsola PPE Malta Partit Nazzjonalista (PN)
Alessandra Mussolini PPE Italy Forza Italia (FI)
József Nagy PPE Slovakia Most–Híd
Csaba Sógor PPE Romania Uniunea Democrată Maghiară din România
Jaromír Štětina PPE Czech Republic Tradice Odpovědnost Prosperita (TOP 09)
Traian Ungureanu PPE Romania Partidul Național Liberal (PNL)
Axel Voss PPE Germany Christlich Demokratische Union Deutschlands (CDU)
Tomáš Zdechovský PPE Czech Republic Křesťanská a demokratická unie – Československá strana lidová (KDU–ČSL)


One of the MEPs of the Committee was absent:

Name EU political group Country National party
Kristina Winberg EFDD Sweden Sverigedemokraterna (SD)


Euro-parliamentarians say a clear “no” to the anti-privacy lobby (19.10.2017)

e-Privacy Directive: Frequently Asked Questions

e-Privacy revision: Document pool

Quick guide on the proposal of an e-Privacy Regulation (09.03.2017)

Dear MEPs: We need you to protect our privacy online! (05.10.2017)


20 Oct 2017

EDRi writes to EU Commissioner Gabriel about tackling illegal content online

By Ana Ollo

On 28 September 2017, the European Commission published a Communication on “Tackling Illegal Content Online: Towards an enhanced responsibility of online platforms”.

In order to be constructive and support the European Commission in developing a balanced, rights-friendly and harmonised approach to dealing with illegal content online, EDRi has written a letter to the Digital Economy Commissioner Mariya Gabriel. We propose that the Commission adopt at least three workstreams: a “fundamental rights framework”, “learning from experience”, and “effective and predictable frameworks for addressing illegal content”.

That Communication puts the emphasis on demanding that online platforms take more action to tackle illegal content online by preventing, detecting, removing and disabling access to it on a “voluntary” basis. While the Commission encouraged online platforms to adopt proactive measures and to use filtering technologies to that end, little attention was paid to review mechanisms, counter-productive effects, mitigation or identification of problems, and legal predictability, among other issues.

The Commission’s Communication was widely criticised by EU policy-makers like Members of the European Parliament (MEPs) Marietje Schaake and Julia Reda, as well as European companies, EDRi, EDRi members, other civil society organisations, and academics. For example, Daphne Keller from Stanford Law School expressed her concerns about filtering technologies, as well as about the risk of over-removal of content that can arise from online platforms’ compliance with the Communication’s guidelines.

So, what is needed? Three workstreams that build a diligent, rights-based and comprehensive approach to this problem: (1) an agreed understanding of the fundamental rights framework, (2) a structured approach to learning from experience, and (3) a methodology for building effective policies in the future.

1. Fundamental rights framework

First, we recommend developing clear guidelines on government or European Commission compliance with the requirements of Article 52.1 of the Charter of Fundamental Rights of the European Union when they design, promote or participate in “voluntary” or mandatory measures that may restrict fundamental rights. Article 52.1 of the Charter establishes that any limitation on the exercise of fundamental rights needs to be provided for by law and respect the principles of proportionality and necessity. Any regime for tackling illegal content on the internet needs, in particular, to ensure robust protection for due process, safeguards against removal of legal content, and an effective right to remedy.

2. Learning from experience

Secondly, we propose that the Commission conduct a neutral assessment of what has and has not worked in EU and Member State initiatives to deal with illegal content online. This could cover a full range of impact assessments, both on fundamental rights and on the public policy results of the measures that have been attempted.

3. Building a flexible, effective approach to illegal online content

Thirdly, we strongly encourage the Commission to develop an effective and flexible methodology for tackling different types and areas of illegal content online. This would ensure a diligent approach to fundamental rights, problem identification, review processes and contingencies. The methodology could include issues such as how the necessity, proportionality and predictability of the measures are guaranteed, risks for citizens, providers and governments and which review, redress and oversight mechanisms are available.

We hope that the letter can be a first step towards building a more effective approach, to protect the principles on which our democracies are based and to ensure a more effective, comprehensive approach to illegal online content.

Letter: A coherent and rights-based approach to dealing with illegal content (20.10.2017)

Commission’s position on tackling illegal content online is contradictory and dangerous for free speech (28.09.2017)


19 Oct 2017

Euro-parliamentarians say a clear “no” to the anti-privacy lobby


On 19 October, the European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE) voted on the proposed e-Privacy Regulation. The Committee voted in favour of measures defending privacy, security and competition for phone and internet services.

Despite a huge lobbying effort to water down the proposal, the Committee voted for clear, privacy-friendly rules. We welcome this approach, as it will not just protect citizens, but promote competition and innovation as well,

said Joe McNamee, Executive Director of European Digital Rights (EDRi).

Currently, when people surf the internet, use apps on their mobile phone or use connected devices, they are monitored, tracked and profiled. The massive amounts of data that are generated create privacy risks, security risks, economic risks and, as we have seen recently, risks for democracy itself.

EDRi believes that clear rules, based on consent and transparency, will help move us away from a dysfunctional market that undermines trust and security. This means clear rules for privacy by default in our hardware and software; it means protection of the content of our communications and of the metadata generated by our phone calls and electronic messages.

The European Union is revising its legislation on data protection, privacy and confidentiality of communications in the electronic communications environment: the e-Privacy Regulation. This piece of legislation contains specific rules related to your freedoms in the online environment. It has been under review since 2016, after the adoption of the new General Data Protection Regulation.

Once the Parliament formally terminates this part of the legislative process, the next stage will be negotiations with the EU Member States in the Council.

Read more:

e-Privacy Directive: Frequently Asked Questions

e-Privacy revision: Document pool

Quick guide on the proposal of an e-Privacy Regulation (09.03.2017)

Dear MEPs: We need you to protect our privacy online! (05.10.2017)


18 Oct 2017

Success Story: A win on Austrian surveillance legislation

By Epicenter.works

The security debate in many countries shows an alarming trend towards restricting fundamental rights that liberal societies have codified over past centuries. Particularly in the field of surveillance, recent legislation often goes beyond what courts have deemed constitutional and lacks any fact-based justification as to how the measures are supposed to prevent or mitigate serious crime. Advocacy groups rarely achieve a victory in this debate. It is therefore worth taking a closer look at a recent example from Austria, where such legislation was prevented by EDRi member epicenter.works.

In January 2017, the Austrian government agreed on a “security package”. The proposed legislation spanned several seemingly unconnected areas: the legalisation of government spyware, restrictions on the right to assembly, real-time CCTV systems, a centralised database of registered licence plates (piggybacked on existing legislation for licence plate tolls), IMSI catchers, a prohibition of anonymous pre-paid SIM cards, and the detention of people who pose “potential threats” before conviction or even charges.
Epicenter.works published an analysis of the government agreement on the day of its release and started a campaign called “surveillance package”. In the course of seven months, 10 out of 12 proposed measures were abandoned. This is how epicenter.works achieved these outstanding results in its campaigning.

Separating the security debate from the surveillance spiral

Terrorism in Europe is much like a wasp trying to destroy a china shop. The wasp alone can never achieve its aim, but if people react with panic, the result will be destruction.

The hard truth is that modern terrorism with cars and self-made bombs can never be completely avoided. Political decisions are rarely guided by facts and analysis of previous attacks to reduce the chance of future incidents, due to the political need to propose “something” to reassure the population. Instead, they rely on the assumption that increased surveillance automatically leads to more security. In an effort to stress that this is a fallacy, epicenter.works coined the term “surveillance package” and made it clear that these reforms were unlikely to bring about a higher level of security. By the end of the campaign, this wording had been picked up not only by the whole opposition, but also by the media, which was a great success.

The “surveillance package” proposed by the Austrian government had the same central weakness as most recently proposed surveillance measures in Europe: a complete lack of justification in quantifiable criminological terms. Part of the “surveillance package” campaign was to define an “objective security package” with measures that are actually adequate and important for security – such as multilingual police and better police training – none of which involves more surveillance powers. This builds on epicenter.works’ attempts to create a scientific framework for evaluating anti-terrorism measures.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

When a government restricts the fundamental rights of its citizens, the burden of proof always lies with the legislator as to why those restrictions are efficient, necessary, and proportionate. Questioning the security benefits of the proposed laws helped tremendously to undermine them as ineffective and purely opportunism-driven surveillance legislation.

Combine jurisprudence, technology and design in one team

From the first draft of the government’s work programme to the proposed legislation, epicenter.works provided in-depth analysis from a legal and technical perspective. The complex analysis was complemented by single-page synopses of important questions and by infographics. Three press conferences, eight demonstrations in six cities, and thousands of tweets, videos and images, as well as countless discussions with politicians and other stakeholders, followed. The outcome was a combination of policy-driven and campaign-driven approaches that had great impact.

Make the many voices heard

For the first iteration of this campaign, epicenter.works developed a new telephone tool that allowed citizens to subscribe to daily calls with their representatives. In a second stage, the campaign publicised the parliamentary consultation on the proposed legislation to a wider public, eclipsing all previous consultations threefold with a total of 18 094 responses and crashing the consultation website. In the final stage, all 6 350 public statements were analysed, and visibility was given to the arguments of institutions like the Austrian high court, the lawyers’ association, and the Red Cross.


The main lesson to take away from this is that one size doesn’t fit all. The “surveillance package” campaign was a success thanks to the diversity in methodologies, flexibility to react to new circumstances, and a multidisciplinary team of experts, grassroots campaigners, and media strategists. However, the most important detail that allowed this success was reframing the discussion away from surveillance and towards a broader security debate. In that new framing, privacy advocates will be able to propose more effective and fact-based solutions that do not rely on surveillance.

Surveillance package campaign (only in German)

Requirements for a comprehensive surveillance footprint evaluation (only in German)

(Contribution by EDRi member epicenter.works, Austria)



18 Oct 2017

Dutch NRA: T-Mobile may continue to violate net neutrality

By Bits of Freedom

EDRi member Bits of Freedom asked the Dutch national regulatory authority, the Authority for Consumers and Markets (ACM), to test the mobile operator T-Mobile’s subscription against the new European net neutrality rules, to verify whether its “Data-free Music” service infringes the principle of net neutrality. On 11 October 2017, the ACM ruled that T-Mobile was compliant with the law. Bits of Freedom disagrees and will challenge this decision.

The ACM states in its decision that T-Mobile does not discriminate against music services, “as long as they comply with the conditions that T-Mobile imposes” – which is exactly the problem! T-Mobile is imposing all kinds of conditions upon these music services and thereby decides which ones are entitled to preferential treatment. For example, if a music service does not want T-Mobile to use its logo, it is out of luck. If a music service is not able or willing to adapt its systems to the whims of T-Mobile, it is left out in the cold.

The ACM also thinks that the freedom of the music service providers or consumers is not being constrained. That is hardly surprising when you realise that the ACM seems to have no regard for the impact of services like Data-free Music on the innovative nature of the internet. One of the powerful properties of the internet is precisely that every computer, and therefore every service and user, is as easy to reach as any other. A service like T-Mobile’s flies in the face of that idea by giving some services preferential treatment.

Based on the text of its decision, the ACM has also not taken into account what other operators will do if this is allowed and how that will impact the freedoms of internet users.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

With this decision, the ACM undermines the innovative power of the internet for all of Europe. Because of the leading role that the Netherlands has played for years in the field of net neutrality, everybody in Europe is looking at enforcement in the Netherlands. In the rest of Europe, several supervisors are looking into similar subscriptions at other operators. But they have not ruled yet, waiting for the ACM’s decision. The fact that the ACM did not strictly enforce the principle of net neutrality is a missed opportunity. Bits of Freedom will challenge this decision.

Bits of Freedom has advocated for net neutrality for more than seven years. The Netherlands was the first country in Europe to anchor net neutrality in law, in 2012. Since the end of 2015, similar rules have applied across Europe, and operators have been constantly testing their limits. T-Mobile did this with the Data-free Music service, among other things. Earlier, the ACM had already ruled that this service was out of line with the Dutch interpretation of these European rules. T-Mobile went to court over this and won. Bits of Freedom then filed an enforcement request. On 11 October 2017, the ACM decided on the request.

If you get access to the internet, you should get access to the entire internet. You, and not your operator, should decide what you do on the internet. A service like Data-free Music, a subscription in which music services can provide you with music without using up your data plan, sounds like a great offer: listening to Spotify on your mobile phone without the risk of running out of data or paying more. And in the short term, it could be. But if operators offer one or more services at a cheaper price than their competitors, this yields an unfair advantage, taking away the opportunity for competitors and newcomers to compete for customers on their own merits.

T-Mobile’s subscription only works with a small number of music services. If a provider can’t or won’t comply with T-Mobile’s conditions, it is left out. Even though T-Mobile claims that every service can participate, the reality is more problematic: months after the introduction, only a handful of music services have joined.

Dutch NRA: T-Mobile may continue to violate net neutrality (11.10.2017)

T-Mobile may continue with Data-free Music (only in Dutch, 11.10.2017)

(Contribution by EDRi member Bits of Freedom, the Netherlands)



18 Oct 2017

Extending the use of eID to online platforms – risks to privacy?

By Anne-Morgane Devriendt

On 10 October 2017, the European Commission published the “draft principles and guidance on eID interoperability for online platforms” on the electronic Identification And Trust Services (eIDAS) observatory. Building on the eIDAS Regulation, the Commission would like to extend the scope of use for the eIDs to online platforms, in addition to public services. This raises a number of issues, particularly on the protection of privacy.

The eIDAS Regulation, adopted in 2014, is part of the “European eGovernment Action Plan 2016-2020”. It aims at making all Member State-issued eIDs recognisable by all Member States from 28 September 2017. By extending the scope of use of eIDs to “online platforms” in general, and not only public services, the Commission is trying to make authentication easier and more secure, as the eID itself would allow logging in. This would address some of the issues raised by the use of passwords as the main authentication method. It would also be more convenient for users, who could use the same eID across different platforms.

However, as presented in the Commission’s document, the guidelines raise a number of issues, such as the lack of a definition of “online platforms”. As the eIDAS Regulation concerns access to public services throughout the EU with the same, government-approved eID, it appears that “online platforms” refers to the private sector. “Online platforms” are defined, to a certain extent, in the Commission’s Communication on Online Platforms. However, the characteristics used are so wide that they encompass both online sales websites and social media platforms.


The second issue is the protection of privacy. The draft document states that “users should be able to preserve a level of privacy and anonymity, e.g. by using a pseudonym”. The failure to understand the basic notion that anonymity and pseudonymisation are fundamentally different is worrying. It is, or should be, obvious that using one’s eID to authenticate oneself would allow the platform to link the pseudonym to the real identity and personal information. Furthermore, while it might be useful for online sales platforms to make sure transactions are taking place between real people, it defeats the purpose of using a pseudonym on social media, which is to prevent one’s online activities from being linked to one’s real identity.

Finally, while the Commission sets the direction to make authentication easier for both platforms and users through the use of the eID, it does not provide guidelines on the implementation of privacy by default. Such guidelines would make sure that online platforms only have access to authentication information and do not use it for other purposes. One of the safeguards for the use of eIDs to access public services is the ability to monitor which public servant accessed the data and when. However, regarding the use of eIDs for authentication on online platforms, there is no provision in the draft guidelines that would make sure that data are properly secured.

Bearing in mind the huge and varied damage caused to Facebook users by its “real names” policy, the risks of this project being used by certain online platforms are real and significant.

All interested stakeholders can communicate their opinion on this draft to the Commission before 10 November 2017 through the eIDAS observatory post or by email.

Draft principles and guidance on eID interoperability for online platforms – share your views! (10.10.2017)

Workshop: Towards principles and guidance on eID interoperability for online platforms (24.04.2017)

Communication from the Commission – Online Platforms and the Digital Single Market: Opportunities and Challenges for Europe (25.05.2016)

(Contribution by Anne-Morgane Devriendt, EDRi intern)



17 Oct 2017

EU Council legal services unclear about censorship filters

By Diego Naranjo

On 16 October 2017, Politico leaked the response from the Legal Services of the Council of the European Union (CLS) to the questions raised by six Member States about the legality of the upload filter proposal in Article 13 of the proposed Copyright Directive. As the censorship filter is about restricting fundamental rights, it is regrettable that the questions did not mention the rules under which restrictions can be imposed, namely Article 52.1 of the Charter of Fundamental Rights of the European Union. This means that key issues were not addressed in the CLS response.

Written in anticipation of being leaked, the document carefully says nothing clearly, speculating instead about future interpretations and ending with a “to be continued” clause, because, as the CLS correctly points out, the “confusing terms in which that [explanatory] recital is drafted raise various legitimate questions to which, regrettably, no answers are given…”

Censorship filters: Not in my name… or maybe yes

Paragraph 12 of the document repeats the views of the Court of Justice of the European Union (CJEU) regarding the upload filters proposed in the Sabam/Netlog and Telekabel cases (in both cases the Court ruled against filtering) by saying that the measures to be put in place to prevent copyright infringements should be “the most affordable and effective ones among those available, taking into account elements such as ‘the resources and abilities available’ to them, the specificities and the needs of their services, as well as their size”.

The document correctly states that content recognition technology (i.e. upload filtering) is not explicitly imposed by Article 13. This is because the article is designed to coerce companies into imposing the measure, rather than explicitly mandating that they do so. As a result, they can choose, “through cooperation with the right holders, the most appropriate and proportionate measure to achieve the objective pursued”. Is that “provided for by law”, “necessary”, or does it “genuinely achieve objectives of general interest”, as required by Article 52.1 of the Charter of Fundamental Rights? The answer isn’t given because the question wasn’t asked. Are the rules for restricting fundamental rights “clear and precise”, as the Court has repeatedly insisted they must be? It is impossible to read previous rulings of the CJEU and come to the conclusion that Article 13 achieves minimum standards of predictability.

If you’re censored, please contact the company customer services

Moving away from the basic principle of law that fundamental rights are the rule and exceptions must be clearly provided for, the CLS builds on the notion of corporate rights. The rights of the internet companies are not restricted, because they will be wise enough to buy, and able to afford, the level of filter that the courts will deem appropriate. Can we realistically assume that such appropriate, affordable, necessary, proportionate, non-intrusive, accurate, up-to-date, multiple (text, image, video, audio…) filters exist?

The CLS states that there is no breach of freedom of expression if a company takes down legal content due to failures of the system, since you can always contact the relevant online platform. As long as the provider does not take the cheap and easy option of claiming the upload was a terms of service violation, as long as it does not answer too slowly, as long as the file identifiers it was given were correct, and as long as the usage was not covered by a copyright exception in one country but not in another, the CLS may have a point.

How this is going to work in practice for thousands of uploads taking place every minute of every day, and with many examples already known of these technologies failing, is a mystery. Will users get used to the idea of getting their blogposts, memes, videos or audios potentially censored by a company? Will this need to be taken to a court once the redress mechanism fails? Legal services have no response to that.

Data protection risks? No… but maybe yes!

Later on, when analysing the concerns raised about the right to data protection being affected by upload filters, the CLS replies “no”. The logic is simple: restrictions of freedom of expression do not breach your freedom of expression, because you can complain (if the company does not take the easy option of claiming a terms of service violation). Similarly, this activity is not a breach of your data protection rights if no personal data is processed. What is not explained is how you can complain to a company about it filtering out your uploads, if the company you are complaining to has processed no personal data and therefore does not know that it has filtered out your uploads.

Unsurprisingly, the document concludes that “in view of any technological developments, the envisaged proposed measures may lead to an interference with the right to personal data protection, further guarantees will need to be foreseen”.

Legal confusion regarding the e-Commerce Directive

When analysing whether the Copyright Directive impacts the e-Commerce Directive (contrary to the assertions of the European Commission), the CLS admits that the Commission’s proposal text for the Directive gives no clear answer. The CLS says that “the confusing terms in which that recital is drafted raise various legitimate questions to which, regrettably, no clear answers are given”. The CLS seems comfortable with the notion that it is proportionate and legally possible to impose one (harsher) interpretation for copyright, through the Copyright Directive, while leaving a less harsh regime in place for terrorism and child abuse. It may well be legally possible for one set of words (the e-Commerce Directive) to end up with two different meanings. It may well be legally possible to interpret those words in a way that treats the upload of a meme to YouTube as worse than uploading a beheading video. It may well be.

This document ends with two conclusions. First, the CLS replies to the Member States that asked about the upload filter proposals that the measures in Article 13 are not necessarily disproportionate (but they might be, and nobody actually knows what they will be). Secondly, it states that there is no legal certainty regarding recital 38 of the Copyright Directive and its impact on the e-Commerce Directive.

Leak: Council legal service supports more clarity on proposed copyright reform (16.10.2017)

Six states raise concerns about legality of Copyright Directive (05.09.2017)

Leaked document: Questions from Member States to the Council legal services on the Censorship Machine (05.09.2017)

EU countries question legality & attack on fundamental rights

Proposed Copyright Directive – Commissioner confirms it is illegal (28.06.2017)

(Contribution by Diego Naranjo and Joe McNamee, EDRi)


17 Oct 2017

Privacy Camp 2018: Speech, settings and [in]security by design

By Kirsten Fiedler

Join us for the 6th annual Privacy Camp! Privacy Camp will take place on 23 January 2018 in Brussels, Belgium, just before the start of the CPDP conference. Privacy Camp brings together civil society, policy-makers and academia to discuss existing and looming problems for human rights in the digital environment. In the face of a “shrinking civic space” for collective action, the event aims to provide a platform for actors from across these domains to discuss and develop shared principles to address key challenges for digital rights and freedoms of individuals.

Our theme this year is “speech, settings and [in]security by design”. The event’s two main tracks will therefore focus, on the one hand, on the security of devices and infrastructure, and on the other, on areas and policies where legitimate speech is endangered. Participate!

The first track will include sessions on state hacking and malware, law enforcement access to user data (so-called “e-evidence”), and device security, with hands-on tutorials on how to better protect your communications.

The second track will include sessions on algorithmic decision-making and discrimination via big data, privacy-invasive measures to censor legitimate speech online, and the hacking of democracies via the spread of misinformation and propaganda.

The event is co-organised by European Digital Rights (EDRi), Privacy Salon, USL-B Institute for European Studies and VUB-LSTS. Participation is free. Registrations will open in early December.


We invite you to propose a panel for one of these two tracks:

Track 01 [in]security of devices
Topics: #statehacking #encryption #surveillance #statemalware #Eevidence #security #technopolitics

Track 02 [in]security of speech
Topics: #uploadfilters #censorship #algorithms #discrimination #accountability #hackingelections #misinformation #propaganda

When submitting your proposal:

  • Indicate a clear objective for your session: What would be a good outcome for you?
  • Indicate other speakers that could participate in your panel (and let us know which speaker has already confirmed, at least in principle, to participate).
  • Make your session as participatory as possible: think about how to include the audience and diverse actors.
  • Send us a description of no more than 500 words.
  • Deadline for submissions is 20 November.

After the deadline, we will review your submission and let you know by 6 December whether your panel can be included in the programme. If proposals are very similar, we may suggest merging them.

Please send your proposal via email to Maren <edri.intern3(at)edri(dot)org>!

If you have questions, please contact Kirsten <kirsten.fiedler(at)edri(dot)org> or Imge <imge.ozcan(at)vub(dot)be>.

16 Oct 2017

EU’s plans on encryption: What is needed?

By Maryant Fernández Pérez

On 18 October 2017, the European Commission is expected to publish a Communication on counter-terrorism, which will include some lines on encryption.

What is encryption? Why is it important?

When we send an encrypted message (or store an encrypted document), no one but the intended recipient, who holds the unique key, can read it. So even if someone manages to intercept the message on its way to the recipient, they will not be able to read its contents without the key – they can only see something that looks like a random set of characters. Encryption ensures the confidentiality of our personal data and company secrets. This is not only essential for our democratic rights and freedoms, but it also promotes trust in our digital infrastructure and communications, which is vital for innovation and economic growth. For example, encryption is essential for securing online banking transactions, and protecting the confidentiality of sources in journalism.
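The basic idea can be illustrated with a toy one-time-pad cipher in Python. This is purely an illustration of the principle described above (the same secret key encrypts and decrypts, and without it the ciphertext is indistinguishable from random bytes); real systems rely on vetted algorithms such as AES, never on hand-rolled code like this:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"Meet me at noon"
# A one-time key as long as the message, shared only between
# sender and recipient. To an eavesdropper without this key,
# the ciphertext is just random-looking bytes.
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)
recovered = xor_bytes(ciphertext, key)  # only the key holder can reverse it

assert recovered == message
```

XOR-ing twice with the same key bytes cancels out, which is why decryption is simply re-applying the operation with the key.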

Encryption workarounds needed?

The European Commission has come under pressure from some EU Member States to take action to address the perceived problem of data not being available to law enforcement authorities, due to encryption. This issue is frequently hyped as a major problem, and certain politicians have suggested simplistic and counter-productive policies to weaken encryption as a “solution” to it.

There are several techniques that law enforcement authorities use to access encrypted data. One approach is to obtain the key to decrypt the data, for instance through a physical search when the key is saved on a USB drive. The key can also be obtained from the user directly, for example via social engineering or legal obligation. Another approach is to access the decrypted data by bypassing the key, either by exploiting a flaw or weakness in the system or by installing software or spyware. However, the existence of workarounds does not mean that law enforcement should resort to them, nor that doing so would be necessary or proportionate, or even compatible with human rights law.


From a technical point of view, encryption cannot be weakened “just a little” without potentially introducing additional vulnerabilities, even if unintentionally. When there is a vulnerability, anyone can take advantage of it, not just the police or intelligence services. Sooner or later, a secret vulnerability will be exploited by a malicious actor, perhaps the very one it was meant to safeguard us against. Law enforcement aims are legitimate. However, as pointed out by the European Union Agency for Network and Information Security (ENISA), limiting the use of encryption will create vulnerabilities, lower trust in the economy and damage civil society and industry alike.

What should the European Union do?

A more balanced approach is needed, which avoids much of the rhetoric that is often heard in relation to encryption. Such an approach would recognise a variety of options for addressing this issue without compromising everybody’s security or violating human rights.

Saying “no” to backdoors is a step in the right direction, but not the end of the debate, as there are still many ways to weaken encryption. The answer to security problems like those created by terrorism cannot be the creation of security risks. On the contrary, the EU should focus on stimulating the development and the use of high-grade standards for encryption, and not in any way undermine the development, production or use of high-grade encryption.

We are concerned by the potential inclusion of certain aspects of the forthcoming Communication, such as the increase of capabilities of Europol and what this may entail, and references to removal of allegedly “terrorist” content without accountability in line with the Commission’s recent Communication on tackling illegal content online. We remain vigilant regarding the developments in the field of counter-terrorism.

Read more:

Encryption – debunking the myths (03.05.2017)

EDRi delivers paper on encryption workarounds and human rights (20.09.2017)

EDRi position paper on encryption (25.01.2016)

How the internet works (23.01.2012, available in six languages)


16 Oct 2017

Civil society calls for the deletion of the #censorshipmachine


Today, 16 October, European Digital Rights (EDRi), together with 56 other civil society organisations, sent an open letter to EU decision-makers. The letter calls for the deletion of Article 13 of the Copyright Directive proposal, pointing out that the monitoring and filtering of internet content it proposes would breach citizens’ fundamental rights.

The proposals in the Copyright Directive would relegate the European Union from a digital rights defender in global internet policy discussions to the leader in dismantling fundamental rights, to the detriment of internet users around the world,

said Joe McNamee, Executive Director of EDRi.

The censorship filter proposal would apply to all online platforms hosting any type of user-uploaded content such as YouTube, WordPress, Twitter, Facebook, Dropbox, Pinterest or Wikipedia. It would coerce platforms into installing filters that prevent users from uploading copyrighted materials. Such a filter would require the monitoring of all uploads and would be unable to differentiate between copyright infringements and legitimate uses of content authorised by law. It undermines legal certainty for European businesses, as it creates legal chaos and offers censorship filters as a solution.

The letter points out that the censorship filter proposal of Article 13:

  1. would violate the right to freedom of expression set out in the Charter of Fundamental Rights;
  2. would provoke such legal uncertainty that online services would have no other option than to monitor, filter and block EU citizens’ communications; and
  3. would impose obligations on internet companies that would be impossible to respect without excessive restrictions on citizens’ fundamental rights.

Read the letter below.

In September 2016, the European Commission published its proposal for a new Copyright Directive that aims at modernising EU copyright rules. The proposal has received mixed responses so far in the European Parliament and is awaiting a vote in the Legal Affairs Committee of the Parliament.


Article 13 – Monitoring and Filtering of Internet Content is Unacceptable
Open letter

Dear President Juncker,
Dear President Tajani,
Dear President Tusk,
Dear Prime Minister Ratas,
Dear Prime Minister Borissov,
Dear Ministers,
Dear MEP Voss,
Dear MEP Boni,

The undersigned stakeholders represent fundamental rights organisations.

Fundamental rights, justice and the rule of law are intrinsically linked and constitute core values on which the EU is founded. Any attempt to disregard these values undermines the mutual trust between member states required for the EU to function. Any such attempt would also undermine the commitments made by the European Union and national governments to their citizens.

Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens’ fundamental rights.

Article 13 introduces new obligations on internet service providers that share and store user-generated content, such as video or photo-sharing platforms or even creative writing websites, including obligations to filter uploads to their services. Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications if they are to have any chance of staying in business.

Article 13 contradicts existing rules and the case law of the Court of Justice. The Electronic Commerce Directive (2000/31/EC) regulates the liability of internet companies that host content on behalf of their users. According to the existing rules, there is an obligation to remove any content that breaches copyright rules once this has been notified to the provider.

Article 13 would force these companies to actively monitor their users’ content, which contradicts the “no general obligation to monitor” rules in the Electronic Commerce Directive. The requirement to install a system for filtering electronic communications has twice been rejected by the Court of Justice, in the cases Scarlet Extended (C-70/10) and Netlog/Sabam (C-360/10). Therefore, a legislative provision that requires internet companies to install a filtering system would almost certainly be rejected by the Court of Justice because it would contravene the requirement that a fair balance be struck between the right to intellectual property on the one hand, and the freedom to conduct business and the right to freedom of expression, such as to receive or impart information, on the other.

In particular, the requirement to filter content in this way would violate the freedom of expression set out in Article 11 of the Charter of Fundamental Rights. If internet companies are required to apply filtering mechanisms in order to avoid possible liability, they will. This will lead to excessive filtering and deletion of content and limit the freedom to impart information on the one hand, and the freedom to receive information on the other.

If EU legislation conflicts with the Charter of Fundamental Rights, national constitutional courts are likely to be tempted to disapply it and we can expect such a rule to be annulled by the Court of Justice. This is what happened with the Data Retention Directive (2006/24/EC), when EU legislators ignored compatibility problems with the Charter of Fundamental Rights. In 2014, the Court of Justice declared the Data Retention Directive invalid because it violated the Charter.

Taking into consideration these arguments, we ask the relevant policy-makers to delete Article 13.


Civil Liberties Union for Europe (Liberties)
European Digital Rights (EDRi)

Access Info
Article 19
Associação D3 – Defesa dos Direitos Digitais
Associação Nacional para o Software Livre (ANSOL)
Association for Progressive Communications (APC)
Association for Technology and Internet (ApTI)
Association of the Defence of Human Rights in Romania (APADOR)
Associazione Antigone
Bangladesh NGOs Network for Radio and Communication (BNNRC)
Bits of Freedom (BoF)
BlueLink Foundation
Bulgarian Helsinki Committee
Center for Democracy & Technology (CDT)
Centre for Peace Studies
Centrum Cyfrowe
Coalizione Italiana Libertà e Diritti Civili (CILD)
Code for Croatia
Culture Action Europe
Electronic Frontier Foundation (EFF)
Estonian Human Rights Centre
Freedom of the Press Foundation
Frënn vun der Ënn
Helsinki Foundation for Human Rights
Hermes Center for Transparency and Digital Human Rights
Human Rights Monitoring Institute
Human Rights Watch
Human Rights Without Frontiers
Hungarian Civil Liberties Union
Index on Censorship
International Partnership for Human Rights (IPHR)
International Service for Human Rights (ISHR)
JUMEN – Human Rights Work in Germany
Justice & Peace
La Quadrature du Net
Media Development Centre
Miklos Haraszti (Former OSCE Media Representative)
Modern Poland Foundation
Netherlands Helsinki Committee
One World Platform
Open Observatory of Network Interference (OONI)
Open Rights Group (ORG)
Plataforma en Defensa de la Libertad de Información (PDLI)
Reporters without Borders (RSF)
Rights International Spain
South East Europe Media Organisation (SEEMO)
South East European Network for Professionalization of Media (SEENPM)
The Right to Know Coalition of Nova Scotia (RTKNS)

CC: Permanent and Deputy Permanent Representatives of the Members States to the EU
CC: Chairs of the JURI and LIBE Committees in the European Parliament
CC: Shadow Rapporteurs and MEPs in the JURI and LIBE Committees in the European Parliament
CC: Secretariats of the JURI and LIBE Committees in the European Parliament
CC: Secretariat of the Council Working Party on Intellectual Property (Copyright)
CC: Secretariat of the Council Working Party on Competition
CC: Secretariat of the Council Research Working Party

Read more:

Article 13 Open letter – Monitoring and Filtering of Internet Content is Unacceptable (16.10.2017)

Over 50 Human Rights & Media Freedom NGOs ask EU to Delete Censorship Filter & to Stop Copyright Madness (16.10.2017)

Deconstructing the Article 13 of the Copyright proposal of the European Commission, revision 2

The Copyright Reform: a guide for the perplexed

Copyright reform: Document pool

Six states raise concerns about legality of Copyright Directive (05.09.2017)

Proposed Copyright Directive – Commissioner confirms it is illegal (28.06.2017)