A measure that would be illegal if implemented by a government should also be illegal if implemented by industry as a “voluntary” measure, whether as a result of government pressure, for public relations, or for anti-competitive reasons. However, while key international legal instruments, such as the European Charter of Fundamental Rights and the European Convention on Human Rights, as well as national constitutions, are binding on states and governments, they do not apply directly to other entities, such as private companies. As a result, there is a major trend towards governments persuading or coercing companies to impose restrictions on fundamental freedoms under the guise of “self-regulation”, thereby circumventing legal protections.

04 Dec 2017

#BoostYourShield against violations of digital rights!


Companies and governments increasingly abuse online tools to restrict our freedoms: mass surveillance, arbitrary censorship, blocking of access to information, knowledge and culture, and continuous monitoring, tracking and profiling of what we do online.

We are now looking for 200 Digital Defenders to join our fight against these digital rights violations!

By becoming a regular donor, you will help to boost the shield against digital rights violations and allow us to continue defending your rights and freedoms online.

Advocating for digital rights is a long-term commitment and therefore needs long-term support. We are happy about any kind of support we get, but having regular donors allows us to think ahead and to work sustainably. And: regular giving is easy and convenient! You will always have the flexibility to change or cancel your donation – at any time, without any hassle.

European Digital Rights (EDRi) is a not-for-profit association of digital civil rights organisations. Our objectives are to promote, protect and uphold civil rights. We advocate for a world where everyone enjoys privacy, where we are not being spied on and our data is not being sold or abused. A world where we can share our thoughts freely, without fear.

30 Nov 2017

EDRi celebrates its 15th anniversary – save the date

By Kirsten Fiedler

European Digital Rights (EDRi) will be celebrating 15 years of fighting for fundamental rights online. We wish to mark the occasion with a celebration of the enduring passion and energy of the digital rights movement in Europe!

The anniversary will take place on 12 April 2018 in Brussels, before EDRi’s annual General Assembly. We look forward to welcoming our network of supporters, EDRi’s members and friends, who will help us celebrate this milestone and reinforce our passion to promote human rights online.

When: 12 April 2018
Where: Brussels (exact venue tbc)

EDRi’s founders had the vision to see the need for international cooperation to promote and defend the pillars of our democracy in the digital environment. Our anniversary is a celebration of our successes, but also a call to action for the next 15 years,

said Joe McNamee, Executive Director of European Digital Rights (EDRi).

The anniversary celebration will include many surprises. Stay tuned for new announcements in the coming months!

30 Nov 2017

Time to stop the #CensorshipMachine: NOW!


Following the launch of the controversial proposed Copyright Directive in September 2016, the European Parliament and the Member States (gathered in the Council of the European Union) are now developing their positions.

Now is the time to send a clear message to the European Parliament and national governments to oppose the “censorship machine”!

What happened in the EU Parliament?

The European Parliament Committee in charge of the file, the Committee on Legal Affairs (JURI), has been relatively quiet about its Report. Some compromises have been drafted on Articles that do not concern the so-called “censorship machine”, the upload filter in Article 13 of the Copyright Directive proposal. Meanwhile, the Committee on Civil Liberties, Justice and Home Affairs (LIBE) voted in its Opinion on 20 November 2017 to remove the mention of upload filters, as did the Committee on Internal Market and Consumer Protection (IMCO). So far, those two Committees have come out against upload filters, while two others, Industry, Research and Energy (ITRE) and Culture and Education (CULT), have not. The next step in the parliamentary process is the vote in JURI.

What about the EU Council?

The Troika of copyright extremist Member States (Spain, France and Portugal) have given Estonia (who holds the Presidency of the Council until January 2018, when Bulgaria will take on the role) new “ideas” on how to make the Copyright Directive even worse.

What can you do now?

To make sure we won’t end up with a censorship machine that limits our freedom of expression, we need to remind our MEPs and national governments how important it is to oppose upload filters!

Should every upload to the internet in Europe be under surveillance?
Should big online companies like Google and Facebook get a huge competitive advantage?
…or should we defend our freedom of expression and fundamental rights, and stop the #CensorshipMachine?

The time to speak up is now!

In just two weeks, Members of the European Parliament will leave for their holidays – and even though the vote on the proposal is scheduled for January 2018, crucial parts of the negotiations will take place before the holidays. That’s why you need to make your voice heard now!

There are two easy actions you can take before 17 December:

  1. Contact your representatives in the European Parliament JURI Committee!
    You can find your MEP in the list below, including their Twitter handle and further contact details when you click on their name. Ideally, call or email them. If you prefer calling from your computer, you can use EDRi member Bits of Freedom’s free calling tool. You can also tweet at your MEP using the hashtags #CensorshipMachine and #FixCopyright! For example, you could tweet along the lines of:

    .@MEP_twitter_handle, please oppose the #CensorshipMachine in the #copyright Directive proposal! #FixCopyright

  2. Contact the relevant ministry in your country dealing with the file!
    You can find some contact details here. (If you have more, please feel free to add them to the spreadsheet!) To strengthen your message, team up with other human rights, consumer or internet users’ associations and draft a common response!

Members of the Committee on Legal Affairs (JURI)

The European Parliament works in different specialised Committees. The Committee on Legal Affairs (JURI) consists of one chair (Pavel Svoboda), four vice-chairs (Jean-Marie Cavada, Laura Ferrara, Mady Delvaux and Lidia Joanna Geringer de Oedenberg), 20 members and 23 substitutes*. This Committee is in charge of, among other files, the discussions on copyright.


Each entry below lists the MEP’s name, EU political group and national party; an asterisk (*) marks a substitute member.

Austria
Evelyn Regner: S&D, Sozialdemokratische Partei Österreichs (SPÖ)

Bulgaria
Emil Radev: EPP, Граждани за европейско развитие на България (ГЕРБ)
Angel Dzhambazki*

Czech Republic
Pavel Svoboda (Chair): EPP, Křesťanská a demokratická unie – Československá strana lidová (KDU–ČSL)
Jiří Maštálka: GUE-NGL, Komunistická strana Čech a Moravy (KSČM)

Denmark
Jens Rohde*: ALDE, Det Radikale Venstre

France
Jean-Marie Cavada (Vice-Chair): ALDE, Génération Citoyens (GC)
Joëlle Bergeron: EFDD, Sans étiquette
Marie-Christine Boutonnet: ENF, Front national (FN)
Gilles Lebreton: ENF, Front national (FN)
Pascal Durand*: Greens/EFA, Europe Écologie
Constance Le Grip*: EPP, Les Républicains (LR)
Virginie Rozière*: S&D, Parti radical de gauche (PRG)

Finland
Heidi Hautala*: Greens/EFA, Vihreä liitto

Germany
Sylvia-Yvonne Kaufmann: S&D, Sozialdemokratische Partei Deutschlands (SPD)
Julia Reda: Greens/EFA, Piratenpartei Deutschland (PIRATEN)
Axel Voss: EPP, Christlich Demokratische Union Deutschlands (CDU)
Evelyne Gebhardt*: S&D, Sozialdemokratische Partei Deutschlands (SPD)
Angelika Niebler*: EPP, Christlich-Soziale Union in Bayern e.V. (CSU)
Rainer Wieland*: EPP, Christlich Demokratische Union Deutschlands (CDU)
Tiemo Wölken*: S&D, Sozialdemokratische Partei Deutschlands (SPD)

Greece
Kostas Chrysogonos: GUE-NGL, Synaspismós Rizospastikís Aristerás (Syriza)

Hungary
József Szájer: EPP, Fidesz-Magyar Polgári Szövetség-Keresztény Demokrata Néppárt (Fidesz)

Ireland
Brian Crowley: ECR, Fianna Fáil Party

Italy
Laura Ferrara (Vice-Chair): EFDD, Movimento 5 Stelle (M5S)
Enrico Gasbarra: S&D, Partito Democratico (PD)
Isabella Adinolfi*: EFDD, Movimento 5 Stelle (M5S)
Mario Borghezio*: ENF, Lega Nord (LN)
Sergio Gaetano Cofferati*
Stefano Maullu*: EPP, Forza Italia (FI)

Lithuania
Antanas Guoga*: EPP, Independent
Viktor Uspaskich*: ALDE, Darbo partija (DP)

Luxembourg
Mady Delvaux (Vice-Chair): S&D, Parti ouvrier socialiste luxembourgeois (POSL)

Malta
Francis Zammit Dimech: EPP, Partit Nazzjonalista (PN)

Poland
Lidia Joanna Geringer de Oedenberg (Vice-Chair): S&D, Bezpartyjna
Tadeusz Zwiefka: EPP, Platforma Obywatelska (PO)
Kosma Złotowski*: ECR, Prawo i Sprawiedliwość (PiS)
Stanisław Żółtek*: ENF, Kongres Nowej Prawicy (KNP)

Portugal
António Marinho e Pinto: ALDE, Partido Democrático Republicano (PDR)

Romania
Daniel Buda*: EPP, Partidul Naţional Liberal (PNL)
Răzvan Popa*: S&D, Partidul Social Democrat (PSD)

Spain
Rosa Estaràs Ferragut: EPP, Partido Popular (PP)
Luis de Grandes Pascual*: EPP, Partido Popular (PP)

Sweden
Max Andersson: Greens/EFA, Miljöpartiet de gröna (MP)
Jytte Guteland*: S&D, Arbetarepartiet-Socialdemokraterna

United Kingdom
Mary Honeyball: S&D, Labour Party
Sajjad Karim: ECR, Conservative Party
Jane Collins*: EFDD, UK Independence Party (UKIP)



29 Nov 2017

Dutch mass surveillance law receives two BBA nominations

By Bits of Freedom

Until 9 November 2017 people in the Netherlands could nominate individuals, organisations and companies for a Big Brother Award. The three most “popular” nominees are now in the running to become the biggest privacy offender of the year. Two of the three nominees, Christian Democratic Appeal (CDA) parliamentary party leader Sybrand Buma and the Cabinet, owe their nominations to the 2017 dragnet surveillance law.

Support our work - make a recurrent donation!

The dragnet surveillance law just can’t catch a break. In 2015, the then Minister of the Interior Ronald Plasterk received an Award for “the most far-reaching surveillance law for the secret services the Netherlands has ever seen” and for being “deaf to criticism”. The new Intelligence and Security Services Act was passed earlier this year, but citizens continue to protest. This summer, students collected over 350 000 signatures to ensure that a referendum would be organised on the subject. Thanks to them, every Dutch citizen will be able to cast their vote in a (non-binding) referendum on the controversial law on 21 March 2018.

Citizens are also using the Big Brother Awards to draw attention to the law. People feel the Cabinet deserves the award for “pushing through the dragnet law”, which “shows a complete disregard for the privacy of millions of innocent citizens”. Sybrand Buma owes his nomination to the fact that he refuses to listen to the people: in an interview he proclaimed he would ignore the outcome of the referendum. The law is going to come into effect, he said, no matter what.

The Big Brother Awards are based on a concept created by EDRi member Privacy International. The goal is to draw attention to violations of privacy. The Dutch Big Brother Awards are organised by EDRi member Bits of Freedom and will take place on 11 December at the city theatre in Amsterdam. The third nominee chosen by the public is the Dutch Association of Mental Health and Addiction Care. The Tax and Customs Administration, data dealer Focum, and Translink, the company behind the public transport chip card, are in the running for the other prize, the “Expert Award”.

Dutch Big Brother Awards

Dutch Senate votes in favour of dragnet surveillance powers (26.07.2017)

Minister of Interior and National Police Force win Dutch BBA 2015 (04.11.2015)

The Dutch continue to fight new mass surveillance law (15.11.2017)

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands)



29 Nov 2017

EU Member States plan to ignore EU Court data retention rulings

By IT-Pol

Documents made publicly available through EDRi member Statewatch reveal that EU Member States are exploring all possible options to keep, and in fact expand, their current data retention regimes. The general plan is based on a new concept of “restricted data retention”, which is really blanket data retention with a new name, along with amendments to the draft e-Privacy Regulation to facilitate blanket data retention. Member States are considering whether these new elements should be introduced through an EU instrument or through national law in each Member State.

On 15 September 2017, the EU Counter-Terrorism Coordinator (EU CTC) submitted a new data retention proposal to Member States. The proposal was discussed at a meeting of the Working Party on Information Exchange and Data Protection (DAPIX) Friends of the Presidency (FoP) on 18 September 2017. A partial report of the discussions at the DAPIX FoP meeting can be found in Council document 13845/17.


The judgement of 21 December 2016 by the Court of Justice of the European Union (CJEU) in the Tele2 case (joined cases C-203/15 and C-698/15) concerned the national data retention laws that are still in place after the annulment of the Data Retention Directive in 2014. The EU CTC notes that data retention cannot be “general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication”, since this would violate paragraph 134 of the Tele2 judgement. In the Tele2 judgement (paragraphs 108-111), the CJEU outlines a targeted data retention regime which does not include every subscriber.

The EU CTC, considering input received from Member States, makes it clear that he is not at all interested in targeted data retention. Instead, the EU CTC proposes the concept of “restricted data retention” on the basis that it is necessary to fight terrorism and serious crime, including cyber attacks. This measure has to be limited to what is strictly necessary and be based on objective evidence. However, according to the EU CTC, the measure can cover the entire population, even though this is quite obviously blanket data retention.

The justification for this is claimed to be paragraph 106 of Tele2, which states that data retention must be restricted to (i) a particular time period and/or geographical zone and/or a group of persons likely to be involved, in one way or another, in a serious crime, or (ii) persons who could, for other reasons, contribute, through their data being retained, to fighting crime. In essence, the EU CTC argues that the entire population, perhaps with an opt-out for persons bound by a legal obligation of professional secrecy (such as lawyers, journalists and doctors), could fall under the second category, “persons who could, for other reasons, contribute, through their data being retained, to fighting crime”.

While deliberately covering the entire population, the EU CTC emphasises that other aspects of the data retention measure must be limited to what is absolutely necessary. What this means is not clear from the proposal, but it could include some differentiation with respect to categories of data and service providers. Minor operators, such as WiFi access points at pizza restaurants, could be excluded, since that data “may potentially not be indispensable for retention”, as the EU CTC carefully notes. As far as purpose limitation is concerned, there is nothing novel about the reinvented restricted data retention: the annulled Data Retention Directive also limited data retention to the purpose of investigation, detection and prosecution of serious crime.

The critical aspect of restricted data retention is obviously that the entire population is covered. The EU CTC argues that this can meet the necessity test. However, the CJEU has ruled twice that a data retention measure which covers all subscribers exceeds the limits of what is strictly necessary. Referring to the entire population as “persons who could, for other reasons, contribute, through their data being retained, to fighting crime” clearly fails to satisfy the requirement of objective criteria that establish a connection between the personal data to be retained and the objective pursued. The CJEU has referred to this principle several times, most recently in paragraph 191 of opinion 1/15 on the EU-Canada PNR agreement. Moreover, paragraph 110 of the Tele2 judgment specifically says that “conditions must be shown to be such as actually to circumscribe, in practice, the extent of that measure and, thus, the public affected.”

The DAPIX FoP meeting report mentions that, while the CJEU rules out general data retention, it “does not solely permits” (sic) targeted data retention (which appears to mean that data retention that is not forbidden by the ruling may be permitted). Therefore, there are other legally possible regimes for non-general data retention. This is undoubtedly true, but largely irrelevant. Since the proposed unrestricted yet “restricted” data retention covers the entire population, it cannot possibly be classified as non-general data retention. The DAPIX FoP report refers to the proposed concept as “restricted data retention and targeted access”, but the Tele2 judgment makes it very clear that safeguards and limitations at the access stage are not sufficient and cannot justify blanket (general) data retention.

The proposal from the EU CTC contains some general comments about the data categories (communication services) to be retained. It is claimed that approaches in some Member States show that a number of data categories are indeed not necessary (and, by implication, illegal).

The new focus on cyber attacks, where data retention is claimed to be key for attribution and investigation, could easily lead to more retention of internet traffic data in particular – perhaps even internet connection records as in the UK Investigatory Powers Act (information about every internet packet, including all destination IP addresses). Moreover, Europol has recently complained about the unavailability of data from internet service providers that use Carrier Grade Network Address Translation (CG-NAT), since a large number of subscribers may share the same IP address. Data retention requirements to address the technical limitations caused by CG-NAT would, in most cases, substantially increase the amount of data collected.

The DAPIX FoP report describes a matrix with categories of data to be retained, for example content data, traffic data, location data, and subscriber data. Except for content data (where generalised data retention would, incidentally, not respect the essence of the fundamental rights), this is simply the list of data categories in the annulled Data Retention Directive and the current data retention laws in Member States. In summary, the proposal of the EU CTC could easily lead to more data being retained per subscriber, despite the claim that a “peeling off” approach is taken to limit the data categories.
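To see why CG-NAT complicates attribution: one public IP address is shared by many subscribers at once, so identifying a subscriber requires the source port and an exact timestamp in addition to the IP address, plus the operator's translation logs. A minimal sketch of that lookup (the log format, names and data are invented for illustration, not taken from any operator or from the Council documents):

```python
from datetime import datetime

# Hypothetical CG-NAT translation log (invented data): one public IP is shared
# by many subscribers, each assigned a source-port range for a time window.
nat_log = [
    ("203.0.113.7", range(1024, 16384), "subscriber-A",
     datetime(2017, 11, 1, 10, 0), datetime(2017, 11, 1, 11, 0)),
    ("203.0.113.7", range(16384, 31744), "subscriber-B",
     datetime(2017, 11, 1, 10, 0), datetime(2017, 11, 1, 11, 0)),
]

def attribute(public_ip, source_port, timestamp):
    """Attribution needs IP, source port AND timestamp; the IP alone is ambiguous."""
    for ip, ports, subscriber, start, end in nat_log:
        if ip == public_ip and source_port in ports and start <= timestamp <= end:
            return subscriber
    return None  # without a logged port and timestamp, attribution is impossible
```

Making such lookups possible at scale would require operators to log every single translation, which is why CG-NAT-driven retention requirements would substantially increase the data collected.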

Data retained for business purposes, such as billing data, will be complementary to the data covered by the mandatory data retention regime. The EU CTC foresees that the new mandatory data retention regime will also cover over-the-top (OTT) service providers like Google and Facebook, and it is noted in the proposal that OTT operators collect much more data for business purposes than traditional telecommunications operators. In this connection, the EU CTC fails to mention (or, possibly, understand) that the proposed e-Privacy Regulation seeks to create a level playing field by subjecting all electronic communications service providers, whether OTT or telecommunications providers, to the same privacy rules.

The proposal from the EU CTC respects the strict access conditions set out in the second part of the Tele2 ruling. Access to retained data must be solely for the purpose of fighting terrorism and serious crime and must be subject to a prior court review. With the exception of terrorism cases, access can only be granted to data of individuals suspected of involvement in serious crime (Tele2 paragraph 119). The EU CTC also mentions pseudonymisation and encryption, and that this could facilitate searches of the retained encrypted data with decryption only on the basis of a warrant. The purpose of this is not entirely clear, since the retained data, as the general rule, can only be accessed with a prior court review for a specific person. It could perhaps mean that searches of encrypted or pseudonymised data are not intended to count as access to the retained data, and that such searches can be used to find persons of interest who can then, under certain substantive conditions, be depseudonymised subject to a court review. If data on specific persons could only be accessed after a prior court review, there would not really be a need for encrypted searches. Encryption is, of course, a useful security measure for the stored data, but that is an entirely different issue.
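The pseudonymisation idea hinted at above can be sketched as follows: subscriber identifiers are replaced with keyed pseudonyms at storage time, searches run against the pseudonyms, and re-identification requires separately held information released only after court review. This is purely an illustrative sketch of that reading (the key, names and records are invented, not from the proposal):

```python
import hashlib
import hmac

# Illustrative only: the key and record structure are invented for this sketch.
PSEUDONYM_KEY = b"operator-side secret key"

def pseudonymise(subscriber_id: str) -> str:
    """Keyed hash: a stable pseudonym per subscriber, meaningless without the key."""
    return hmac.new(PSEUDONYM_KEY, subscriber_id.encode(), hashlib.sha256).hexdigest()

# Retained metadata is stored under pseudonyms rather than real identifiers.
retained = {
    pseudonymise("alice"): ["2017-11-01 call record"],
    pseudonymise("bob"): ["2017-11-02 browsing record"],
}

# A search for a known subscriber touches only pseudonyms...
hits = retained.get(pseudonymise("alice"), [])

# ...while mapping an unknown pseudonym back to a person (depseudonymisation)
# would require a separately held key or table, released only on a court order.
```

Note that this only changes where the privacy risk sits: the data on everyone is still retained, which is exactly the point the article makes about access safeguards not curing blanket retention.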

In the final part of the proposal, the EU CTC considers the role of the draft e-Privacy Regulation in relation to restricted data retention. The EU CTC notes that the Tele2 judgment is stricter than the annulment of the Data Retention Directive, since Article 15(1) of the e-Privacy Directive makes data retention an exception to the main rule of erasure once the communication is completed. The EU CTC hypothesises that the draft e-Privacy Regulation could be amended to make blanket data retention easier: Article 7 of the draft Regulation could be changed to allow storage of communications data where this is legally required to assist governments in fighting serious crime and terrorism. However, a provision of this type would still be a restriction on the fundamental rights to privacy and data protection of subscribers, and the restriction would have to satisfy the conditions of Article 52(1) of the Charter of Fundamental Rights. This would not necessarily be different from the current situation under Article 15(1) of the e-Privacy Directive or Article 11 of the draft e-Privacy Regulation.

Working document on contributions to the discussion on data retention, EU Counter-Terrorism Coordinator, WK 9699/2017 INIT, LIMITE (15.09.2017)

Retention of communication data for the purpose of prevention and prosecution of crime, Council document 13845/17, LIMITE (30.10.2017)

Carrier-Grade Network Address Translation (CGN) and the Going Dark Problem, Council document 5127/17, LIMITE (16.01.2017)

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)



29 Nov 2017

Eurojust: No progress on complying with CJEU data retention judgements

By IT-Pol

A recently published Eurojust report on data retention in Europe confirms that EU Member States failed to make meaningful progress towards complying with fundamental rights standards, as clarified by the two Court of Justice of the European Union (CJEU) rulings banning blanket data retention.


The CJEU has delivered two rulings on mandatory data retention of traffic and location data (metadata) for electronic communications services. In the Digital Rights Ireland judgement of 8 April 2014 (joined cases C-293/12 and C-594/12), the Data Retention Directive 2006/24/EU was declared invalid. This was followed by the Tele2 judgement of 21 December 2016 (joined cases C-203/15 and C-698/15), where the CJEU ruled that Article 15(1) of the e-Privacy Directive, read in the light of the Charter of Fundamental Rights of the European Union, precludes national laws which require general and indiscriminate retention of metadata (blanket data retention). Only targeted data retention is allowed under EU law.

A month after the Tele2 ruling, the Council Legal Service sent an analysis of the judgement to Member States, in which it concluded that “a general and indiscriminate retention obligation for crime prevention and other security reasons would no more be possible at national level than it is at EU level, since it would violate just as much the fundamental requirements as demonstrated by the Court’s insistence in two judgements delivered in Grand Chamber”. This was a clear message to Member States, who had hitherto claimed that the annulment of the Data Retention Directive in April 2014 did not affect their national data retention laws. When the analysis of the Legal Service was released to the public on 27 March 2017 (Council document 5884/17), the paragraph containing this critical sentence was redacted.

Despite the clear judgement in the Tele2 case, blanket data retention laws are still in place in most Member States. EDRi member Privacy International surveyed 21 national data retention laws and examined their compliance with fundamental rights standards. None of the 21 laws are currently in compliance with these standards, as interpreted by the CJEU judgements in Digital Rights Ireland and Tele2.

This conclusion is confirmed by a recent Eurojust report, “Data retention regimes in Europe in light of the CJEU ruling of 21 December 2016 in Joined Cases C-203/15 and C-698/15” (Council document 10098/17, LIMITE), which was made publicly available by EDRi member Statewatch on 20 November 2017. The Eurojust report covers 25 EU Member States (as well as Norway and Switzerland), and is based on a detailed questionnaire sent to members of the European Judicial Cybercrime Network (EJCN) in March 2017.

According to the survey, five Member States (Austria, the Netherlands, Romania, Slovenia and Slovakia) do not currently have mandatory data retention, as their previous laws were invalidated by constitutional or high courts in accordance with the CJEU judgement on the Data Retention Directive. For the remaining Member States that responded to the survey, the Eurojust report concludes that “none of the countries have national legislation that obliges the targeted retention of data linked to specific persons or geographical locations”. In other words, their national data retention laws cover all subscribers, which is illegal under EU law.

Some respondents indicated that “they considered that their data retention regime is targeted by virtue of the limitations set with regard to retention periods and/or reason for the data retention”. However, this notion of “targeted” is rejected by the Eurojust report, as it is clearly not in line with the standards of the Tele2 judgement.

For access to the retained data, the majority of respondents state that a judicial review is required before access is granted. The replies also state that access is granted depending on the seriousness of the crime being investigated. The Eurojust questionnaire does not ask the respondents whether access to the retained data, as a general rule, can only be granted to “data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime”. This is a requirement in the Tele2 judgement (paragraph 119), except in terrorism cases.

Respondents of the Eurojust survey were also asked about the impact of the CJEU judgement in relation to the admissibility of evidence in court. Five countries reported on court rulings where the admissibility of evidence from data retention was evaluated by the court. So far the evidence has been deemed admissible by courts, although one of the five cases (in Ireland) is still pending on appeal. This part of the Eurojust report shows a clear concern that evidence obtained from illegal data retention could one day be ruled inadmissible by courts.

The legal uncertainty regarding the admissibility of evidence obtained from data retention is by no means surprising. Unless Member States quickly amend their data retention laws to bring them into compliance with the CJEU standards, it is reasonable to expect that there will be more challenges to the admissibility of the evidence. Even if national courts generally allow illegally obtained evidence in specific cases, the courts may eventually rule differently when prosecutors consistently submit evidence that is only available because of illegal data retention laws. The fundamental right to a fair trial may certainly be questioned if the state systematically relies on evidence that is obtained in violation of established human rights standards.

Finally, the Eurojust survey asks about initiatives at the national level to change the data retention legislation. In ten Member States, a review or assessment of the legislation is ongoing, and three Member States are in the process of drafting amendments. The Eurojust report also outlines the substantive legal changes being planned or considered by Member States. Most of these seem concerned with access to the retained data, such as limiting access to serious crime only. This would address a narrow reading of the 2014 Digital Rights Ireland ruling, whereby blanket data retention may be understood as theoretically possible if sufficient safeguards for access are put in place. With the 2016 Tele2 ruling that interpretation is clearly rejected by the CJEU. Only one Member State (Austria) specifically mentions the introduction of targeted data retention and quick freeze.

Informal remarks of the respondents show a clear preference for blanket data retention with arguments that it is impossible to determine in advance the individuals who will commit crimes and thus the data that needs to be retained. There are also claims that storing data indiscriminately for all citizens is more acceptable since the alternative, targeting specific persons or particular geographical locations, could result in criminal investigations that are considered discriminatory. Some respondents also indicated that the necessary balance is already guaranteed by the limitations placed on access to the retained data.

The last argument is particularly odd since the CJEU has clearly ruled in Tele2 that restrictions on access to the retained data are not sufficient. The retention of data must also meet objective criteria that establish a connection between the data to be retained and the objective pursued. In particular, such conditions must ensure that data is not retained on everyone (Tele2 paragraph 110). However, this does not mean that “the individuals who will commit crimes must be determined in advance”. The CJEU rulings in Digital Rights Ireland and Tele2 only require objective evidence to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences (Tele2 paragraph 111).

There is also the possibility of retaining data on a specific person or group of persons at an early stage of an investigation, based on evidence or intelligence that does not yet meet the substantive requirements for access to metadata. If the police gather further evidence to substantiate the suspicion and can make a reasoned request for access, retained metadata from the suspect’s past becomes available to them. However, it will not be possible to “look into the past” of every citizen, since that would require retaining data on everyone. The CJEU has twice ruled that this practice of mass surveillance is illegal.

Eurojust Report: Data retention regimes in Europe in light of the CJEU ruling of 21 December 2016 in Joined Cases C-203/15 and C-698/15

Information note from the Council Legal Service on the judgement of the Court in joined cases C-203/15 and C-698/15, Council document 5884/17, unredacted version (01.02.2017)

National Data Retention Laws since the CJEU’s Tele-2/Watson judgement, Privacy International (06.09.2017)

Data retention regimes in Europe in light of the CJEU ruling of 21 December 2016 in Joined Cases C-203/15 and C-698/15, Eurojust, Council document 10098/17

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)



29 Nov 2017

Italy extends data retention to six years

By Hermes Center

On 8 November 2017, the Italian Parliament approved a Regulation on data retention that obliges telecommunications operators to retain telephone and internet data for up to six years.

----------------------------------------------------------------- Support our work with a one-off-donation! -----------------------------------------------------------------

The Italian Coalition for Civil Liberties and Rights (CILD) and EDRi observer member Hermes Center for Transparency and Digital Human Rights published a statement criticising the lack of scrutiny and meaningful debate on the Regulation prior to its approval. They also stated that the measure is detrimental to citizens’ privacy and could have extremely serious consequences for all of us. The two organisations have been voicing concerns since July 2017, when the provision was inserted into a law transposing EU Directive 2014/33/EU on the “safety of lifts”.

In particular, the Regulation is in unequivocal breach of the case law of the Court of Justice of the European Union and creates a clear conflict with current Italian privacy regulations, as pointed out by the President of the Italian Data Protection Authority, Antonello Soro, in October 2017.

Also, on 13 November, the European Data Protection Supervisor Giovanni Buttarelli commented that the newly approved Italian Regulation definitively fails to respect the European approach to data retention.

It seems inevitable that the law will be challenged in court.

Our phone and web data will be stored for 6 years: what about our rights? (12.11.2017)

Court of Justice of the European Union: The Members States may not impose a general obligation to retain data on providers of electronic communications services (21.12.2016)

European Data Protection Supervisor: “EU is the leader in data protection” (only in Italian, 13.11.2017)

‘6 years data retention, Court of Justice of EU may cancel it’. Interview with Prof. Filippo Benelli on likely CJEU action (only in Italian, 09.11.2017)

Metadata of phone and internet traffic: must be stored for 6 years (only in Italian, 08.11.2017)

(Contribution by Antonella Napolitano, the Italian Coalition for Civil Liberties and Rights CILD, and Fabio Pietrosanti, Hermes Center for Transparency and Digital Human Rights, Italy)



29 Nov 2017

Is anti-plagiarism software legal under EU Copyright legislation?

By Modern Poland Foundation

Are anti-plagiarism technologies compatible with copyright law? Surprisingly, this might not be the case.

Anti-plagiarism technology involves machine comparison of works such as diploma theses with pre-existing publications. This activity constitutes a use that is covered by copyright. Since no explicit limitation or exception of authors’ and publishers’ exclusive rights authorises providers and users of such technologies to use works in this way, their situation is legally uncertain, to say the least.
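The kind of machine comparison involved can be illustrated with a minimal sketch. Word n-gram “shingling” combined with Jaccard similarity is one common approach to detecting overlapping passages; the function names below are our own invention for illustration, not the API of any actual anti-plagiarism system. Note that even this toy version has to load and reproduce both texts in memory, which is precisely the act of reproduction the copyright analysis above is concerned with.

```python
def shingles(text, n=5):
    """Split a text into the set of its overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=5):
    """Fraction of shingles shared between two texts, from 0.0 to 1.0."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    # Jaccard index: size of intersection over size of union.
    return len(sa & sb) / len(sa | sb)

# Comparing a submitted thesis fragment against a pre-existing publication:
thesis = "the quick brown fox jumps over the lazy dog near the river"
source = "a quick brown fox jumps over the lazy dog by the stream"
score = jaccard_similarity(thesis, source, n=3)  # high score suggests overlap
```

Real systems work on whole corpora and use indexing tricks (hashing, inverted indexes) to avoid comparing every pair of documents, but the core operation remains the same: copying each protected work into the system in order to compare it.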


The legal framework is set out in the 2001 Copyright in the Information Society (InfoSoc) Directive. It obliges EU Member States to provide authors with an exclusive reproduction right over their works, whether direct or indirect, temporary or permanent, by any means and in any form, in whole or in part. There is only one mandatory exception to this monopoly, covering temporary acts of reproduction in networks, but even that exception has to comply with strict requirements: for example, it has to be transient or incidental, and it cannot have any independent economic significance. Needless to say, the reproduction performed in the course of anti-plagiarism machine analysis does not fall under this exception.

There are several other (non-mandatory) exceptions to the reproduction right provided for in the Directive, but none of them even remotely covers this use either. Interestingly, the current proposal for the Directive on Copyright in the Digital Single Market introduces a new exception for text and data mining, but only for research organisations and only for the purpose of scientific research. So we cannot expect the situation to change in the near future: anti-plagiarism machine analyses require, and will continue to require, authorisation from the respective copyright owners. It will remain possible, as before, to buy such services from providers in countries with more functional copyright legislation.

Although Member States are not allowed to introduce exceptions beyond those listed in EU law, they do address anti-plagiarism in their legislation and practice. For example, in January 2016, an amendment to the Polish Act on Higher Education was published, introducing a national repository of diploma theses and a uniform anti-plagiarism system. Both are administered by the Minister of Science and Higher Education, and the law obliges universities to deposit theses in the repository and to verify them using the system. These obligations are not accompanied by a statutory or any other authorisation in the area of copyright; in fact, the law is silent on this issue.

However, this issue will have to be solved one way or another when the system is put into operation; at the time of writing, it is still under construction. Along with the published implementation specification, test files containing snippets of actual research papers were made available, and there is also a requirement that the system search and analyse the web. It follows that at least these types of copyrighted works will be used in the system. Obviously, in order to be effective, it should also include published scientific articles and books. Such scanning, where it relates to press publications published in the past 20 years, would be explicitly prohibited by the European Commission’s proposal on ancillary copyright in the draft Copyright Directive.

But even if the system were to compare students’ theses only with other theses, the authors’ authorisation would still be necessary. Since anti-plagiarism verification was already performed by many universities before the introduction of the uniform system, there is an existing practice of obtaining students’ consent for anti-plagiarism analyses; many universities simply require such consent as a condition for taking the master’s exam. Whether such mandatory consent is compliant with copyright law is also an open question, and forcing students to license their works may trigger consumer-protection regulations as well. So even this practice is far from fully compliant with the law, and it cannot guarantee that universities can fulfil their obligation to deposit theses in the central repository and clear the rights needed for anti-plagiarism analysis. It certainly cannot be done without remunerating authors for such a use, should they so require.

It follows that there is little chance of the system being organised properly without intervention at the legislative level. However, this cannot happen in the near future, as it would boil down to introducing a text and data mining exception that is not foreseen in the EU’s extremely restrictive framework. Until such an exception is provided for in the EU legislation, authors’ and publishers’ legally uncertain “consent” will have to be individually contracted for by the universities or the Ministry.

Copyright reform: Document pool

Public procurement competition for uniform anti-plagiarism system (only in Polish)

(Contribution by Krzysztof Siewicz, EDRi member Modern Poland Foundation, Poland)



29 Nov 2017

e-Privacy: What happened and what happens next

By Anne-Morgane Devriendt

With the vote on the negotiating mandate in the Plenary session of 26 October 2017, the European Parliament confirmed its strong position on e-Privacy for the forthcoming inter-institutional negotiations, also called trilogues.

The e-Privacy Regulation aims at reforming the existing e-Privacy Directive to complement the General Data Protection Regulation (GDPR) regarding communication data and metadata, as well as device security. In order for the text to efficiently protect European citizens’ privacy, some key issues needed to be addressed in the Commission’s proposal.


In October 2017, we encouraged citizens to contact Members of the European Parliament (MEPs) to make sure the e-Privacy proposal would not be watered down. We (very exceptionally) asked them to support granting the mandate to continue negotiations on the proposal text in the trilogues. Here is the outcome of our campaign:

Protection of communications in transit and at rest (Art. 5)

Communications data is always sensitive. There is no point, for instance, in protecting your email while it is being sent if the company hosting it can read it once it arrives in your inbox, for example to target you with advertising. Therefore, EDRi supports the protection of communications data both in transit and at rest. Article 5 in the European Parliament (EP) version of the e-Privacy Regulation proposal protects against “any interference with electronic communications”, including “data related to or processed by terminal equipment”. This is an important step in the right direction.

Consent as the only legal basis for processing (Art. 6)

Informed and freely given consent should be the sole legal basis for any non-necessary processing of such data. Given the intricate way online tracking works, only users who are fully informed, and free to make the choice, can meaningfully allow such processing by consenting to it when it is in their interest.

Privacy and devices protected by design and by default (Art. 10)

As with any other device that may create risks for the user, safety and security need to be part of the design and not an afterthought. This is why we need privacy by design and by default. Article 10 of the proposal states that all software allowing electronic communication should, “by default, have privacy protective settings activated to prevent other parties from transmitting to or storing information on the terminal equipment of a user and from processing information already stored on or collected from that equipment”.

The security of devices is also covered by Article 8, which restricts the use of end-users’ terminal equipment to what is strictly necessary, subject to consent.

Restrictions of users’ rights (Art. 11)

Article 11 allows restrictions of users’ rights on the grounds of vague general public interests such as national security, defence and public security, but the EP has done a better job of being specific in the three sub-articles. Furthermore, Article 11 also requires Member States to document their requests for access to communications.

Protection of encryption (Art. 17)

In order to protect citizens’ privacy and the safety of their electronic communications, it is fundamental to ban any attempt to undermine encryption. Article 17, on security risks, states that Member States cannot weaken encryption, for example by forcing companies to include “back-doors” in their products.

The European Parliament has done a good job with its improvements to the text. Thanks to the strong position of the Committee on Civil Liberties, Justice and Home Affairs (LIBE) and citizens’ mobilisation, the European Parliament voted for a strong text that will protect citizens’ privacy and communications. However, the fight is not over yet: the Commission, the Council and the Parliament have yet to reach an agreement during the obscure process called trilogues. The final text will be passed in the Plenary of the European Parliament in 2018, tentatively after the summer.

Tell the European Parliament to stand up for e-Privacy! (25.10.2017)

Report on the proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC (Regulation on Privacy and Electronic Communications) (23.10.2017)

EDRi’s position on the proposal of an e-Privacy regulation (09.03.2017)

Trilogues: the system that undermines EU democracy and transparency (20.04.2017)

(Contribution by Anne-Morgane Devriendt, EDRi intern)



22 Nov 2017

Estonian eID article – additional information

By Joe McNamee

Our article on the problems with the Estonian eID card attracted some criticism and non-specific allegations of inaccuracies. We recognise the sensitivities of the Estonian authorities on this issue, but stand behind the article.

For the sake of completeness and to allow our analysis to be verified, here is the timescale that we describe in the article:

30 August – Estonian Information System Authority (RIA) was informed about the vulnerability by a third party

5 September – RIA issued a statement about “a possible vulnerability” but that “the given security risk has not been realized” and that this risk was “not enough to cancel the cards”

28 September – RIA made a logically questionable distinction between advice to those who use their cards regularly – that they should renew their card – and advice to others, whose cards were equally compromised, who were not advised to renew their card

16 October – RIA issued a press release pointing to a “potential vulnerability” arguing that the “Estonian ID-card and the corresponding digital solutions continue to be safe”

30 October – RIA “recommended” that cards be updated

2 November – RIA announced that it will block all affected cards and urged “all holders of security risk affected ID-cards” to begin to remotely update their cards

Other criticisms:

We also received some comments by email and via Twitter with regard to the article. Again, we understand the sensitivity that led to some somewhat strident comments and are happy to respond appropriately. We have done our best to reflect all of the comments and deal with them comprehensively:

Comment: Article 17 was not deleted but simply moved verbatim to another instrument.
Our response: Yes, this is partially correct and was an oversight on our part (both the fact that it was moved and that the move itself was proposed by the previous presidency). Instead of strengthening this measure, as proposed by the European Parliament, the Estonian presidency proposed:

  • moving it from a stronger instrument (Regulation) to a weaker instrument (Article 40.3a of the Directive establishing the European Electronic Communications code);
  • moving it to an instrument whose scope is significantly narrower than that of the e-Privacy Regulation;
  • amending it in a way that is less clear and subject to varying interpretations in 27 EU Member States;
  • amending it in a way that establishes a higher and less clear threshold for the provision to be enacted.

Tracked changes proposed by Council:

Member States shall ensure that, in the case of a particular and significant risk of a security incident that may compromise the security of in public communications networks and electronic communications services, the provider of an electronic communications such services shall inform end-their users potentially affected by concerning such risk a threat of any possible protective measures [sic] or and, where the risk lies outside the scope of the measures to be taken by the service provider, inform end-users of any possible or remedies which can be taken by the users. Where appropriate, providers should also inform their users also of the threat itself. , including an indication of the likely costs involved.

Comment: Other countries were affected by the same flaw
Our response: This was explicitly recognised in the article. However, due to Estonia being at the vanguard of eGovernment implementation, the flaw created vastly more dangers for Estonian citizens compared with elsewhere. It could credibly be argued that the Estonian authorities, despite the confusion described above, handled the matter better than some other countries.

Comment: Other technologies, such as credit cards, were affected.
Our response: Credit cards have liability rules and insurance, operate in a market, are not compulsory, and are subject to choice. The comparison is therefore not valid.

Comment: There was no harm.
Our response: This is impossible to know for certain. Furthermore, if there was no harm, this can be ascribed to good fortune.

Comment: There is no causal relationship between efforts to weaken Article 17 of the proposed ePrivacy Regulation and this incident.
Our response: None was alleged. We simply wished to draw attention to the importance of Article 17 for trust and security.


We always welcome feedback and requests for clarification and further information in relation to everything we publish. We are therefore grateful for the energetic response we received from our Estonian friends and hope that all outstanding issues have now been fully clarified.

Estonian eID cryptography mess – 750 000 cards compromised (15.11.2017)