The right to privacy is a crucial element of our personal security, of free speech, and of democratic participation. It is a fundamental right in the primary law of the European Union and is recognised in numerous international legal instruments. Digital technologies have generated a new environment of potential benefits and threats to this fundamental right. As a result, defending our right to privacy is at the centre of EDRi’s priorities.

25 Sep 2019

Portugal: Data retention complaint reaches the Constitutional Court

By Guest author

September 2019 brought long-awaited developments in the data retention situation in Portugal. The Justice Ombudsman decided to send the Portuguese data retention law to the Constitutional Court, following the case law of the Court of Justice of the European Union (CJEU) on blanket data retention that led to the invalidation of Directive 2006/24/EC. This decision comes after a complaint presented by EDRi observer Associação D3 – Defesa dos Direitos Digitais in December 2017.

The Ombudsman had first decided to issue an official recommendation to the government, urging it to propose a legislative solution for the problematic law, which originated from the now invalidated Data Retention Directive. Faced with the Minister of Justice’s refusal to find a solution through legislative means, the Ombudsman has now decided to grant D3’s original request and has referred the matter to the Constitutional Court, which will have to rule on the constitutionality of the Portuguese data retention scheme.

A few days later, the same Constitutional Court partially struck down, for the second time, a law granting the intelligence services access to retained data. In 2015, the Constitutional Court had already declared a similar law unconstitutional, after the president had requested a preventive ruling by the Court before signing it into law. However, in 2017, the Parliament approved a new law that addressed some of the problems raised by the Constitutional Court. As the new president opted not to request a preventive decision, the law came into force. 35 Members of Parliament (MPs) from three parties then requested a Constitutional Court ruling on the law, which has now been issued.

The fundamental reasoning of this decision is that the Portuguese Constitution forbids public authorities from accessing citizens’ correspondence and telecommunications, except in the context of criminal proceedings. Given that the intelligence services have no competences in criminal proceedings, they cannot access such data within the existing constitutional framework. However, the Court did allow access to user location and identification data (in the context of the fight against terrorism and highly organised crime), as such data was not considered to be covered by the secrecy of communications.

This case also led to the resignation of the original judge rapporteur, due to disagreements over the reasoning reflected in the final version of the decision.

Associação D3 – Defesa dos Direitos Digitais

Portugal: Data retention sent to the Constitutional Court (07.03.2018)

European Court overturns EU mass surveillance law (08.04.2014)

(Contribution by Eduardo Santos, Associação D3 – Defesa dos Direitos Digitais, Portugal)

23 Sep 2019

Your mail, their ads. Your rights?

By Andreea Belu
  • In the digital space, “postal services” often snoop into your online conversations in order to market services or products based on what they find in your chats.
  • A law meant to limit this exploitative practice is being stalled by the Council of the European Union.

We all expect our mail to be safe in the hands of a mailman. We trust that neither the post office nor the mailmen working there will sneak a peek into our written correspondence. Nor do we expect mailmen to act like door-to-door salespersons.

When we say that “postal services” snoop, it is important to understand that this refers both to traditional email services such as Yahoo and to instant messaging apps like WhatsApp. While targeted ads are no longer popular among email providers, the practice is gaining momentum in the instant messaging space, after Facebook’s CEO announced plans to introduce ads in WhatsApp’s Status feature.

Not just shoe ads

You might think: “Well, what’s the harm in having shoes advertised to me after they’ve read the shopping chats between my friend and me?” Short answer: it’s not just shoes.

Often, targeted ads are the result of you being profiled according to your age, location, gender, sexual orientation, political views or ethnicity. You will receive job ads based on your gender, or housing ads based on your ethnicity. Sometimes, you may be targeted because you feel anxious or worthless. Are you sure all of this benefits you? What is more, your online mailman might be required to keep a record of all of your mail, just in case you get in trouble with the law in the future. We call this mass data retention.


The need for encrypted mail in storage *and* in transit

The WhatsApp case is a good example. Currently, WhatsApp seals the message right after you press “send”. The message travels to WhatsApp’s servers, is stored encrypted, and is then delivered to its recipient, still encrypted. This means that, technically, the mail is encrypted both in storage and in transit, and nobody can read its content. However, as Forbes points out, future ad plans might modify WhatsApp’s encryption so that apps “first identify key words in sentences, like “fishing” or “birthday,” and send them to Facebook’s servers to be processed for advertising, while separately sending the encrypted message”.
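The difference between genuine end-to-end encryption and the keyword-extraction scheme described above can be sketched in a few lines of Python. This is a toy illustration only: the XOR “cipher” is a stand-in for real encryption (it is NOT secure), and all function names are invented for this sketch – it does not reflect WhatsApp’s actual implementation.

```python
from secrets import token_bytes

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """XOR stand-in for real end-to-end encryption (NOT secure)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

toy_decrypt = toy_encrypt  # XOR is its own inverse

# End-to-end model: only the sender and recipient hold the key.
key = token_bytes(32)
message = b"let's go fishing for your birthday"

# What the server stores and relays: ciphertext only.
stored_on_server = toy_encrypt(message, key)
assert stored_on_server != message            # the server cannot read the content
assert toy_decrypt(stored_on_server, key) == message  # the recipient can

# Keyword-extraction variant: the client scans the plaintext *before*
# encrypting, and ships matching keywords to the ad server in the clear.
AD_KEYWORDS = {b"fishing", b"birthday"}       # hypothetical ad-targeting terms

def send_with_keyword_sidechannel(message: bytes, key: bytes):
    leaked = [w for w in message.split() if w.strip(b"'.,!?") in AD_KEYWORDS]
    return toy_encrypt(message, key), leaked

ciphertext, leaked = send_with_keyword_sidechannel(message, key)
print(leaked)  # the ad server learns [b'fishing', b'birthday']
```

Even though the message body stays encrypted end to end in the second variant, the keyword side channel re-creates exactly the confidentiality problem that rules on protecting communications both in storage and in transit are meant to prevent.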

There’s a law for it, but it’s stalled by the EU Council

The ePrivacy Regulation, which is currently under negotiation, is aimed at ensuring the privacy and confidentiality of our electronic communications by complementing and particularising the rules introduced by the General Data Protection Regulation (GDPR). The European Parliament adopted a strong position on ePrivacy that would ensure your online messages are protected both in storage and in transit (Art. 5), would make “consent” the only legal basis for processing data (Art. 6), would make privacy-by-design and privacy-by-default core principles in software design (Art. 10), and would protect encryption from measures aimed at undermining it (Art. 17). However, the Council of the European Union is yielding to pressure from the big tech lobby and has drafted an opinion that threatens our rights and freedoms. What is more, the text adopted by the European Parliament in October 2017 has been stuck in the Council, behind closed-door negotiations, for almost two years. We have sent several letters (here, here and here) calling for the safeguarding of our communications and for the adoption of this much-needed ePrivacy Regulation.

Will our voices be heard? If you are worried about being targeted based on your private conversations, join our efforts and stay tuned for more updates coming soon.

Read more:

Your family is none of their business (23.07.2019)

Real-time bidding: The auction for your attention (4.07.2019)

e-Privacy Directive: Frequently Asked Questions

e-Privacy: What happened and what happens next (29.11.2017)

e-Privacy Mythbusting (25.10.2017)

23 Jul 2019

Civil society calls for a proper assessment of data retention

By Diego Naranjo

In preparation for a possible proposal for new legislation, the European Commission is conducting informal dialogues with different stakeholders to explore possibilities for data retention legislation that complies with the rulings of the Court of Justice of the European Union (CJEU) and the European Court of Human Rights (ECtHR). As part of these dialogues, EDRi met with the Commission’s Directorate-General for Migration and Home Affairs (DG HOME) on 6 June 2019.

On 22 July 2019, 30 civil society organisations sent an open letter to European Commission President-elect Ursula von der Leyen and Commissioners Avramopoulos, Jourová and King, urging them to commission an independent assessment of the necessity and proportionality of existing and potential legislative measures around data retention. Furthermore, the signatories asked them to ensure that the debate around data retention does not prevent the ePrivacy Regulation from being adopted swiftly.

You can read the letter here, and below:

22 July 2019

By email:
President-elect von der Leyen
First Vice-President Timmermans

Commissioner Avramopoulos
Commissioner Jourová
Commissioner King

Dear First Vice-President Timmermans,
Dear President-elect von der Leyen,

The undersigned organisations represent non-governmental organisations working to protect and promote human rights in digital and connected spaces. We are writing to put forward suggestions to ensure compliance with the EU Charter of Fundamental Rights and the CJEU case law on data retention.

EU Member States (and EEA countries) have implemented the CJEU ruling of 8 April 2014 invalidating the Data Retention Directive to different degrees. EDRi’s 2015 study reported that six Member States1 had kept data retention laws containing features similar or identical to those that were ruled contrary to the EU Charter. Other evidence pointed in the same direction.2 While the personal data of millions of Europeans were being stored illegally, the European Commission did not launch any infringement procedures. On 21 December 2016, the CJEU delivered its judgment in the Tele2/Watson case regarding data retention in Member States’ national law. In the aftermath of this judgment, the Council Legal Service unambiguously concluded that “a general and indiscriminate retention obligation for crime prevention and other security reasons would no more be possible at national level than it is at EU level, since it would violate just as much the fundamental requirements as demonstrated by the Court’s insistence in two judgments delivered in Grand Chamber.”3

On 6 June 2019 the Council adopted “conclusions on the way forward with regard to the retention of electronic communication data for the purpose of fighting crime” which claim that “data retention is an essential tool for investigating serious crime efficiently”. The Council tasked the Commission to “gather further information and organise targeted consultations as part of a comprehensive study on possible solutions for retaining data, including the consideration of a future legislative initiative.”

While the concept of blanket data retention appeals to law enforcement agencies, it has never been shown that the indiscriminate retention of the traffic and location data of over 500 million Europeans is necessary, proportionate or even effective.

Blanket data retention is an invasive surveillance measure of the entire population. This can entail the collection of sensitive information about social contacts (including business contacts), movements and private lives (e.g. contacts with physicians, lawyers, workers councils, psychologists, helplines, etc.) of hundreds of millions of Europeans, in the absence of any suspicion. Telecommunications data retention undermines professional confidentiality and deters citizens from making confidential communications via electronic communication networks. The retained data is also of high interest for criminal organisations and unauthorised state actors from all over the world. Several successful data breaches have been documented.4 Blanket data retention also undermines the protection of journalistic sources and thus compromises the freedom of the press. Overall, it damages preconditions of open and democratic societies.

The undersigned organisations have therefore been in constructive dialogue with the European Commission services to ensure that the way forward includes the following suggestions:

  • The European Commission commissions an independent, scientific study on the necessity and proportionality of existing and potential legislative measures around data retention, including a human rights impact assessment and a comparison of crime clearance rates;
  • The European Commission and the Council ensure that the debate around data retention does not prevent the ePrivacy Regulation from being adopted swiftly;
  • The European Commission tasks the EU Fundamental Rights Agency (FRA) to prepare a comprehensive study on all existing data retention legislation and their compliance with the Charter and the CJEU/European Court of Human Rights case law on this matter;
  • The European Commission considers launching infringement procedures against Member States that enforce illegal data retention laws.

We look forward to your response and remain at your disposal to support the necessary initiatives to uphold EU law in this policy area.


European Digital Rights (EDRi)
Access Now
Chaos Computer Club (CCC)
Bits of Freedom
Asociatia pentru Tehnologie si Internet (ApTI)
Electronic Frontier Norway (EFN)
Digital Rights Ireland
Privacy International
Hermes Center for Transparency and Digital Human Rights
Access Info
Aktion Freiheit statt Angst
Homo Digitalis
Electronic Privacy Information Center (EPIC)
Iuridicum Remedium (IuRe)
La Quadrature du Net
Associação D3 – Defesa dos Direitos Digitais
IT-Political Association of Denmark (IT-Pol)
Panoptykon Foundation
Open Rights Group (ORG)
Electronic Frontier Finland (Effi ry)
Državljan D
Deutsche Vereinigung für Datenschutz e. V. (DVD)
Föreningen för Digitala Fri- och Rättigheter (:DFRI)
AK Vorrat

2) See, for example, Privacy International, 2017, National Data Retention Laws since the Tele-2/Watson Judgment:
3) Council document 5884/17, paragraph 13
4) A recent example can be found here:

07 Jun 2019

Data Retention: EU Commission inconclusive about potential new legislation

By Diego Naranjo

On 6 June 2019, representatives from eight civil society organisations (including EDRi members) met with officials from the European Commission (EC) Directorate-General for Migration and Home Affairs (DG HOME) to discuss data retention. According to the EC officials, this meeting was just one in a series that DG HOME is holding with different stakeholders to discuss potential data retention initiatives that could be put forward (or not) by the next Commission. The meeting is not connected to the Council conclusions on data retention, also published on 6 June, which coincidentally task the Commission with conducting a study “on possible solutions for retaining data, including the consideration of a future legislative initiative”.

Ahead of the meeting, civil society was sent a set of questions about the impact of existing and potentially new data retention legislation on individuals, how a “legal” targeted data retention scheme could be designed, and which specific issues (data retention periods, geographical restrictions, and so on) could be covered if new data retention legislation were to be proposed.

According to the Commission, there are no clear “next stages” in the process, apart from the aforementioned study that will have to be prepared following the Council conclusions on data retention published on 6 June. In addition to this study, the Commission will continue its dialogues with civil society, data protection authorities, the EU Fundamental Rights Agency and Member States, which will inform potential future action (or inaction) by the EC on data retention.

Four years ago, EDRi met with DG HOME and presented a study of a set of data retention laws that were likely to be considered illegal in light of the Digital Rights Ireland case. The EC replied to our meeting and study saying that it would “monitor” existing data retention laws and their compliance with EU law. Four years later, no infringement proceedings have been launched against any Member State over their (quite probably) illegal data retention laws.

Read more:

EU Member States willing to retain illegal data retention (16.09.2019)

Data retention – Conclusions on retention of data for the purpose of fighting crime (27.05.2019)

EU Member States plan to ignore EU Court data retention rulings (29.11.2017)

(Contribution by Diego Naranjo, EDRi)

05 Jun 2019

Czech Constitutional Court rejects complaint on data retention

By Iuridicum Remedium

Czech EDRi member Iuridicum Remedium (IuRe) fought for 14 years against the Czech implementation of the controversial EU Data Retention Directive, which was declared invalid by the Court of Justice of the European Union (CJEU). After years of campaigning and many hard legislative battles, the fight has finally come to an end: on 22 May 2019, the Czech Constitutional Court rejected IuRe’s proposal to declare the Czech data retention law unconstitutional. The Court rejected the claim despite its being supported by 58 deputies of the parliament from across the political spectrum.

In the Czech Republic, data retention legislation was first adopted in 2005. In March 2011, the Constitutional Court upheld IuRe’s first complaint against the original data retention legislation and annulled it. In 2012, however, a new legal framework was adopted, both to implement the EU Data Retention Directive – which the CJEU found to contravene European law in the Digital Rights Ireland case in 2014 – and to comply with the Constitutional Court’s decision. This new legislation still contained the problematic general and indiscriminate data retention, along with a number of other problems. Therefore, also in light of the CJEU’s decisions, IuRe decided to prepare a new constitutional complaint.

IuRe’s complaint originally challenged the very principle of bulk data retention: the massive collection and storage of people’s data without any link to individual suspicion of criminal activity, extraordinary events, or terrorist threats. The CJEU has already declared this general and indiscriminate data retention principle inadmissible in two of its decisions (Digital Rights Ireland and Tele2). Although the Czech Constitutional Court refers to both judgments several times, its conclusions – especially when it comes to analysing why data retention is not in line with the Czech Constitution – do not deal with them properly.

The Constitutional Court’s main argument for declaring data retention constitutional is that as communications increasingly occur in the digital domain, so does crime. Even if this is true, it is regrettable that the Constitutional Court did not develop this reasoning further and explain why it is, in itself, a basis for bulk data retention. The Court also ignored that the greater use of electronic communication implies a greater interference with privacy from general data retention.

The Court further argued that personal data, even without an obligation to retain them, are kept in any case for other purposes, such as invoicing for services, responding to claims and behavioural advertising. In the Court’s opinion, the fact that people give operators their “consent” to process their personal data reinforces the argument that data retention is legal and acceptable. Unfortunately, the Constitutional Court does not take into consideration that the volume, retention period and sensitivity of the personal data held by operators for other purposes are quite different from the obligatory data retention prescribed by the Czech data retention law. Furthermore, the fact that operators already need to keep some data (for billing purposes, for example) shows that the police would not be completely left in the dark without a legal obligation to store data.

In addition to the proportionality of data retention, which the Court has not clarified, another issue is how “effective” data retention is at reducing crime. Statistics from 2010 to 2014 show that there was no significant increase in crime, nor any reduction in crime detection, in the Czech Republic after the Constitutional Court abolished the obligation to retain data in 2011. Police statistics presented to the Court showed that data retention is neither helping to combat crime in general nor facilitating the investigation of serious crimes (such as murders) or other types of crime (such as fraud or hacking). In the arguments submitted by police representatives and by the Ministry of the Interior, some examples of individual cases where the stored data helped (or where their absence hampered an investigation) were repeatedly mentioned. However, no evidence shown to the Court proved that general and indiscriminate data retention would improve the ability of the police to investigate crimes.

The Court also did not annul the partially problematic parts of the legislation, such as the data retention period (six months), the volume of data to be retained, or the overly broad range of criminal cases in which data may be requested. Furthermore, the Court has not remedied the provisions of the Police Act that allow data to be requested without court authorisation in searches for wanted or missing persons or in the fight against terrorism.

In its decision, the Constitutional Court acknowledges that the stored data are very sensitive and that in some cases the sensitivity of so-called “metadata” may even be greater than that of the content of the communications. The retention of communications data thus represents a significant threat to individuals’ privacy. Despite all of this, the Court dismissed IuRe’s claim to declare the data retention law unconstitutional.

IuRe disagrees with the outcome of this procedure, in which the Court concluded that the existing Czech data retention legislation conforms to the Constitution. Considering the wide support for the complaint, IuRe will work on getting at least part of the existing arrangements changed through legislative amendments. In addition, we will consider the possibility of the EC launching infringement proceedings, or of initiating other judicial cases, since we strongly believe that the existing bulk retention of communications data in Czech law still contravenes the aforementioned CJEU decisions on mass data retention.

Czech constitutional decision (only in Czech)

Proposal to revoke data retention filed with the Czech Court (10.01.2018)

(Contribution by Jan Vobořil, EDRi member Iuridicum Remedium, Czech Republic)

22 May 2019

ePrivacy: Private data retention through the back door

By Digitalcourage

Blanket data retention has been prohibited by several court decisions of the European Court of Justice (ECJ) and the German Federal Constitutional Court (BVerfG). In spite of this, some EU Member States want to reintroduce it for use by law enforcement authorities – through a back door in the ePrivacy Regulation.

The ePrivacy Regulation

The ePrivacy Regulation, which is currently under negotiation, is aimed at ensuring the privacy and confidentiality of electronic communications by complementing and particularising the matters covered by the General Data Protection Regulation (GDPR). Confidentiality of communications is currently covered by the ePrivacy Directive, which dates back to 2002. A review of this piece of legislation is long overdue, but Member States keep delaying the process, thereby failing to update necessary protections for online privacy in the EU.

Ever since 2017, the EU Ministers of Justice and Interior have been “deliberating” on the Tele2 judgment of the European Court of Justice, in which the Court declared the blanket retention of telecommunications metadata inadmissible. Yet the EU Member States are unwilling to accept this ruling. During an informal discussion in Valletta on 26 and 27 January 2017, the Justice and Interior Ministers expressed their wish for “a common reflection process at EU level on data retention in light of the recent judgments of the Court of Justice of the European Union” (Ref. EU Council 6713/17), with the aim of implementing EU-wide data retention. This process was set in motion in March 2019 by the Presidency of the Council of the European Union, with a sub-group of the Council’s Working Party on Information Exchange and Data Protection (DAPIX) put in charge. From the very beginning, this reflection process has mainly served the purpose of finding opportunities to implement yet another instance of data retention at the EU level, as documents published by EDRi member Statewatch have shown.

Instead of complying with the clear rulings of the European Court of Justice (Tele2 and Digital Rights Ireland), the responsible ministers are doing everything they can to “resurrect” data retention, potentially using ePrivacy as the basis for a new era of data retention. In a working document (WK 11127/17), the 2017 Presidency of the EU Council concluded that, in addition to specific data retention legislation, it would be desirable for citizens’ communications data (metadata) to be retained under ePrivacy so that companies can use it for commercial purposes. The logic behind this is probably to circumvent CJEU case law by not imposing an obligation on companies, while still having the data available when law enforcement needs it, thanks to ePrivacy.

Private data retention

In plain words, this means: If the courts will not allow mass data retention, service providers will simply be given incentives to do so by their own choice. That is why the ePrivacy Regulation is being watered down by Member States in order to give the service providers manifold permissions to store data for a wide variety of reasons (see Article 6 of the draft ePrivacy Regulation). Those responsible are relying on the assumption that the providers’ appetite for data will be sufficient even without an explicit obligation to retain data.

The immediate problem with this type of private data retention is that it weakens the protection of all users’ personal data against data-hungry corporations whose main interest is making profit. What is even worse is that, once again, a governmental function is being outsourced to private corporations. These corporations are not subject to democratic scrutiny, yet they are given ever more power over the countries concerned.

In Germany, the hurdles for criminal investigators to get access to data are already very low. The e-mail provider Posteo, for example, had to pay a fine because it was unable to provide criminal investigators with the IP addresses from which a certain e-mail account had been accessed. Posteo simply had not stored those data; they were erased as soon as they were received. The court declared the fine to be justified. This decision could easily lead to a situation where private companies prefer to err on the side of caution and store even more data, just to avoid such fines.

The draft ePrivacy Regulation as proposed by the European Commission in 2017 placed relatively strict data protection duties on service providers. For example, they were obliged to either erase or anonymise all data that was no longer needed. This is diametrically opposed to the goal of private data retention, and the DAPIX task force noticed it, too. As the Presidency of the EU Council stated, service providers will be given the freedom to use and store data in order to prevent “fraudulent use or abuse” – and these data could then be picked up by law enforcement conducting criminal investigations.

No data retention through the back door!

EDRi member Digitalcourage wanted to know how the German government argued with respect to the data retention issue, and submitted a request for the disclosure of related documents. Unfortunately, the request was largely denied by the Council of the European Union, long after the legal deadline had passed. The secretariat declared that disclosure would be a threat to public safety: the risk to the relationship of trust between the Member States and Eurojust, the EU agency dealing with judicial cooperation in criminal matters among agencies of the Member States, would be too severe. Furthermore, such a disclosure would threaten ongoing criminal investigations or judicial procedures. No further details were given. Digitalcourage lodged an appeal against this dismissal but, apart from being asked for patience, has not received an answer from the European Commission. Several requests pursuant to the Freedom of Information Act have also been submitted to German ministries.

It is hard to believe that policy makers would contemplate existing and potential new surveillance laws that would clearly be illegal. Yet this is exactly what the DAPIX task force is doing, and it is doing so behind closed doors. The changes it proposes can be found in the current draft ePrivacy Regulation. Digitalcourage will continue to request documents from the EU and the German government. As soon as the trilogue negotiations between the EU Council, the Commission and the Parliament begin, we will voice our concerns and a demand: No data retention through the back door!

This article was first published at


ePrivacy: Private data retention through the back door (in German, 18.04.2019)

(Contribution by EDRi member Digitalcourage, Germany)

27 Feb 2019

New UK counter-terrorism law limits online freedoms

By Index on Censorship

The Counter-Terrorism and Border Security Act 2019 became law in the United Kingdom (UK) in February, after passing through the UK Parliament with less debate than many had hoped for, while Brexit dominated the political agenda. The new law is problematic in many ways, including the way in which it limits freedom of expression and access to information online. It also creates extensive new border security powers, including powers to access information on electronic devices.

The draft law was widely criticised by civil society organisations, which led to some changes to the text. However, the changes were limited and did not do enough to safeguard freedom of expression and access to information.

The new law criminalises the publication of pictures of clothing, symbols or, for example, a flag, in a way that raises “reasonable suspicion” – a notably low legal threshold – that the person publishing the picture is a member or supporter of a terrorist organisation. “Publication” includes posting pictures or videos on social media, even ones taken privately at home. This could be, for example, a selfie with a poster in the background that shows the symbol of a terrorist organisation.

As previously reported in the EDRi-gram, parliament’s Joint Committee on Human Rights found that this clause “risks a huge swathe of publications being caught, including historical images and journalistic articles”. United Nations rapporteur Fionnuala Ní Aoláin, in a submission that expressed serious concerns about the draft law, found that the clause risks criminalising “a broad range of legitimate behaviour, including reporting by journalists, civil society organizations or human rights activists as well as academic and other research activity.”

A related problem is that the UK authorities have admitted that at least 14 organisations that are currently listed as terrorist organisations do not meet the criteria for being on the list.

Another clause makes it a crime to watch or otherwise access information online that is likely to be useful to a person committing or preparing acts of terrorism. This even covers, for example, watching such content over the shoulder of another person sitting at a computer.

After debates in parliament, the government agreed to a change stating that working as a journalist or carrying out academic research is an acceptable excuse for accessing material online that could be useful for terrorism. This was a positive change, but not nearly sufficient, and the clause remains very problematic. No terrorist intent is required: if someone watches a terrorist video online because, for example, she or he wants to understand why people might be drawn to terrorism, that person risks a long prison sentence.

The law also introduces wide new border security powers connected to a new and vaguely defined crime of "hostile activity". Under these powers, anyone can be stopped at the border, even without any suspicion that they have been involved in hostile activity, and it is a crime not to answer border officers' questions or to hand over requested information. A draft code of practice, which will guide how border officers use the powers, specifies that such information "may include passwords to electronic devices". During the first hour of questioning there is no right to a lawyer.

How this deeply concerning piece of legislation will work in practice remains to be seen. We fear that its vague and overbroad provisions will lead to arbitrariness and discrimination, affecting human rights defenders, journalists and ethnic minority groups on the grounds of mere suspicion.

Read more:

Index on Censorship

UK counter-terrorism law would restrict freedom of expression (26.09.2018)

(Contribution by Joy Hyvarinen, EDRi observer Index on Censorship, the United Kingdom)



20 Feb 2019

FRA and EDPS: Terrorist Content Regulation requires improvement for fundamental rights


On 12 February 2019, the European Union Agency for Fundamental Rights (FRA) published an Opinion on the Regulation on preventing the dissemination of terrorist content online. On the same day, the European Data Protection Supervisor (EDPS) submitted its comments on the topic to the responsible committee of the European Parliament. These two texts complement EDRi's analysis and the earlier report prepared by three UN Special Rapporteurs on the proposal.

FRA: Substantial threats for freedom of expression

In its Opinion, FRA structures its criticism around four main areas.

First, it calls for improving the definition of "terrorist content". The Opinion highlights the need to include in this definition the element of "incitement" or the giving of specific instructions to commit terrorist offences. The definition of such instructions should be aligned with the Terrorism Directive and cover specific actions such as "providing specific instructions on how to prepare explosives or firearms". Further, the text calls for limiting the proposal to content disseminated to the public and for excluding from the Regulation's scope certain forms of expression, such as content serving educational, journalistic, artistic or research purposes.

Second, FRA calls for ensuring that fundamental rights safeguards are in place through "effective judicial supervision". Currently, the proposal does not mention the involvement of any "independent judicial authority in the adoption or prior to the execution of the removal order". FRA also recalls the need to avoid a disproportionate impact on the freedom to conduct a business when providers have to react to removal orders for terrorist content within a very short time-frame (up to one hour in the original proposal). FRA suggests instead a reaction time of 24 hours from the receipt of the removal order. Regarding safeguards in cross-border removal orders, the Opinion calls for ensuring that the authorities of the Member State where the content is hosted are "empowered to review the removal order in cases where there are reasonable grounds to believe that fundamental rights are impacted within its own jurisdiction." FRA thus encourages the EU legislator to require a notification by the issuing Member State to the host Member State – in addition to the notification to the hosting service provider – when the removal order is issued.

Third, FRA states that the proposal "does not sufficiently justify the necessity of introducing the mechanism of referrals", and suggests distinguishing between content requiring a removal order and content warranting a referral.

Fourth, the Opinion states that the Regulation's proposed proactive measures come very close to a general monitoring obligation. Such an obligation is not only prohibited by Article 15 of the EU's eCommerce Directive, but is also generally incompatible with individuals' right to freedom of expression under Article 11 of the Charter of Fundamental Rights of the European Union. FRA therefore proposes to delete from the Regulation the obligation for hosting service providers (HSPs) to introduce proactive measures.

EDPS: Concerns for the Regulation’s data retention and GDPR compliance

While the EDPS raised similar concerns regarding the definition of terrorist content and the "one hour rule", it also made targeted comments on potentially privacy-intrusive elements of the proposed Regulation.

Under the proposed Regulation, hosting service providers are obliged to retain data on supposed terrorist content that they delete or disable access to on their platforms. The EDPS raises substantial doubts as to whether such obligations would comply with the case law of the Court of Justice of the European Union (CJEU). This assessment is based on the fact that the proposed measures, like the Data Retention Directive struck down by the CJEU in 2014, do not lay down specific criteria regarding the retention period or limitations on access to and use of the retained data. The EDPS is, furthermore, not convinced of the overall usefulness of the data retention measures in the Terrorist Content Regulation, given that the text already obliges HSPs to promptly inform the competent law enforcement authorities of any evidence of terrorist offences.

Regarding the proposal's foreseen proactive measures, the EDPS stated that automated tools for recognising and removing content would likely fall under Article 22 of the General Data Protection Regulation (GDPR), which governs individuals' rights in automated decision-making and profiling. This would, in turn, require more substantive safeguards than those provided in the Commission's proposal, including case-specific information to the data subject, understandable information about how a decision was reached, and the right to obtain human intervention in every case.

The observations of the EU's most important fundamental rights institutions feed into a steady stream of criticism of the proposal. They represent noteworthy positions for policy-makers in the legislative institutions, particularly in the European Parliament's LIBE, CULT and IMCO committees, which are currently adopting their positions. It is now more evident than ever that the proposed Terrorist Content Regulation needs substantive reform to live up to the Union's values and to safeguard the fundamental rights and freedoms of its citizens.

Read more:

EDRi Recommendations for the European Parliament’s Draft Report on the Regulation on preventing the dissemination of terrorist content online (December 2018)

All Cops Are Blind? Context in terrorist content online (13.02.2019)

Terrorist Content: LIBE Rapporteur’s Draft Report lacks ambition (25.01.2019)

CULT: Fundamental rights missing in the Terrorist Content Regulation (21.01.2019)

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)

(Contribution by Diego Naranjo and Yannic Blaschke)


16 Jan 2019

EU Member States willing to retain illegal data retention

By IT-Pol

With its judgments in April 2014 (Digital Rights Ireland) and December 2016 (Tele2), the Court of Justice of the European Union (CJEU) ruled that blanket data retention was illegal under EU law. Rather than repealing their illegal data retention laws, EU Member States have adopted a tactic of ignoring the highest court of the European Union under the pretence of a "common reflection process" with an expert data retention working group under the Working Party on Information Exchange and Data Protection (DAPIX).


At the Justice and Home Affairs (JHA) Council meeting on 6-7 December 2018, the state of play of the expert working group on data retention was discussed. Council document 14319/18 prepared for the meeting reveals that the common reflection process has produced no tangible results towards compliance with the Tele2 judgment: replacing general and indiscriminate (blanket) data retention with targeted data retention. Member States appear to be happy with their current and illegal data retention regimes and do not want to make any changes. A recurring element in the Council document is the unwillingness of Member States to accept the Tele2 judgment, often disguised under a very selective reading of the judgment.

The expert working group has considered the concept of “restricted data retention”, previously analysed in the EDRi-gram. The main novelty is that Member States are supposed to limit the data categories to be retained to what is strictly necessary. No limitation is foreseen with respect to the persons concerned, which means that data about the entire population is retained, as with the current data retention regimes. Therefore, restricted data retention cannot possibly comply with the Tele2 judgment. However, even the token gesture of limiting the data categories has no support among Member States. They claim that the data categories which are not necessary for law enforcement purposes are already excluded. Based on this premise, Member States even contend that “there is no general and indiscriminate retention of data as referred to in the Tele2 judgment”, which is rather remarkable since the CJEU has stated the exact opposite in the Tele2 judgment.

The renewable retention warrant (RRW) proposal is another attempt by Member States to circumvent the Tele2 judgment. While the warrant only covers a single provider of electronic communications services for a fixed period of validity, all providers are expected to be covered by different warrants that are constantly renewed because the RRW would be rendered ineffective for law enforcement purposes if not all providers are covered. In practice, the RRW will be indistinguishable from the current blanket data retention regimes. With the exception of one Member State, which uses a similar system (undoubtedly the United Kingdom), there is no support for the RRW since the system would be too complex and inefficient and would require changes to national laws on criminal procedure.

After two years of “reflection” on the Tele2 judgment, Member States and their expert working group have not come up with a single realistic alternative to the current blanket data retention regimes that the CJEU has ruled to be illegal under EU law. The Council document does not describe a single suggestion which would actually make the data retention scheme targeted and limit the persons concerned by the measure, even though this is expressly required by the CJEU in paragraph 110 of the Tele2 judgment.

The second part of Council document 14319/18 deals with access to the retained data. According to the Tele2 judgment, access to the retained data must be limited to investigations involving serious crime and must be subject to review by a court or an independent administrative authority. As a general rule, only data of individuals suspected of being involved or implicated in a crime can be accessed.

Once again, Member States are reluctant to accept the restrictions imposed by the CJEU. Since there is no EU law or CJEU guidance defining "serious crime", this task is left to Member States. Some Member States have a very broad definition, even to the point of including crimes that cannot be regarded as serious because of their low maximum sentence, but are nonetheless claimed to be perceived as serious by the general public. The Council document also notes that without access to retained data, criminal investigations in cybercrime cases would often "turn out to be futile because digital evidence would be unavailable". However, since data retention of electronic communications metadata is a particularly serious interference with fundamental rights, as the CJEU has established (Tele2 paragraph 100), access to the retained data must be subject to strict rules and will not always be available to law enforcement authorities. As more and more activities move online, a complete carve-out for crimes committed online would deprive the privacy and data protection safeguards at the access level of almost any meaning.

The Council document notes that the judicial review regimes of most Member States are in line with the prerequisites set out by the CJEU, through a prior review by a court/judge, an independent administrative authority or the prosecution office. However, by silently adding the prosecution office, which is not an independent judicial authority, to the list, Member States are rather misleadingly overstating their compliance with the Tele2 judgment regarding the requirement of independent review of access requests.

Finally, Member States are very reluctant to limit access to the retained data to suspects or accused persons, as required by the CJEU, except in special cases involving terrorism (paragraph 119 of the Tele2 judgment). The main reason for this is that "proceedings are commenced not against certain individuals, but against (at least in the beginning) unknown perpetrators." This suggests that law enforcement authorities routinely use retained data to find possible suspects of a crime, for example through cell phone tower inquiries that reveal all persons present in a certain area. Data-mining investigations like this affect a large number of persons, some of whom may become suspects simply because of their presence in a certain area (location data). The Tele2 judgment only allows broad access to the retained data as an exception in particular cases involving terrorism, but Member States want to turn the exception into the general rule by only requiring a connection to a criminal investigation when retained data is accessed.

At the JHA Council meeting in December, ministers agreed to continue “the work at experts level to explore avenues to develop a concept of data retention within the EU.” However, this is precisely what the expert working group has been doing for the past two years, without delivering a single proposal for data retention that respects the requirements of the Tele2 judgment.

This leaves the European data retention situation at a stalemate. Member States refuse even to consider alternatives to their current blanket data retention regimes, but they cannot have blanket data retention, at least not legally, because the CJEU has ruled it illegal under EU law. The European Commission is the "guardian of the Treaties", but appears unwilling to start infringement proceedings against Member States, even though it is "monitoring" them. Legal action at the national level against data retention laws is, of course, a potential way out of the stalemate. Litigation is currently being pursued in some Member States and has in the past been successful in several of them.

However, Member States are fighting for their blanket data retention regimes on other fronts besides ignoring the Tele2 judgment. One possibility is that the future ePrivacy Regulation will provide a more "favourable" environment for data retention than the current ePrivacy Directive – something the Council is actively working towards. This could give Member States a "fresh start" on data retention, since the CJEU would have to assess national data retention laws against the new ePrivacy Regulation, albeit still interpreted in light of the (unchanged) Charter of Fundamental Rights. There is also the risk that the CJEU could revise its stance on data retention in some of the new cases pending before the Court (C-623/17 from the UK, C-520/18 from Belgium, and C-511/18 and C-512/18 from France). The first question in C-520/18 is very similar to the first question in the Tele2 case, namely whether Article 15(1) of the ePrivacy Directive, read in the light of the Charter of Fundamental Rights, precludes a general obligation on providers of electronic communications services to retain traffic data. Member States would undoubtedly see this as an opportunity to "retry" the Digital Rights Ireland and Tele2 cases before the CJEU.

Data retention – state of play. Council document 14319/18 (23.11.2018)

EU Member States plan to ignore EU Court data retention rulings (29.11.2017)

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)



07 Nov 2018

NGOs urge Austrian Council Presidency to finalise e-Privacy reform


An EDRi member, together with 20 other NGOs, is urging the Austrian Presidency of the Council of the European Union to take action towards finalising the e-Privacy reform. The group, which counts among its signatories some of the biggest civil society organisations in Austria, such as Amnesty International, as well as two labour unions, demands in an open letter sent on 6 November 2018 an end to the apparently never-ending deliberations between the EU Member States.


It is today 666 days since the European Commission launched its proposal. The e-Privacy Regulation is an essential element of Europe's digital strategy and a necessity for protecting modern democracies from ubiquitous surveillance networks. Echoing European citizens' rightful demands for the protection of their online privacy, the organisations ask the Austrian Presidency to lead the way into a new privacy era by concluding the e-Privacy dossier by 2019.

The letter comes in a context in which a parliamentary inquiry by the Austrian Social Democratic Party is trying to shed light on the lobbying connections behind the Austrian government's obstruction of secure communications for its citizens. At present, the Austrian government's position is closely aligned with the interests of internet giants like Facebook and Google, big telecom companies, and the advertising industry.

The Austrian government has recently fast-tracked negotiations on the controversial e-evidence proposal, which would weaken the rule of law and foster further surveillance of citizens' online behaviour. This stands in stark contrast to the meagre effort Austrian representatives have put into negotiations on legislative proposals that aim to protect the fundamental right to privacy – a topic missing from the Austrian Council Presidency's agenda.

To ensure that e-Privacy laws will not be used as an excuse for establishing new repressive instruments, the letter demands a clear commitment to the prohibition of data retention. Data retention has been found unconstitutional in several European countries, and in 2014 the European Court of Justice (ECJ) annulled the Data Retention Directive. A circumvention of the ECJ's ban through the e-Privacy Regulation could expose EU citizens to indiscriminate mass surveillance and severely undermine trust in the EU institutions.

Open Letter sent to Austrian Government (in German only, 06.11.2018)

Parliamentary inquiry from the Austrian Social Democratic Party (in German only, 29.10.2018)

Council continues limbo dance with the ePrivacy standards (24.10.2018)

ePrivacy: Public benefit or private surveillance? (24.10.2018)

ECJ: Data retention directive contravenes European law (09.04.2014)

(Contribution by Thomas Lohninger, EDRi member)