21 Jun 2018

ENAR and EDRi join forces for diligent and restorative solutions to illegal content online

By Maryant Fernández Pérez

The European Network Against Racism (ENAR) and European Digital Rights (EDRi) joined forces to draw up some core principles in the fight against illegal content online. Our position paper springs both from the perspective of victims of racism and that of free speech and privacy protection.

The European Commission has so far not been successful in tackling illegal content in a way that provides a redress mechanism for victims. In fact, the European Commission has for far too long focused on a “public relations regime” measuring how quickly and how many online posts have been deleted, while lacking a diligent approach to addressing the deeper problems behind the removed content. Indeed, the European Commission has continuously promoted rather superficial “solutions” that do not deal with the problems faced by victims of illegal activity in any meaningful way.

At the same time, the European Commission’s approach is undermining people’s rights to privacy and freedom of expression by pressuring internet giants to take over privatised law enforcement functions. As a consequence, ENAR and EDRi have agreed a joint position paper, following our commitment to ensuring fundamental rights for all.

Our joint position paper relies on four basic principles:

1. No place for arbitrary restrictions – Any measure that is implemented must be predictable and subject to real accountability.

2. Diligent review processes – Any measure must be implemented on the basis of neutral assessment, rather than being left entirely to private parties, particularly as they may have significant conflicts of interest.

3. Learning lessons – Any measure implemented must be subject to thorough evidence-gathering and review processes.

4. Different solutions for different problems – No superficial measure in relation to incitement to violence or hatred should be implemented without clear obligations on all relevant stakeholders to play their role in dealing with the content in a comprehensive manner. Illegal racist content inciting violence or discrimination should be referred to competent and properly resourced law enforcement authorities for adequate sanctions if it meets the criminal threshold. States must also ensure that laws on racism and incitement to violence are based on solid evidence and respect international human rights law.

This paper follows cooperation between the two organisations over the past few years to bring the digital rights community and the anti-racist movement together in a more comprehensive way. The common initiative comes at a time when the European Commission is consulting stakeholders and individuals for their opinions on how to tackle illegal content online, with a deadline of 25 June 2018. EDRi has developed an answering guide for individuals who consider that the European Union should take a diligent, long-term approach that protects the victims of illegal content, such as racism online, as well as the victims of free speech restrictions.

(Contribution by Maryant Fernández Pérez, EDRi Senior Policy Advisor)

Read more:

ENAR-EDRi Joint position paper: Tackling illegal content online – principles for efficient and restorative solutions (20.06.2018)

EDRi Answering guide to EU Commission’s “illegal” content “consultation” (13.06.2018)

Commission’s position on tackling illegal content online is contradictory and dangerous for free speech (28.09.2017)

EU Commission’s Recommendation: Let’s put internet giants in charge of censoring Europe (28.09.2017)

20 Jun 2018

Press release: MEPs ignore expert advice and vote for mass internet censorship


In a vote today, 20 June 2018, the Legal Affairs Committee of the European Parliament adopted a Copyright Directive that includes measures to monitor and filter virtually all uploads to the internet.

The Copyright Directive includes the controversial Article 13, which mandates the mass monitoring and censorship of internet uploads. The vote comes after widespread criticism of these measures and against the advice of civil society, leading academics and universities, research institutions, the United Nations Special Rapporteur on Freedom of Opinion and Expression, and even the inventors of the internet and of the world wide web.

Upload filters are opposed by every independent, expert voice in this debate. If the campaign keeps growing like it is, we will save the internet from the censorship machines,

said Diego Naranjo, Senior Policy Advisor at EDRi.

The next step is a negotiation between the Parliament and the EU Member States. A final vote of the European Parliament on the outcome of that negotiation will take place around the end of 2018.

Will Parliamentarians be willing to publicly support such an awful proposal, just weeks before the 2019 elections?

asked Joe McNamee, Executive Director of EDRi.

Time will tell.

EDRi will continue its efforts to inform the public and MEPs of the dangers of the proposed Copyright Directive, and will continue to offer constructive opposition to the measure in the run-up to the final plenary vote.

Read more:

We can still win: Next steps for the Copyright Directive (20.06.2018)

Copyright reform: Document pool

EU Censorship Machine: Legislation as propaganda? (11.06.2018)

Censorship Machine: Busting the myths (13.01.2017)

20 Jun 2018

We can still win: Next steps for the Copyright Directive

By Andreea Belu

On the 20th of June 2018, the European Parliament’s Legal Affairs Committee (JURI) ignored all advice and voted for the chaotic Article 13 of the proposed Copyright Directive.

There are several steps for the EU institutions to go through before the Directive can finally be adopted. We can still win!

1. Mandate to negotiate (approved today, to be confirmed in the coming weeks):

The Committee voted to give itself a mandate to negotiate a final deal with the EU Council (the EU Member States).

If a political group or a group of Parliamentarians opposes this mandate, a vote of the full Parliament will be needed. This scenario is very likely to occur, and the vote would happen sometime between 3 and 5 July.

2. Negotiation with EU Council (from July until October, approximately)

If the mandate is finally approved, negotiations will take place between the Parliament and Council to reach a final deal. This process has no formal timeline. However, the Parliament side will be keen to reach a deal quickly (as it would be uncomfortable if such a bad proposal was voted too close to the May 2019 elections). So, this process could end as early as October.

3. Legal linguists (after the end of step number 2 until November, approximately)

The text will then be checked for legal coherence and translated into all EU working languages. This process can take a month.

4. Final Parliament vote (December/January).

The final Parliament vote will be the end of the legislative process. This is likely to happen in December/January, when MEPs will decide whether or not to oppose the views of voters, academics, universities, internet luminaries, the UN Special Rapporteur and others.

Will they vote for you? In May 2019, they will be asking you to vote for them.

Read more:

Press Release: MEPs ignore expert advice and vote for mass internet censorship (20.06.2018)

EU Censorship Machine: Legislation as propaganda? (11.06.2018)

Copyright Directive: Busting the myths (13.12.2017)


18 Jun 2018

We’re looking for policy interns to join our Brussels team. Is that you?

By Kirsten Fiedler

European Digital Rights (EDRi) is an international not-for-profit association of 39 digital human rights organisations from across Europe. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, freedom of expression, and access to information.


Join EDRi now and become a superhero for the defence of our rights and freedoms online!

The EDRi office is currently looking for two interns to support our Policy team in Brussels. This is your opportunity to get first-hand experience in EU policy-making and contribute to promoting digital rights and freedoms across Europe. This six-month internship starts on 3 September 2018 and ends on 28 February 2019. The internship is paid 750 EUR per month.

Key tasks:

  • Research and analysis on data protection, privacy, copyright; or on surveillance & law enforcement, freedom of expression and intermediary liability, net neutrality and digital trade;
  • Monitoring and reporting on international, EU and national policy developments;
  • Organising and participating in meetings and events;
  • Writing articles for the EDRi-gram newsletter;
  • Assisting with the preparation of draft reports, position papers, presentations and other internal and external documents;
  • Development of public education materials;


Profile:
  • A demonstrated interest in and enthusiasm for human rights and technology-related legal or policy issues;
  • Good understanding of EU decision-making;
  • Experience in the fields of data protection, privacy, copyright, intermediary liability & freedom of expression, surveillance & law enforcement, net neutrality or digital trade would be an asset;
  • Excellent research and writing skills;
  • Fluent command of spoken and written English; other languages are a plus;
  • Computer literacy; advanced technical knowledge is a plus.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in .pdf format to julien.bencze(at)edri.org by 1 July 2018.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. People from all backgrounds are encouraged to apply and we strive to have a diverse and inclusive working environment.

The closing date for applications is 1 July 2018. Please note that due to scarce resources, only shortlisted candidates will be contacted.

Find out more about Policy internships at EDRi.

13 Jun 2018

12 days of digital rights in Brussels. Was it Christmas?


This article is a short story about my participation in the Brussels exchange programme. Thanks to the Digital Rights Fund and Wikimedia, I was able to spend two and a half weeks (12 working days) with like-minded people and organisations and bring a new blast of energy to my efforts to fight the copyright censorship machine and snippet tax.

Here is how it went. First the advocacy lessons, then the numbers.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

May 16 was an intense day. It was the first day of my exchange in Brussels and my first visit to the new EDRi office. However, I had to put the kind welcome from the EDRi team behind me quite quickly, because I got an impromptu call from an MEP to reschedule a meeting about the new copyright proposal for that day. This meeting turned out to be like no other. It lasted four hours, and the output was a letter, which was sent by the MEP to the JURI committee members asking them to vote against Articles 11 and 13.

Victory? Well, looking back I’m thinking: wow, this was quite something! But while it happened I didn’t register it as a success. The main reason for this is that the meeting really was a power show. The MEP had an important message for me: no matter how well prepared you are, no matter how well you know all the details of this and that article, what always wins is the ability to explain things in language so simple yet impactful that it inspires a positive reaction and support for what you are proposing. Therefore, all my in-depth analysis and all my legal arguments were distilled into blunt, stark messages wrapped in this short letter. This was the first reality check for my advocacy skills.

Did this letter influence or change anything? Hard to tell. Looking at Julia Reda’s vote count, there’s still a lot to be done. By the way, did you join our Action Day on 12 June? If not, even after 20 June it’s not too late to pick up the phone and Save Your Internet. Every call, email, post, video, shout for your freedom of expression counts!

Coming back to the advocacy lessons learned, the second reality check for my advocacy skills came from advice I received from one of the decision makers I met: the most effective and sure-to-be-taken-into-consideration format in which they would ideally like to receive amendments is a simple table with two columns. One column with the proposed legal text, and the other with how I want it to be changed. Simple and straightforward.

However, what I also know from previous experience is that decision makers also need long and detailed analysis that they can point to and base their decisions on. Therefore, in-depth analysis and formal opinions should not be underestimated. They just need to be complemented with catchy, simple and distilled documents. How to find the resources in small organisations to do both is still for me to find out in my next quest on advancing digital rights movements.

Now here’s what the exchange programme looked like in numbers:

    • 3 MEP meetings
    • 2 Member State Permanent Representation meetings
    • 3 copyright reform document reviews
    • 7 letters on copyright reform sent to decision makers
    • 2 European Parliament hearings on Cambridge Analytica
    • 1 ePrivacy meeting with Council attachées & civil society & lots of networking and Belgian fries 🙂

But this is just the content part. The other half of my exchange was focused on how to grow an organisation. On the admin side, Kirsten and Katarina fully immersed me in strategic planning and fundraising. As a concrete result, I built with their help a Case for Support document for ApTI, which will be used in fundraising activities. While there are a lot of tips & tricks that I can immediately implement, I am also more confident about how to take strategic planning by the horns once I’m back.

Big “Thank You” to the entire EDRi team for welcoming me into their busy office and to all EDRi members for supporting my Digital Rights Fund application and making this possible! Also, many thanks to Wikimedia for knowledge sharing and preparatory meetings!

ApTI tweets in English @ApTI_ro and started a Bucharest Digital Rights Meetup channel.

Read more:


EDRi’s “Brussels Exchange Programme” – turning theory into practice (07.02.2018)

(Contribution by Valentina Pavel, EDRi member ApTI, Romania)



13 Jun 2018

EU – Japan trade agreement undermines algorithmic transparency

By Vrijschrift

The EU trade agreement with Japan undermines algorithmic transparency, Dutch EDRi member Vrijschrift wrote in a letter to the Dutch Parliament. In order to have regulatory supervision, we need access to source code and algorithms. The Volkswagen emissions scandal has shown that devices can be programmed to be misleading. In addition, algorithms in decision making software can be biased. Facebook’s role in elections and referendums shows that the use of personal data is not only a civil rights issue, but may compromise the integrity of our institutions.


Politicians call for algorithmic transparency and software audits. However, the EU-Japan trade agreement’s software code clause limits the possibilities to audit software and algorithms. Under the agreement’s article 8.73, the EU and Japan may not require the transfer of, or access to, source code of software owned by a person of the other Party. The article provides some exceptions, but they have a limited scope or are limited by strict conditions. The clause is in conflict with important policy objectives; Vrijschrift calls for a parliamentary scrutiny reservation.

You can read Vrijschrift’s letter to the chairman of the trade committee Raymond de Roon below:

We would like to express our concerns regarding the trade agreements with Japan and Singapore. These agreements fall under the EU’s competence; no ratification by the Netherlands is necessary. The EU can already decide to sign the treaties on June 26. We believe that the House should make a parliamentary scrutiny reservation.

It has recently become clear that the protection of personal data is not just a matter of civil rights. The scandal surrounding Facebook has shown that also the integrity of our institutions is at stake. The European Commission and European politicians (e.g. Merkel and Verhoeven) rightly want greater algorithmic transparency. However, the EU-Japan trade agreement’s source code clause will undermine the investigation of algorithms. A clear conflict between an important policy objective and a trade agreement.

The European Commission recently proposed a stronger safeguard for the protection of personal data in trade agreements. This safeguard has not been included in the treaties with Japan and Singapore, although these treaties require the parties to allow cross-border data traffic. The Commission delivered half-measures, which we consider irresponsible in the light of the necessity to protect civil rights and the integrity of our institutions.

The treaties with Japan and Singapore limit the possibilities for reforming copyright and patent law. The treaty with Singapore contains higher damages than the ACTA treaty, which was rejected by the European Parliament.

The proposed treaties deserve serious scrutiny; we believe that the House should create room for this.

Read more:

Vrijschrift letter, English translation: EU trade agreement with Japan undermines algorithmic transparency

Vrijschrift letter (original in Dutch):

EU-Japan trade agreement enables Internet of Cheating Things

EU-Japan trade agreement not compatible with EU data protection

EU-Singapore trade agreement not compatible with EU data protection

EU-Japan trade agreement’s intellectual property chapter limits options for reform

ACTA-plus damages in EU-Singapore Free Trade Agreement

(Contribution by Ante Wessels, EDRi member Vrijschrift, the Netherlands)



13 Jun 2018

ePrivacy for Children: What is Data Protection Culture?

By Alternatif Bilisim

The General Data Protection Regulation (GDPR) attracted widespread attention and comment in recent weeks when it came into force on 25 May 2018. Having taken several years to get from being proposed by the European Commission to entering into force, the GDPR has been designed as a concerted, holistic and unifying effort to regulate personal data protection in the digital age.


At a time when many public, private and third sector organisations have only recently ‘gone digital’ and when data has very rapidly come to be seen as ‘a new currency’, the scope of application of the GDPR is vast. Serious fines can be applied to firms that do not abide by the new rules. This is no coincidence, of course; the recent Cambridge Analytica and Facebook violations of privacy forced the public debate to grow, and with it awareness of what is at stake.

It is not only the scandals on the surface that have piqued the interest of the average user, though; the capital and energy spent on the data gathering fetish of social media platforms is also a key determinant of the process. The right to erasure is also more easily applicable from now on, signifying more meaningful control over data and the erosion of the post-capitalist surveillance society. However, in the decade of tl;dr (too long; didn’t read) and post-truth, this type of detailed regulation might be a little too complicated for internet users of all ages to understand.

Through the lens of a researcher-mother, one is quickly struck by the image of a hyper-socialised millennial generation on massive platforms like Facebook and Instagram. The GDPR brings special conditions for children’s data. Yet living in Turkey with your child right beside you is no comfort; you are still spending 16+ hours of your day connected to the internet.

The GDPR makes some specific requirements in respect of children’s data, for reasons set out in recital 38: “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counseling services offered directly to a child.”

While this statement has much merit, it is only an explanatory recital, guiding implementation of the GDPR but lacking the legal force of an article. In a recent London School of Economics Media Policy Project roundtable, it became clear that there is considerable scope for interpretation, if not confusion, regarding: the legal basis for processing (including, crucially, when processing should be based on consent); the definition of an information society service (ISS) and the meaning of the phrase “directly offered to a child” in Article 8 (which specifies a so-called “digital age of consent” for children); the rules on profiling children; how parental consent is to be verified (for children younger than the age of consent); and when and how risk-based impact assessments should be conducted (including how they should cover intended or actual child users). It is also unclear in practice just how children will be enabled to claim their rights or seek redress when their privacy is infringed.

Already there are some surprises. WhatsApp, currently used by 24% of UK 12-15 year olds, announced it will restrict its services to those aged 16+, regardless of the fact that in many countries in Europe the digital age of consent is set at 13. Instagram is now asking its users if they are under or over 18 years old, perhaps because this is the age of majority in the United Nations Convention on the Rights of the Child (UNCRC)? We will see how things will unfold in the coming months.

In the meantime, a few suggestions are made by Sonia Livingstone of the London School of Economics in the light of a new project exploring how children themselves understand how their personal data is used, and how their data literacy develops between the ages of 11 and 16: (1) conducting focus group research with children; (2) organising child deliberation panels to formulate child-inclusive policy and educational/awareness-raising recommendations; and (3) creating an online toolkit to support and promote children’s digital privacy skills and awareness. The young generation reminds us once again of the responsibility for creating a commons data culture at grassroots level.

Do such changes mean effective age verification will now be introduced (leading to social media collecting even more personal data?), or will the GDPR become an unintended encouragement for children to lie about their age to gain access to beneficial services, as part of their right to participate? How will this protect them better? And what does this increasingly complex landscape mean for media literacy education, given that schools are often expected to overcome regulatory failures by teaching children how to engage with the internet critically? As the case of Turkey shows, teachers’ digital literacy skills need a serious and rapid boost, and, even more fundamentally, policies regarding internet governance and community education must be redrafted.

Translated from the original text by Asli Telli Aydemir, Alternative Informatics (Alternatif Bilisim)
You can read the original text in Turkish here.

Read more:

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)

EDRi General Director Joe McNamee live interview on TRTWorld

GDPR Explained Campaign

Time to Disagree Campaign

EDRi’s privacy for kids booklet

(Contribution by Alternatif Bilisim, EDRi member, Turkey)



13 Jun 2018

Litigation against the Danish government over data retention

By IT-Pol

Despite two rulings from the Court of Justice of the European Union (CJEU) in 2014 and 2016 against general and undifferentiated (blanket) data retention, a majority of EU Member States still have national data retention laws in place. Denmark is one of these Member States.


Two months after the Tele2 judgment of 21 December 2016 (joined cases C-203/15 and C-698/15), the Danish Minister of Justice told the Legal Affairs Committee of the Parliament that the Danish data retention framework does not comply with EU law because it covers every subscriber. At the same time, the Minister of Justice refused to repeal the illegal data retention provisions and argued that there was not a specific deadline for how quickly EU Member States are required to adapt their national laws to comply with a judgment from the CJEU.

As of June 2018, the Danish Ministry of Justice is officially waiting for guidance from the European Commission before a new data retention law can be proposed to the Danish Parliament. The European Commission has promised to deliver guidance on how Member States can comply with the Tele2 judgment, which bans blanket data retention but does not rule out targeted data retention. From the outside, it would appear that the Danish government has very little interest in resolving the current deadlock, since the existing data retention provisions are still in place. The same applies to the provisions for access to the retained data, which also require substantial changes to comply with the second part of the Tele2 judgment. For example, access to certain types of retained data, in particular location data, is not limited to cases involving serious crime.

A majority in the Danish Parliament has so far approved the government’s plan to postpone the revision of the now illegal data retention law. However, the Danish government is facing a new challenge, since the Association Against Illegal Surveillance filed a lawsuit on 1 June 2018 against the Minister of Justice demanding the immediate annulment of the data retention provisions. The Association Against Illegal Surveillance was formed in the beginning of 2018, shortly after the Minister of Justice announced that, contrary to his earlier expectations, there would not be a revision of the data retention law in the parliamentary year 2017-18. The association, led by spokesperson Rasmus Malver, who has a professional background in human rights law, initiated a very successful crowdfunding campaign on social media, which has so far collected 60,000 euros from more than 1,000 individuals, plus some larger donations from civil rights organisations, including Amnesty International Denmark.

On 4 June 2018, the data retention lawsuit against the Danish government received a considerable economic boost when the Danish Civil Rights Fund (Borgerretsfonden) backed the lawsuit with a guarantee to cover up to 50% of the legal expenses. This effectively doubles the contributions received through crowdfunding. The objective of the Civil Rights Fund is to promote the rights of the individual versus the state and to provide legal assistance to individual citizens whose rights have been violated by public authorities. Law professor and Civil Rights Fund board member Hanne Marie Motzfeldt gave the following explanation for the large economic donation to the data retention lawsuit, in an interview with the Danish newspaper Politiken: ”The foundation of our form of government is that public authorities comply with the law. If they fail to do so, we must use the courts.”

The Minister of Justice has not yet responded to the lawsuit. Denmark does not have a constitutional court or a specialised legal system for challenging the validity of laws or administrative provisions. Such legal challenges are handled by the ordinary court system as civil litigation procedures, starting at the lowest court level (District Courts). The Association Against Illegal Surveillance has applied to have the case transferred to the High Court, from which the appellate court would be the Danish Supreme Court. If approved, this is likely to save time.

Since legal challenges against laws or government decisions occur very infrequently in Denmark, it is difficult to predict how long the legal proceedings will take. The Ministry of Justice has various options for delaying the case. One of these options is to ask the court to rule on whether the plaintiff has legal standing in the case. The plaintiff has carefully addressed the issue of legal standing in the complaint to the court, but a dispute over legal standing will inevitably delay the proceedings and the possibility of getting a court ruling against data retention in Denmark.

Read more:

Denmark: Our data retention law is illegal, but we keep it for now (08.03.2017)

Eurojust: No progress to comply with CJEU data retention judgements (29.11.2017)

Website of the Association Against Illegal Surveillance (”Foreningen mod ulovlig logning”)

Website of the Civil Rights Fund (”Borgerretsfonden”), in Danish

The citizens’ lawsuit against the Minister of Justice receives a large economic boost, Politiken (in Danish, 04.06.2018)

(Contribution by Jesper Lund, EDRi member IT-Political Association of Denmark (IT-Pol), Denmark)



13 Jun 2018

Wiretapping & data access by foreign courts? Why not!

By Anamarija Tomicic

After the European Commission published two new legislative proposals to allow law enforcement authorities to reach across EU borders and access data directly from service providers, the EU Member States started working on this new “e-evidence” package. The proposal has so far been the object of widespread criticism from service providers and civil society organisations, including EDRi, because it raises serious questions concerning privacy, data protection and basic principles such as the right to defence and access to effective remedies. At the request of a very few Member States, the EU Council discussed adding two new elements to the already broad scope of the proposals: direct access to data and real-time interception of data. These new proposals add yet more concerns regarding individuals’ rights.


While the objective of this legislation is to improve criminal investigations by facilitating law enforcement authorities’ cross-border access to data in another EU Member State, the proposal creates a shortcut giving private companies a role previously carried out by judicial authorities. According to the existing proposal, companies that store individuals’ data – including big companies such as Facebook, Google and Microsoft, but more crucially, small companies without the resources or expertise of the internet giants – will be obliged to give access to individuals’ data if demanded by law enforcement authorities, in some cases without the intermediation of a court.

The proposal of some EU Member States to have direct access to data takes this practice a step further: authorities could access data stored on private companies’ servers at any time. Combined with real-time interception of data, the proposed law could enable mass surveillance of individuals across Europe without appropriate safeguards. On 4 June the EU Council decided to postpone until October 2018 the discussion on whether to include direct access to and real-time interception of data in the Regulation.

In addition to working on its own legislation, the EU is considering an executive agreement with the US Government based on the flawed US CLOUD Act. Such an agreement would grant the EU and the US mutual, direct access to data held by companies on either side of the Atlantic, and could potentially include real-time surveillance as foreseen in the US CLOUD Act.

In conclusion, some EU Member States are making these proposals even more extreme than they already are by pushing for real-time interception of and direct access to citizens’ data without appropriate safeguards, as well as for an agreement with the US that could have even further implications for mass surveillance and for individuals’ rights such as data protection and privacy. EDRi warned against these proposals even before the drafts were published and will keep working with other stakeholders and policy-makers to change this worrisome situation.

Read more:

Council Presidency Note on “E-evidence” (28.05.2018)

Outcome of the Council meeting on Justice and Home Affairs (04-05.06.2018)

EU “e-evidence” proposals turn service providers into judicial authorities (17.04.2018)

EDRi’s response and annex to the European Commission’s consultation on cross-border access to e-evidence (16-27.10.2017)


(Contribution by Anamarija Tomicic, EDRi Communications and Community Officer)



13 Jun 2018

Civil society calls for protection of communications confidentiality

By Diego Naranjo

On 31 May EDRi, Access Now, and Privacy International met attachés to the EU Council (representatives of EU Member States) who work on the ePrivacy Regulation proposal. Following up on our two recent letters on ePrivacy (here and here), the Dutch Permanent Representation in Brussels and the Bulgarian EU Council Presidency kindly hosted us to discuss the ePrivacy proposal.


During our meeting with the attachés we expressed, first of all, the need to adopt a strong ePrivacy Regulation in 2018. In an interconnected world, where our online behaviour and our private communications can be tracked by a duopoly of advertisers and a murky cloud of data brokers, the EU needs to take a step forward and ensure a high level of protection for the confidentiality of our electronic communications.

Second, we highlighted the need to clarify that privacy and confidentiality should cover our devices and information about them (location, type of software, etc.). We also stressed the need to protect communications metadata and to clarify the rules on the use of offline tracking for “measuring” purposes.

In addition, we stressed the threat to confidentiality that tracking walls represent, and how they are the opposite of informed consent. The current situation, in which users are required to agree to unlimited and unnecessary processing of their personal data on a “take it or leave it” basis, needs to be corrected by the ePrivacy Regulation.

Finally, we called for a strong provision requiring privacy by design and by default in software and hardware used for electronic communications. For our devices (IoT objects, laptops, smartphones…) to operate securely, the settings of all components of terminal equipment placed on the market should be configured, by design and by default, to prevent third parties from storing information on the equipment, from processing information already stored on it, and from using its processing capabilities.

The EU Council needs to finalise negotiations during the Austrian Presidency (starting in July 2018), while attending to the details of the text that still need to be improved. Now that the GDPR has entered into application in the European Union, the ePrivacy Regulation needs a final commitment from EU policy-makers in order to ensure legal certainty, enhanced privacy protections and a ban on pervasive tracking of individuals.

Read more:

Document presented to TELE Council attachés as our key points during the meeting with them on 31.05.2018

Mythbusting – Killing the lobby myths that are polluting the preparation of the e-Privacy Regulation

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)

Cambridge Analytica access to Facebook messages a privacy violation (18.04.2018)

(Contribution by Diego Naranjo, EDRi Senior Policy Adviser)