06 Sep 2017

Winter is here

By Heini Järvinen

This autumn looks much colder and more threatening for our rights and freedoms than we expected: the e-Privacy Regulation and the copyright reform are the two main pieces of EU legislation that will keep the digital rights defenders of EDRi’s Brussels office busy. We will also continue our work on implementation of the General Data Protection Regulation (GDPR), the Audiovisual Media Services Directive (AVMSD), encryption, cross-border access to electronic evidence, and intermediary liability, among other dossiers.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------


e-Privacy Regulation

In January 2017, the European Commission published its proposal for an e-Privacy Regulation (ePR), which will cover privacy and data protection issues specific to electronic communications. Our longer position paper and quick guide provide an introduction to the most important points of the proposal. The next steps with this dossier will be the key votes in the European Parliament (EP). Some Committees are scheduled to vote on an Opinion in late September, and the lead Committee (Civil Liberties, Justice and Home Affairs, LIBE) is likely to vote on its final Report in October. The good news is that the LIBE draft Report already contains a number of amendments to the original Commission text that are in line with our suggestions. After the LIBE vote, the text is likely to go through “trilogues”, which are informal negotiations between the Council of the European Union, the European Parliament and the Commission. The text will then be adopted in the Parliament’s Plenary session. This is likely to happen at the earliest in spring 2018.

Copyright reform

In September 2016, the European Commission published its proposal for a new Copyright Directive that aims at modernising EU copyright rules. The proposal poses a number of threats to our online freedoms, the most distressing of which is the introduction of a “censorship machine” that would filter all uploads to the internet (Article 13 of the proposal), in contradiction to at least four European court rulings and existing EU secondary law. Another provision introduces the so-called “link tax” (Article 11), which has already been an expensive failure in Germany and Spain. In addition to our continuous efforts to convince politicians to abandon the most damaging points of the proposal, we are also meeting and exchanging with activists around Europe to increase cooperation. Our event, the “School of Rock(ing) Copyright”, will take place in September in Ljubljana, and in October in Budapest and Lisbon.

General Data Protection Regulation (GDPR)

The General Data Protection Regulation (GDPR), the main text of EU legislation dealing with the protection of personal data, was finalised in 2016. However, because of the numerous, unpredictable flexibilities in the legislation, our work is not over yet. We are working together with many EDRi members, the European Consumer Organisation (BEUC) and academics, to promote the best possible implementation of the GDPR. We will be working on a “compliance check list for users”, general research about the effects of the Regulation, and technical tools to help citizens to exercise their rights.

E-evidence and cybercrime

The European Commission is preparing to present plans on dealing with access to electronic evidence (“e-evidence”). In addition, an additional protocol to the Cybercrime Convention (also known as “the Budapest Convention”) is currently being prepared, to be finalised by the end of 2019. We will be following the process closely, and sending submissions to the Council of Europe (CoE) to ensure that our rights and freedoms are considered in the final protocol. The first meeting of the drafting group will take place on 19-20 September 2017.

Audiovisual Media Services Directive (AVMSD)

In May 2016, the European Commission proposed to reform the Audiovisual Media Services Directive (AVMSD). The current AVMS Directive regulates traditional TV broadcasters and on-demand services in the EU. The new proposal broadens the scope of the Directive to cover the regulation of video-sharing platforms and potentially even other social media companies. Our main concern is the lack of clarity and safeguards for respecting the rule of law and protecting fundamental rights. The trilogue negotiations on the proposal have now started. The Parliament’s negotiating position was adopted by a vote of just 17 of its 751 Members (those in the Committee that took the decision), while the Council of the European Union’s negotiating position was adopted without a vote by any EU Member State. A few policy-makers will continue with the aim of reaching a political agreement by the end of the year. EDRi will issue recommendations and try to obtain improvements in this opaque process.

In addition to the priorities listed above, we will also be working on other topics, such as Notice and Action, digital trade, a Fundamental Rights Review Project on surveillance instruments, and following developments on net neutrality and whistleblowing protection.

e-Privacy revision: Document pool

Copyright reform: Document pool

The School of Rock(ing) EU Copyright 2017

Proceed with caution: Flexibilities in the General Data Protection Regulation

Access to e-evidence: Inevitable sacrifice of our right to privacy?

Audiovisual Media Services Directive reform: Document pool



28 Jun 2017

Are we on the right track for a strong e-Privacy Regulation?

By Diego Naranjo

European legislation protecting your personal data (the General Data Protection Regulation and Law Enforcement Directive on Data Protection) was updated in 2016, but the battle to keep your personal data safe is not over yet. The European Union is revising its legislation on data protection, privacy and confidentiality of communications in the digital environment: the e-Privacy Directive. This piece of legislation contains specific rules related to your freedoms online.

In today’s interconnected societies, the way we frame technology determines whether we can ensure the privacy of our most intimate conversations and thoughts. If policy-makers fail to achieve this and end up with a vague text full of loopholes because of political “compromises”, it will have a far-reaching impact on our online freedoms for decades to come.


In January 2017, the European Commission launched the reform of the e-Privacy legislation by proposing a harmonised framework. The text needs improving. Tracking walls and offline tracking should be banned, and encryption should be ensured, among other issues. Despite the flaws of the proposal, there are also positive aspects to build on.

On 9 June 2017, the lead committee of the European Parliament in charge of the dossier for the e-Privacy reform, the Committee on Civil Liberties, Justice and Home Affairs (LIBE), published its draft Report on the ePrivacy Regulation, including amendments to the Commission’s original proposal. Marju Lauristin, the Parliamentarian in charge of the file for LIBE, has shown great determination to improve the protection of citizens’ privacy by proposing numerous positive changes to the Commission’s text.

The proposed changes will help to ensure legal certainty by limiting the ways data can be used (strict grounds for processing), broadening the type of trackers that will be regulated (not only “cookies”), and reinforcing users’ rights by promoting end-to-end encryption without backdoors. Ms Lauristin also proposed introducing a household exception similar to the one in the General Data Protection Regulation (GDPR), in order to make certain that accessibility tools are not unintentionally restricted by the legislation. In addition, the draft Report broadens the scope to include the protection of employees from surveillance by their employers, and adds the possibility of collective redress for organisations.

However, the text could have gone a step further: for example, the absence of a stronger stance against offline tracking in the proposed amendments is regrettable. It is difficult to imagine how consent in such situations (one’s movements being tracked through Bluetooth or WiFi as one wanders around a town or a shopping centre) can be informed, how the data could be meaningfully anonymised, and how an opt-out would work without excluding users of certain services.

The LIBE Committee put forward a stronger text than the original proposal. It remains to be seen, however, whether strong opponents of the e-Privacy Regulation, such as the Rapporteur of the Committee on Legal Affairs (JURI) Axel Voss, will succeed in undermining the key elements of the text. Only a few Member States seem to have a strong position on this dossier, which makes it even harder to guess what the final result of this reform will look like. Member States have been heavily lobbied by the most regressive parts of the online industry for years. This resulted in fourteen Member States calling for “the right balance between digital products and services and the fundamental rights of data subjects” – as ridiculous as it seems to demand a balance between “products” and fundamental rights.

A lot of work still needs to be done to keep the best parts of the proposals, and to avoid the amount of disharmony and “flexibility” we ended up with in the GDPR. The way we will communicate with others, and the way our interconnected devices will work, depend greatly on the outcome of this new Regulation. Will the EU set the standards of protection high enough? The coming months will give us an answer to this question.

Draft Report on the e-Privacy Regulation of the Committee on Civil Liberties, Justice and Home Affairs (LIBE) (09.06.2017)

e-Privacy revision: Document pool

Your privacy, security and freedom online are in danger

EDRi’s proposal for amendments to proposal for the e-Privacy Regulation

(Contribution by Diego Naranjo, EDRi)



17 May 2017

Big Data for Big Impact – but not only a positive one

By Guest author

Technology has changed, and keeps dramatically changing, our everyday life, transforming human societies into advanced networked ones. To celebrate this digital revolution, 17 May is dedicated to World Telecommunication and Information Society Day (WTISD-17).

The theme for this year’s celebration is “Big Data for Big Impact”. Not so surprisingly, the buzzword “big data” echoes through our daily lives online. The chosen theme focuses on harnessing the power of big data to turn complex and imperfect pieces of data into a meaningful and actionable source of information for the social good.

Big data has a potential to improve society – much like electricity or antibiotics. From health care and education to urban planning and protecting the environment, the applications of big data are remarkable. However, big data comes with big negative impacts. Big data can be used – by both advertisers and government agencies – to violate privacy. The power of big data can be exploited to monitor every single detail of people’s activities globally.

With 29 million streaming customers, Netflix is one of the largest providers of commercial media in the world. It has also become a trove of data for advertisers as it collects data on users’ activities – what, when and where they are watching, what device they are using, when they fast-forward, pause or stop. Just imagine a representative of Netflix sitting behind your couch, looking over your shoulder and making notes whenever you turn on the service. This applies to many online services, such as Google, Amazon, Facebook or YouTube.
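To make this kind of logging concrete, a single viewing event might be modelled as in the sketch below. This is purely hypothetical: the field names are invented for illustration and do not describe any real provider’s data model, only the categories of data the paragraph above lists (what, when, where, which device, which playback action).

```python
from dataclasses import dataclass
import datetime

# Hypothetical playback event, modelled on the kinds of data described
# above: what, when and where someone watches, on which device, and
# every fast-forward, pause or stop. All names are illustrative.
@dataclass
class PlaybackEvent:
    user_id: str
    title: str                    # what is being watched
    action: str                   # "play", "pause", "fast_forward", "stop"
    timestamp: datetime.datetime  # when it happened
    device: str                   # which device was used
    approx_location: str          # where, e.g. derived from the IP address

# One such record is generated for every interaction with the service.
event = PlaybackEvent(
    user_id="u123",
    title="Some Series S01E01",
    action="pause",
    timestamp=datetime.datetime(2017, 5, 17, 21, 42),
    device="smart_tv",
    approx_location="Brussels, BE",
)
print(event.action)
```

Multiplied by millions of users and thousands of interactions each, a stream of such records is what makes the profiling described above possible.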

Mass surveillance initiatives by intelligence agencies such as the US National Security Agency (NSA) and the UK Government Communications Headquarters (GCHQ) take this power to the next level to knock down every bit of personal space. Without big data, the scale at which such profiling is done today would not be possible.

It is very tempting to use the benefits of big data for all sorts of purposes: hiring new employees based on their social media activities, granting insurance based on fitness-tracker data, airport security checks, and predicting future crimes based on mobile phone call logs, to mention a few. But there are some fundamental problems with applying big data to such services.

The first problem is that, knowingly or unknowingly, we all have biases when making decisions. If the decisions made by millions of employers, police officers or judges over a long period are collected together, all of those biases are brought in, on a bigger scale. Big data may just refer to a large chunk of unstructured data, but the insights deduced from it rely on machine learning – which accumulates all possible biases, such as those related to gender and race. Algorithmic decision-making could turn out to be more biased than ever before, which would have a terrible effect on society.

The second problem is the error rates: a study on automatic face recognition software found that error rates can vary between 3% and 20%. This means that the next time you go to the airport, your face could match one in a database of potential terrorists, and you could be pulled out for questioning or get into even more trouble. This already happens daily in international airport transit. It is not possible to create 100% accurate models, and whenever assumptions are made about a missing data sample, errors are inevitable.
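A quick back-of-the-envelope calculation shows why even the lower end of that error range matters at scale. The sketch below uses hypothetical numbers: only the 3%-20% error range comes from the study cited above, while the daily traveller count is invented for illustration.

```python
# Base-rate arithmetic: a small per-person error rate, applied to a large
# screened population, still produces a large absolute number of wrong flags.

def expected_false_matches(travellers: int, false_match_rate: float) -> float:
    """Expected number of innocent travellers wrongly flagged."""
    return travellers * false_match_rate

daily_travellers = 100_000  # assumption: one large airport, one day

for rate in (0.03, 0.20):   # the 3%-20% error range cited above
    flagged = expected_false_matches(daily_travellers, rate)
    print(f"error rate {rate:.0%}: roughly {flagged:,.0f} wrongly flagged per day")
```

Even at a 3% error rate, thousands of innocent people per day at a single busy airport could be wrongly matched; at 20%, it is tens of thousands.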

Therefore, when dealing with big data, it is crucial to be extremely cautious about the quality and sources of the data, as well as about who can access it, and to what extent. If a data set stemming from diverse sources is handled with special care and anonymised thoroughly to protect privacy rights, big data can be used to solve complex societal problems. But if it is left unregulated or improperly regulated, and not tested for fairness and biases, it can pose serious threats to our human rights and fundamental freedoms.

EDRi has fought for the EU General Data Protection Regulation (GDPR) to regulate this practice. Now EU Member States are implementing the GDPR, and it is up to them not to abuse the weak points of the Regulation to undermine the protection of the European citizens’ data.

Video by EDRi member Privacy International: Big Data

Creating a Big Impact with Big Data

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)


03 May 2017

EU data protection watchdogs support stronger ePrivacy legislation

By Guest author

On 10 January 2017, the European Commission (EC) published its long-awaited proposal for an e-Privacy Regulation (ePR) to replace the 2002 e-Privacy Directive (ePD). In April 2017, two Opinions were issued to provide comments and recommendations on how to better safeguard the right to privacy, confidentiality of communications, and the protection of personal data in the proposed ePR; one by the Article 29 Data Protection Working Party (WP29), and another one by the European Data Protection Supervisor (EDPS).


Both Opinions share the idea that the EC took the right decision when proposing this legislation. As mentioned by WP29 and the EDPS, the proposal has several positive elements. However, the Council of the European Union and European Parliament now need to focus on fixing the negative aspects that undermine the level of protection accorded by the General Data Protection Regulation (GDPR). The most sensitive issues among the improvements identified by both Opinions are:

Keep definitions in Regulation: Both the EDPS and WP29 share the opinion that the definitions under the ePR could become “moving targets”, if they are imported from the still unfinished European Electronic Communications Code (EECC). WP29 is proposing alternatives, including additional clarifications in the ePR or a simultaneous adoption of both proposals. The EDPS is asking for independent terms, as the definitions created for purposes of economic (market) regulation cannot be expected to be adequate for the protection of fundamental rights.

Privacy by default and by design are essential and not optional: The principle of “privacy by default”, as provided in the GDPR, has been replaced with “privacy by option” in the ePR. This implies that end-users would be given the “option” to determine through software settings whether they allow third parties to access or store information on their devices. Given the inconsistency of this provision with Article 25 of the GDPR, both authorities propose imposing an obligation on hardware and software providers to implement default settings that protect end-users’ devices against any unauthorised access to or storage of information on them. The EDPS goes a step further and argues for a provision that would ensure users are informed about privacy settings not only during installation or first use of the software, but also whenever they make significant changes to their devices or software.

Tearing down “tracking walls”: Tracking walls deny users access to the websites they are seeking to use unless they consent to being tracked across other sites by large numbers of companies. Both Opinions advise against continuing to allow tracking walls, with some nuances. While WP29 recommends a weaker solution, the EDPS asks for a complete and explicit ban on tracking walls, arguing that according to the GDPR, giving consent has to be a genuinely free choice, and these digital walls cannot result in real consent.

Neither online nor offline tracking: WP29 addresses the issue of offline tracking, and argues that data controllers should, only in a limited number of circumstances, “be allowed to process the information emitted by the terminal equipment for the purposes of tracking their physical movements without consent of the individual concerned”. The WP29 Opinion also suggests that device tracking should only be permitted if the personal data collected is anonymised. Moreover, the EDPS recommends that the provisions allowing for device tracking be deleted and replaced by a simpler requirement of consent (by all end-users concerned).

Keep an eye on the restrictions: Under the current Directive and the proposed Regulation, non-targeted data retention measures are allowed. Both Opinions re-state that national data retention regimes have to comply with the requirements of the European Union Charter of Fundamental Rights and of the case law of the Court of Justice of the European Union (CJEU), both of which require strict safeguards for mass storage of data.

Give redress to both individuals and organisations: The EC’s proposal puzzlingly leaves the right to collective redress out of the ePR text. The EDPS took note of this omission and made it clear that an explicit provision for collective redress and effective remedies (or, more simply, a reference to Article 80 of the GDPR) is needed. Including such a provision is essential to ensure consistency with the GDPR, and to allow individuals to access collective redress through, for example, consumer groups.


WP29: Opinion 01/2017 on the Proposed Regulation for the ePrivacy Regulation (2002/58/EC) (04.04.2017)

EDPS: Opinion 6/2017 on the Proposal for a Regulation on Privacy and Electronic Communications (ePrivacy Regulation) (24.04.2017)

New e-Privacy rules need improvements to help build trust (09.03.2017)

e-Privacy Directive revision: Document pool

(Contribution by Romina Lupseneanu, EDRi intern)



19 Apr 2017

Dangerous myths peddled about data subject access rights

By Guest author

Now that the date on which the General Data Protection Regulation (GDPR) becomes enforceable is rapidly approaching, the European Data Protection Authorities (DPAs) are in the process of clarifying what their shared positions will be on various topics, including profiling. This is done through stakeholder consultation meetings.


During the latest meeting, one of the more contentious issues surrounding profiling turned out to be the transparency requirements regarding the algorithms used for automated decision making and profiling. While industry representatives in general provided constructive input on the various topics, this issue was more challenging. Several industry representatives were pushing for a very narrow interpretation of the right to access regarding the logic in automated decision making.

The basic argument is that industry has a right to hide the precise details of the calculations used to make decisions that discriminate against individuals. Three points were made in support of claims that the right of information regarding the logic of processing should not extend to disclosing the actual algorithms used:

  1. they would be protected trade secrets;
  2. intellectual property rights would preclude such disclosure;
  3. it would create a moral hazard in case of applications of profiling in fraud prevention.

Regarding the protection of trade secrets, the situation is fairly simple. The Trade Secrets Directive (2016/943/EU), for all its flaws, mentions specifically in its recitals that it shall not affect, among other rights, the right to access for data subjects. Since this Directive has to be implemented by June 2018, there is only a window of a few weeks in which trade secrets protections in some Member States could, theoretically, prejudice data subject access to the logic used in automated decision making. So for all practical intents and purposes, trade secret legislation cannot be invoked to prevent disclosure of such underlying algorithms.

As far as intellectual property rights are concerned, this is even more of a non-issue. The only so-called intellectual property rights of relevance here are copyright law and patent law.

Software copyright law does not explicitly cover underlying algorithms, a view that is reiterated in the ruling of the SAS Institute Inc. v World Programming Ltd case (C‑406/10 CJEU), in which the Court of Justice of the European Union (CJEU) ruled that the functionality of a computer program is not protected by copyright under Computer Programs Directive (91/250/EEC).

As far as patent law is concerned, the European Patent Convention states that “schemes, rules and methods for performing mental acts, playing games or doing business, and programs for computers” shall not be regarded as patentable inventions (Article 52(2)(c)). It would be difficult to argue that the logic for automated decision making in the profiling of personal data is not a method for doing business. Moreover, patent protection requires disclosure of the underlying technology, which makes it even less plausible that patent law could prevent disclosure of the logic used in automated decision making. Given that none of the other intellectual property rights even come close to covering the logic of algorithms, it follows that intellectual property law poses no barrier to such disclosure.

Even if there were intellectual property rights covering the underlying logic of software algorithms, it would not necessarily follow that they should override data protection legislation. The CJEU has repeatedly found competition law interests to outweigh intellectual property interests in cases where it had to balance the two.

The last argument, that of a moral hazard, may or may not come into play in the context of fraud detection and insurance risk assessment. First, the European legislator has never made any exception for it in the GDPR; second, it can be addressed by disclosing the logic as applied to a specific data subject instead of the general logic applied to all data subjects affected.

The logical conclusion for DPAs enforcing the GDPR in the future is to treat the aforementioned arguments from parts of industry with a great deal of scepticism. They simply have no basis in EU law or in reality.

Rejections of data subject access requests to the underlying logic of automated decision making based on “trade secrets” or “intellectual property rights” should be treated by DPAs as violations of the GDPR and addressed accordingly.


The Trade Secrets Directive (2016/943/EU)

Ruling of the SAS Institute Inc. v World Programming Ltd case

European Patent Convention

Insurance: How a simple query could cost you a premium penalty (30.09.2013)

(Contribution by Walter van Holst, EDRi member Vrijschrift, the Netherlands)



06 Apr 2017

The European Parliament adopts another resolution critical of the Privacy Shield


On 6 April 2017, the European Parliament (EP) adopted a motion for a resolution on the adequacy of the protection afforded by the EU-US Privacy Shield. The scheme provides a special arrangement for transfers of personal data from the European Union to the United States. The Privacy Shield replaced the Safe Harbour decision, which used to serve the same purpose until the Court of Justice of the European Union (CJEU) invalidated it in the Schrems case in 2015.

The EU-US Privacy Shield has been showered with criticism from the moment the details of the new(ish) rules were published. However, the European Commission (EC) proposed and adopted it anyway.

The Article 29 Data Protection Working Party of national data protection authorities and the European Data Protection Supervisor (EDPS) issued opinions expressing numerous concerns regarding the level of protection offered by the Privacy Shield and its compliance with the right to the protection of personal data and the right to privacy. Moreover, the EP adopted a similar resolution in May 2016, when the draft decision on the Privacy Shield was adopted, but its recommendations seem to have been ignored.

Today, the EP has adopted a new resolution which regards many of the Privacy Shield’s provisions as inadequate. The resolution lists several problems in the agreement and calls on the Commission to thoroughly examine them in its first annual review in September 2017.

Among the issues listed in the resolution, the EP highlights the lack of specific rules on automated decisions and of a general right to object, the need for stricter guarantees on the independence and powers of the Ombudsperson mechanism, the current non-quorate status of the Privacy and Civil Liberties Oversight Board, and the lack of concrete assurances that US agencies have established safeguards against mass and indiscriminate collection of personal data (bulk collection). Another flaw singled out in the Parliament’s criticism is that the Privacy Shield is based on voluntary self-certification and therefore applies only to US organisations that have voluntarily signed up to it, which means that many companies are not covered by the scheme.

Furthermore, the resolution asks the Commission to seek (long overdue) clarification on the legal status of the “written assurances provided” made by the US and to make sure the commitments taken under the new decision will be kept by the new US administration. Furthermore, the resolution calls on the European data protection authorities (DPAs) to monitor the functioning of Privacy Shield and to exercise their powers to suspend or ban data transfers “if they consider that the fundamental rights to privacy and the protection of personal data of the Union’s data subjects are not ensured.”

Unsurprisingly, the Parliament notes “with concern” the dismantling of the privacy rules of the US Federal Communications Commission (FCC). Last but not least, the EP calls on the Commission to take all the necessary measures to make the Privacy Shield comply with the General Data Protection Regulation (GDPR) and with the Charter of Fundamental Rights of the European Union.

The Privacy Shield has already been brought before the CJEU by two advocacy groups: EDRi member Digital Rights Ireland (case number T-670/16) and EDRi observer La Quadrature du Net (case number T-738/16). If the CJEU applies the same reasoning as for the former Safe Harbour agreement, the Privacy Shield will need a replacement very soon. It is to be hoped that the EC is preparing a contingency plan to resolve this situation as soon as possible, and will not wait (again, as it did with Safe Harbour and the two Data Retention rulings) until it is forced to act by the Court of Justice. If the Commission does this, then maybe, finally, fundamental rights can be protected on both sides of the Atlantic, and both citizens and businesses can enjoy the benefits of increased trust in the online environment.

Civil society letter: Without reforms in US surveillance laws, the Privacy Shield must be suspended (02.03.2017)

Privacy Shield: Privacy Sham (12.07.2016)

European Parliament confirms that “Privacy Shield” is inadequate (26.05.2016)


22 Feb 2017

Consultation on multilateral investment court misses the point

By Guest author

The European Commission has launched a consultation on establishing a multilateral investment court, which would serve as a permanent body to decide investment disputes. The court would replace the controversial investor-to-state dispute settlement (ISDS) mechanisms in existing and future trade and investment treaties. It would interpret the substantive rules in these treaties, which provide a high level of legal protection for investors. This would leave states little or no right to regulate, as regulation would always happen under the (real or perceived) threat of supranational litigation.


The issue at hand is that the consultation has a narrow scope with no regard to social impacts, including fundamental rights. Therefore it is crucial to react. The deadline for submitting comments on the questionnaire on options for a multilateral reform of investment dispute resolution is 15 March 2017.

The multilateral investment court proposal is based on an Inception Impact Assessment which presents various scenarios. Its baseline scenario – what would happen without EU policy changes – is just one sentence long and does not expect the court to have social (or environmental) impacts. This baseline ignores existing impacts, a huge expansion (through new treaties) of the foreign direct investment covered, and a greater scope, as EU trade and investment treaties bring EU decisions within the reach of investment mechanisms. A more comprehensive baseline scenario would address these growing social impacts.

Compared to ISDS, a multilateral investment court would bring institutional improvements. Such improvements, however, do not solve the systemic issues of specialised, supranational adjudication, which creates a high risk of expansive interpretations of investors’ rights: specialised courts tend to interpret expansively, and the supranational level lacks effective instruments to correct such interpretations.

A multilateral investment court would shift the balance between investments on the one hand and democracy and fundamental rights on the other. This undermines our values, ability to reform, and ability to respond to crises.

Foreign investors would be able to use a multilateral investment court to challenge EU data protection enforcement measures. This could apply to, for instance, the suspension of cross-border data flows or fines imposed by supervisory authorities on data controllers and data processors under the General Data Protection Regulation (GDPR). A multilateral investment court would also impede reform of “intellectual property” rights.

The Commission’s consultation seems designed to keep social (and environmental) impacts out of the consultation’s results. In light of the need to protect fundamental rights, the EU cannot ignore, legitimise, or perpetuate these increasing impacts. With a baseline scenario showing growing impacts on fundamental rights, the Commission should work out scenarios that would reduce them.

General Data Protection Regulation: Document pool

Questionnaire on options for a multilateral reform of investment dispute resolution

Multilateral investment court assessment obscures social and environmental impacts

Defend democracy: draft answers for new ISDS consultation

ENDitorial: EU Commission ISDS proposal – a threat to democracy

(Contribution by EDRi member Vrijschrift, The Netherlands)



25 Jan 2017

e-Privacy Regulation: Good intentions but a lot of work to do

By Diego Naranjo

On 10 January 2017, the European Commission published its long-awaited proposal for an e-Privacy Regulation (Regulation on Privacy and Electronic Communications, ePR) to replace the 2002 e-Privacy Directive (Directive 2002/58/EC, ePD).

EU legislation on data protection is divided between general legislation (the 1995 Directive, soon to be replaced by the General Data Protection Regulation) and legislation specifically covering privacy in the communications sector, the e-Privacy Directive.


The ePD has two functions. Firstly, it provides additional clarity and predictability to allow the principles in the general legislation to be implemented in the complex environment of communications. Secondly, it serves as the EU legislative instrument to give meaning to the fundamental right to freedom of communications.

The proposed draft Regulation contains a number of provisions which, if adopted and effectively implemented, should address some of the current gaps or lack of clarity in the protection of the confidentiality of electronic communications and of information stored on users’ devices. Consultations and polls have shown that citizens are concerned about their privacy and about how companies make use of their personal information online. Although the Commission has rightly identified and addressed most of the key issues and objectives in the proposal, strong forces seem to have watered down the text considerably, compared to the earlier version that was leaked in December 2016. For example, the reference to “privacy by design and by default” that was changed in Article 10 will need to be put back, so that protections are not lowered to the current “privacy by option”, under which the browser merely offers the user options on the degree of online privacy.

Among the improvements needed, the European Parliament will need to make sure that the definitions of the text (cross-referenced to the European Electronic Communications Code, EECC, which is still being discussed) do not lead to a reduced scope of the e-Privacy Regulation. Furthermore, the scope of these definitions in the ePR relates to electronic communication networks, while in the leaked version it also referred to electronic communication services. This is a significant reduction in the scope of the proposed ePR.

Regarding the substance of the proposal, one of the key issues, the processing of content (“what we talk about”) and metadata (“when and with whom we communicate”), raises some concerns: both content and metadata, the latter of which can sometimes be more sensitive than the content of our online interactions, could be used for additional purposes by, for example, our email providers, if the user has “consented” to this. The way this consent is obtained in practice will need to be carefully addressed. If the legislator cannot prevent consent from being considered valid when given, for example, under over-broad Terms and Conditions or through pre-ticked boxes, the e-Privacy Regulation would fall below the standards needed to effectively protect our communications.

The section on access to devices is probably the part of the proposal that has drawn the most attention, since it regulates the use of tracking technologies such as tracking cookies. The text establishes that the terminal equipment of end-users (smartphones, laptops, but also, arguably, an e-fitness device or any other device that is part of what we call the “Internet of Things”) is part of the individual’s private sphere. Access to these devices and to any information stored in or emitted by such equipment would fall under the scope of the ePR. However, here too, “consent” is the key that could give access to our personal devices, with the same risks mentioned above. Finally, the exceptions allowing Member States to restrict the very protections that the Regulation is trying to provide are among the most worrying parts of the text, along with the unexpected absence of any reference to collective redress in the article on remedies (Article 21).

Citizens have repeatedly expressed the need for strong protections for the privacy and confidentiality of communications. However, there seems to be a lot of work ahead to strengthen and refine the text presented by the Commission.

EDRi: e-Privacy document pool

Proposal for a Regulation on Privacy and Electronic Communications (10.01.2017)

Eurobarometer on ePrivacy (19.12.2016)

(Contribution by Diego Naranjo, EDRi)



11 Jan 2017

2017 – another extremely challenging year for digital rights


The agenda of the year 2016 for the protection of digital rights was filled with challenges, and it looks like 2017 is not going to be any easier.

Since the Digital Single Market is one of the priorities of the Maltese presidency of the Council of the European Union, we can expect more policy developments affecting citizens’ rights and freedoms online in 2017. In its work programme, Malta pledges to pursue talks on geoblocking, roaming fees, connectivity, high frequencies and cross-border portability.

While taking advantage of the single market to benefit economies by scrapping trade barriers and giving European citizens access to services, it is crucial to keep the focus on improving data protection, defending freedom of expression and protecting citizens’ right to privacy.


What were the crucial policy developments in 2016? What do we expect to happen in 2017, and what are our key priorities for the year ahead?

Data protection and privacy

In 2016, the European Parliament adopted the General Data Protection Regulation (GDPR) and the Law Enforcement Data Protection Directive (LEDP), which will apply from 2018. EDRi welcomed the overall positive outcome of the GDPR, but regrets that the initial high expectations were not realised. The Commission adopted the Privacy Shield adequacy decision, which has already been challenged before the Court of Justice of the European Union (CJEU) and rejected by the European Parliament. The EU/US Umbrella Agreement, which was judged to be incompatible with EU law by the European Parliament’s legal service, was also approved.

As for 2017, e-Privacy will be one of EDRi’s main priorities. On 10 January, the European Commission published its proposal for the e-Privacy Regulation. This legislation is crucial to provide clear rules on tracking individuals as they surf the web, and freedom of communication more generally. To promote trust, privacy and innovation, the proposal needs significant improvement.


In 2017, we will provide input on discussions around cross-border access to evidence and the protection of encryption, as well as on discussions around the Council of Europe’s Budapest Convention on Cybercrime, with a particular interest in the hot topic of “access to evidence”. Weakening procedural rules for access to communications data by foreign governments would obviously have major implications for privacy and security.

Net neutrality

In 2016, the Body of European Regulators of Electronic Communications (BEREC) published its guidelines on the implementation of European net neutrality rules. Thanks to our hard and persistent work, the guidelines reflect our recommendations quite well.

In 2017 we will keep on campaigning for net neutrality by providing input to discussions around the BEREC regulation, and monitoring the Telecoms Package review. In December, we reported on the success of one of our Austrian members in ensuring the effective implementation of the new rules.


Copyright reform

The current European copyright system is broken and must be changed. The European Commission has set copyright reform on its agenda as one of the foundations for building the Digital Single Market. In 2016, the Commission issued highly criticised draft legislation. The proposed Copyright Directive could hardly be worse, even including a proposal for upload filtering, despite the fact that the Court of Justice of the European Union has already rejected this approach.

In 2017, the European Parliament and Council will discuss the new proposal. We will closely follow the discussions and advocate for amendments to improve the parts of the text that can be improved and rejection of the parts that cannot.



23 Nov 2016

#5 Freedom not to be labelled: How to fight profiling


This is the fifth blogpost of our series dedicated to privacy, security and freedoms. In the coming weeks, we will explain how your freedoms are under threat, and what you can do to fight back.


Profiling: What is it and how does it work?

Algorithms gather data from your social media activities, emails, browsing history and so on. As the Internet of Things becomes more and more widespread, it adds its share to the amount of information collected and stored. With all this data available about your personality, preferences and activities, you can be more and more easily labelled and placed in categories.

These categories may or may not be correct. You might end up “mislabelled” and put into a wrong category. For example, according to a French government website, you might be in the process of being radicalised if you change your eating habits, leave full-time education or stop your sporting activities and stop watching TV. Of course, you might just be a student writing your thesis.

Research has shown that, for example, a person’s ethnic group, sexual orientation, religion or relationship status can be guessed with surprising accuracy simply by assessing their Facebook “likes”. These inferences are possible even though many users avoid clicking on links that would obviously reveal such details.
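As a toy illustration of how such inferences work – with invented data and a deliberately trivial scorer, not the method or dataset of the actual research – even counting the “likes” a profile shares with labelled users can suggest a label:

```python
# Hypothetical training data: each entry is (pages a user liked, a known
# attribute of that user). Real studies used tens of thousands of profiles
# and statistical models rather than this naive overlap count.
users = [
    ({"cooking", "hiking"}, "A"),
    ({"cooking", "opera"}, "A"),
    ({"gaming", "hiking"}, "B"),
    ({"gaming", "sci-fi"}, "B"),
]

def predict(likes: set) -> str:
    """Guess a label by counting the likes shared with each labelled user."""
    scores = {}
    for user_likes, label in users:
        scores[label] = scores.get(label, 0) + len(likes & user_likes)
    return max(scores, key=scores.get)

print(predict({"opera", "cooking"}))  # prints "A"
```

The point is not the algorithm but the principle: no single “like” is revealing, yet their combination places you in a category, whether or not that category is correct.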

Based on this “labelling”, decisions can be taken about you: if you will be selected for a job interview, or picked for a special security screening at the airport. Or you could be offered either a discount or higher prices for a service or a product.

How to claim back your freedom not to be labelled

If you believe that a profiling measure has produced legal effects or significantly affected you (creditworthiness, reliability, conduct), you can contact your Data Protection Authority (DPA) to exercise your rights, such as the right to object to automated decision-making and the right of access to the information collected about you. Unfortunately, not all DPAs have a user-friendly approach, and filing a request can be fairly complex in some countries, such as Belgium. However, in other countries, like France, the authorities offer a template-based model to simplify the complaint process for their citizens. The new General Data Protection Regulation (GDPR), which is due to become binding law in all EU Member States in 2018, will strengthen and clarify both these rights and the ability of national data protection authorities to enforce them.

Random Agent Spoofer is an add-on for the Firefox browser. It hinders browser fingerprinting – the collection of information that allows websites to identify you – by letting you automatically switch between random browser profiles.

Self-Destructing Cookies is an add-on that removes general-purpose cookies as soon as they are no longer used by any open browser tab. It also detects and removes tracking cookies as soon as they are spotted.

$heriff lets you detect differential pricing in real time.
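To see why a tool like Random Agent Spoofer matters, here is a toy sketch in Python – with invented attribute names, not the code of any real tracker or add-on – of how fingerprinting combines individually harmless browser attributes into one identifier:

```python
import hashlib

def browser_fingerprint(attrs: dict) -> str:
    """Combine browser-exposed attributes into a stable identifier."""
    # Sort the keys so the same configuration always serialises identically.
    canonical = "|".join(f"{key}={attrs[key]}" for key in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Two visits with an identical configuration yield the same identifier...
a = browser_fingerprint({"ua": "Firefox/52.0", "screen": "1920x1080", "tz": "UTC+1"})
b = browser_fingerprint({"ua": "Firefox/52.0", "screen": "1920x1080", "tz": "UTC+1"})
# ...while randomising even one attribute, as a spoofing add-on does,
# produces a completely different one, breaking the link between visits.
c = browser_fingerprint({"ua": "Chrome/55.0", "screen": "1920x1080", "tz": "UTC+1"})
```

No cookies are involved: the tracker recognises you from the configuration your browser volunteers, which is why randomising those attributes is an effective countermeasure.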

In the webseries “Do Not Track”, produced by ARTE TV in collaboration with Mozilla, you can discover more about profiling – for example, how much data you provide when “liking” things on Facebook, and how that affects not only you but also your friends and relatives. Watch the third episode, “Like mining”, here:


What can politicians do to safeguard your freedoms online?

The rules on online privacy in the EU (the ePrivacy Directive) will soon be updated. This law deals with privacy and the confidentiality of communications for the entire EU, and it affects tracking and other issues related to your freedoms online. Are politicians ready to fight for your protection?

Read our previous blogposts here, and stay tuned to our next blogposts to know more about your freedoms online, and how they are threatened!

Read more:

6 times it’s more expensive to be a woman (12.04.2016)

Need a Reservation? That Could Depend On How Big You Are on Twitter (Really) (30.09.2010)

Is social profiling discrimination? (03.05.2012)

The dangers of high-tech profiling, using Big Data (07.08.2014)

Do Not Track: Episode 3 – Like Mining