07 Feb 2018

Data protection – time for action

By Anne-Morgane Devriendt

On 24 January 2018, the European Commission (EC) published a Communication on the implementation of the General Data Protection Regulation (GDPR), which becomes applicable on 25 May 2018: “Stronger protection, new opportunities”.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

The Communication describes the preparatory work carried out by the Commission to support the implementation of the GDPR, and what the Commission plans to do to help Member States and companies comply with the new data protection framework.

Most of the work at the EU level has been done by the group of Data Protection Authorities (the so-called Article 29 Working Party). It has been preparing guidelines, on the basis of extensive consultations and workshops with a variety of stakeholders. More work still needs to be done in order to ensure the effective implementation of the new rules.

Although the GDPR is a Regulation and therefore applies “as is” in all Member States, some national legislation needs to be adapted to the new obligations it sets, especially regarding the flexibilities that the Regulation leaves to Member States’ discretion, on automated decision-making and transfers of personal data to third countries, among other things. Ironically, while industry demanded harmonisation at the start of the legislative process, it spent most of the decision-making process demanding national flexibilities and exceptions, leading to the opposite outcome to the one it initially asked for. Sometimes, one is left with the impression that lobbyists are working to create work for themselves.

At the moment of publication of this Communication, just four months before the GDPR becomes applicable, only two out of 28 Member States (Austria and Germany) have finished this legislative preparation. It also remains to be clarified how Member States will ensure that national Data Protection Authorities (DPAs) are given the means to fulfil their new functions as prescribed by the GDPR.

Finally, the Communication stresses that the core principles of data protection are not affected by the new Regulation. As a result, few changes are needed from organisations that already comply with the existing Data Protection Directive. However, the Commission notes that citizens and small and medium-sized companies are not well informed about the provisions of the GDPR. It has therefore launched guidelines on the new rules for businesses and on the rights of citizens.

One cannot help but wonder why neither Member States nor companies seem to be prepared for new legislation that has been discussed since the adoption of the Commission’s initial Communication in November 2010, and throughout the four years of legislative discussion that were shaped by an unprecedented lobbying campaign by parts of the industry. This ostensible lack of preparedness is also surprising bearing in mind that the Regulation does not change existing core principles, which controllers should already have been respecting since 1998 through the transposition (and enforcement) of the Data Protection Directive into national law.

Communication from the Commission (24.01.2018)

Commission’s GDPR guidelines for citizens and small and medium companies

PROCEED WITH CAUTION: Flexibilities in the General Data Protection Regulation (05.07.2016)

General Data Protection Regulation: Document pool

(Contribution by Anne-Morgane Devriendt, EDRi intern)



07 Feb 2018

The Bulgarian EU Council presidency & the latest assault on ePrivacy

By Anne-Morgane Devriendt

In January 2018, the Bulgarian Presidency of the Council of the European Union (EU) picked up where the Estonian Presidency left off on the ePrivacy Regulation. It issued two examinations of the last Estonian “compromise” proposal and asked national delegations for guidance on some issues. Together, the documents cover most of the key points of the text. While the Bulgarian Presidency brings clarity on some points, its questions pave the way to undermining the text – and therefore threaten the protection of citizens’ privacy, the confidentiality of communications of both citizens and businesses, the position of innovative EU companies, and trust in the online economy.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

One of the main lobbying arguments used against the ePrivacy proposal is its alleged redundancy: since the processing of personal data is already covered by the General Data Protection Regulation (GDPR), which becomes applicable in May 2018, why would an additional text be needed? The Bulgarian Presidency addresses this question by clarifying the ePrivacy Regulation’s role as lex specialis of the GDPR. In effect, the ePrivacy Regulation complements the GDPR, and where the two texts overlap, ePrivacy applies, as it provides a higher level of protection for communications data, which are sensitive data.

On privacy settings, covered by Article 10, the Bulgarian Presidency proposes either to keep the choices presented by the Estonian Presidency, providing for privacy by default and an easy way to change the settings, or to require more granularity in the settings by blocking the storage or the processing of data by third parties. This offers users a degree of control over third-party activities on their devices.

After this welcome clarification on this (rather simple) issue and this relatively privacy-friendly proposal, the Bulgarian Presidency then follows up on the undermining of the text already initiated by the Estonian Presidency in December 2017.

In the second document, which deals with the third Chapter of the proposal on the “rights to control electronic communications”, the Bulgarian Presidency mostly follows the Estonian proposal, except for publicly available directories. There, it proposes either to put obligations on both the providers of number-based communication services and publicly available directories, or to harmonise the rules around an opt-in or a right to object. As for direct marketing, the Bulgarian Presidency asks the national delegations to give their opinion on the need for uniform rules on voice-to-voice calls.

The Bulgarian Presidency also asks the national delegations to choose between two proposals concerning permitted processing of communications data (provided in Article 6): a “middle ground” that would allow further processing if it has no impact on privacy, or the inclusion of a “legitimate interest” ground for further processing of metadata. It is hard to understand what kind of further processing of communications data – or metadata – would not impact privacy (not least following the latest revelations of security breaches involving “non-personal” data), or how there could be a “legitimate interest” in the further processing of communications metadata, not least given the contrary positions already taken by the Court of Justice of the European Union in the Tele2 case.

On storage and erasure of electronic communications data, regarding data that is no longer needed to provide a service, the Bulgarian Presidency proposes to either delete the provisions on the deletion of data, or to keep them while deleting the provisions authorising recording or storage of the data by the end-user or a third-party entrusted by them. The first possibility would remove the protection of communication data at rest – ironically creating, at the request of industry lobbyists, the kind of incoherence between ePrivacy and the GDPR of which industry lobbyists have been warning. The second would keep the level of protection agreed upon by the European Parliament.

The worst attack of the Bulgarian Presidency on the text concerns the protection of terminal equipment (Article 8). In addition to the proposals put on the table by the Estonian Presidency, the Bulgarian Presidency proposes various exemptions from the requirement of consent for the processing of data from an individual device: for “non-privacy intrusive purposes”, or based on a “harm based approach” that would consider the levels of impact of different techniques on privacy. It also proposes to couple the addition of a “legitimate interest to deliver targeted advertisement” with a right to object, and even asks whether the text should cover the “access to services in the absence of consent to process information”. Again, it is hard to see how there could be a “legitimate interest to deliver targeted advertisement”, and how this would contribute to the protection of privacy. Such a convoluted legal construction would, in any event, be usable only by the largest targeted (or “surveillance”) advertising companies. If this approach is followed, the EU would end up with one piece of legislation (ePrivacy) that makes it easier to access data on a computer system, alongside another (the Directive on attacks against information systems, Directive 2013/40/EU) criminalising access to a computer system.

Although the Bulgarian Presidency did take a progressive stance on the links between the GDPR and ePrivacy, the rest of its proposals systematically undermine the text by lowering the level of protection of communications and privacy.

ePrivacy Regulation proposal – Examination (1) of the Presidency discussion paper (11.01.2018)

ePrivacy Regulation proposal – Examination of Articles 12 to 16 (25.01.2018)

Latest proposal by the Estonian Presidency (05.12.2017)

ePrivacy proposal undermined by EU Member States (10.01.2018)

(Contribution by Anne-Morgane Devriendt, EDRi intern)



10 Jan 2018

ePrivacy proposal undermined by EU Member States


The discussions on the ePrivacy Regulation continue in the European Union (EU) legislative process. They were on hold for a few weeks because of ongoing negotiations on the European Electronic Communications Code (EECC) – another big “telecoms” file that the Council of the European Union is working on.


On 5 December 2017, the Estonian Presidency of the Council proposed new compromises on key articles. This latest proposal for amendments is related to Articles 6, 7 and 8 of the draft ePrivacy Regulation, which concern permitted processing (Art. 6), storage and erasure of communications data (Art. 7) and the protection of stored communications in users’ devices (Art. 8).

Permitted processing

The provisions on permitted processing cover the circumstances under which electronic communications data may be processed.

The Estonian Presidency text suggests a few adaptations to bring it in line with the General Data Protection Regulation (GDPR) by including the legal ground of vital interest in Article 6(2)(d) and a new recital 17a, as well as provisions for accessibility in Article 6(3)(aa) and the new recital 19a. As currently drafted, these additions should not create new privacy risks.

Much more concerning is the addition, in Article 6(2)(e) and a recital 17b, of a legal ground for scientific research and statistical purposes, similar to the one in Article 9(2)(j) of the GDPR (research, unlike archiving, need not be “in the public interest”). The text of the recital and the Article states that this “type of processing should be subject to further safeguards to ensure privacy of the end-users by employing appropriate security measures such as encryption and pseudonymisation.” The use of “such as” means that these are merely possibilities, not requirements. On top of that, a lot of flexibility would be given to Member States, since these measures must be “based on Union or Member State law, which shall be proportionate to the aim pursued and provide for specific measures”. This creates risks for privacy, for security, and for the economic benefits generated by a more predictable, harmonising measure.
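To make concrete why pseudonymisation is a safeguard rather than full anonymisation, here is a minimal Python sketch; the key, the identifiers and the helper name are all invented for illustration. Pseudonyms are stable, so records remain linkable across datasets, and whoever holds the key can re-derive the mapping for any candidate identifier:

```python
import hmac
import hashlib

# Hypothetical illustration only: the key and identifiers are invented.
SECRET_KEY = b"held-by-the-provider"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed-hash (HMAC-SHA256) pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# The same input always maps to the same pseudonym, so records can still be
# linked across datasets...
p1 = pseudonymise("alice@example.com")
p2 = pseudonymise("alice@example.com")
assert p1 == p2

# ...and anyone holding the key can re-derive the mapping for candidate
# identifiers, which is why pseudonymised data remains personal data.
candidates = ["alice@example.com", "bob@example.com"]
reidentified = {pseudonymise(c): c for c in candidates}
assert reidentified[p1] == "alice@example.com"
```

This is why the GDPR treats pseudonymised data as personal data: the safeguard reduces risk, but it does not sever the link to the individual.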

Storage and erasure

The provisions on storage and erasure cover what protection should apply to different types of data and the deletion of data that is no longer needed to perform a service.

On storage and erasure, the Estonian Presidency “invites delegations to reflect on the need for” Art. 7(1), which ensures protection of communications data when it is at rest (i.e. stored in the provider’s network). Not including the protection of communications data at rest in the ePrivacy Regulation means that an e-mail would be sensitive data subject to the standards of the ePrivacy Regulation while being transmitted and then suddenly, upon arrival in the (online) mailbox of the provider, be subject to the General Data Protection Regulation. This would create the option of processing the content as non-sensitive data under the “legitimate interest” exception in the GDPR, facilitating invasive surveillance of content of the kind previously carried out by Gmail. Bizarrely, businesses lobby both for clear, predictable rules and for unclear, unpredictable rules like this.

Protection of terminal equipment

The provisions on protection of terminal equipment cover the rules for storing, or gaining access to, data on an individual’s communications device.

As regards terminal equipment, recital 21 adds precision on the use of cookies. Cookies can be used for both tracking and non-tracking purposes. The text recognises “authentication session cookies used to verify the identity of end-users engaged in online transactions” as legitimate, as well as some audience measuring. However, Articles 8(1)(d) and 8(2)(c) authorise audience measuring “by a third party on behalf of the provider of the information society service” and statistical counting without making pseudonymisation mandatory. This would facilitate the kind of cross-platform data processing done by, for example, Google Analytics.
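The distinction the recital draws, between a first-party authentication cookie and third-party audience measurement, can be illustrated with two hypothetical Set-Cookie headers (all domains, cookie names and values below are invented):

```python
# Hypothetical illustration of the two cookie categories discussed above.
# Domains, names and values are invented for the sketch.

# A first-party session cookie used only to authenticate the logged-in user;
# it is scoped to one site and expires with the session:
auth_cookie = (
    "Set-Cookie: session_id=abc123; Domain=shop.example; Path=/; "
    "Secure; HttpOnly; SameSite=Strict"
)

# A third-party audience-measurement cookie set by an analytics provider
# embedded on many sites; SameSite=None lets it be sent in cross-site
# contexts, and the long Max-Age makes it persistent:
tracking_cookie = (
    "Set-Cookie: visitor_id=xyz789; Domain=analytics.example; Path=/; "
    "Secure; SameSite=None; Max-Age=63072000"  # roughly two years
)

# Without mandatory pseudonymisation, visitor_id can be joined across every
# site that embeds the same provider, building a cross-platform profile.
print(auth_cookie)
print(tracking_cookie)
```

The technical difference is small (a few attributes), which is why the legal distinction between purposes matters so much.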

Recital 21 and Article 8(1)(e) also allow for the installation of security updates without the consent of the end-user, provided they are necessary, that the user is aware of them, and that the user can delay them. While security updates are particularly important to protect the user from attacks or breaches, consent should remain the sole legal basis for any processing linked to accessing terminal equipment. That way, instead of you merely being told that a “security update” is being installed on your phone, computer or other connected device, the software provider would have an incentive to be more transparent and give you more information about the update and what it is for.

Although not every proposed amendment threatens fundamental rights, the Estonian Presidency proposed to broaden the scope of exceptions in significant ways. It suggested authorising some processing that goes beyond what is strictly necessary, not keeping consent as the sole legal basis, and not putting in place strong safeguards to limit the impact of this broadening on privacy. This weakening of protections and predictability brings us closer to the kind of security and privacy chaos that the United States is experiencing. It would without doubt create the “chill on discourse and economic activity” that the failure to implement privacy and security measures has caused in the US. But at least Facebook and Google will be happy.

Presidency text leaked by Austrian government (05.12.2017)

Presidency text (05.12.2017)

e-Privacy: what happened and what happens next (29.11.2017)

e-Privacy revision: Document pool

(Contribution by Anne-Morgane Devriendt and Diego Naranjo, EDRi)



28 Sep 2017

European Parliament Consumer Protection Committee chooses Google ahead of citizens – again

By Joe McNamee

On 28 September, the European Parliament Committee on Internal Market and Consumer Protection (IMCO) adopted its Opinion on the proposed e-Privacy Regulation. Just as it did when reviewing the General Data Protection Regulation (GDPR), it is fighting hard to minimise the privacy and security of the European citizens it is meant to defend.

Currently, the surveillance-based online advertising market is dominated by Facebook and Google. It was estimated that, in the US, 99% of growth in the sector is being absorbed by those two companies. Most of the amendments adopted in IMCO serve the purpose of defending this anti-competitive, anti-choice, anti-privacy, anti-innovation framework.

Some of the many egregious efforts to water down the proposal include:

  • The Opinion adds a loophole to the text to reduce protection of communications. It suggests that the confidentiality of emails and other electronic communications should be protected only when they are “in transit”. This contradicts the entire logic of the legislation and, crucially, will allow companies such as Google to monitor the content of communications – when not “in transit”, but stored in their servers – to intensify their profiling of users.
  • It supported the European Commission’s position that devices and software should not prevent unauthorised access by default. Instead, there should simply be an option – possibly hidden somewhere in the settings, as is typical – to set security levels. Ironically, this position completely contradicts other EU legislation, which criminalises unauthorised trespassing on computer systems. It also contradicts the GDPR, which foresees data protection by design and by default.
  • The Opinion suggests that “further processing” of metadata – information about our location, the times we communicate, the people we communicate with and so on – should not require consent. The activity this permits is not vastly different from the definition of stalking in Dutch law: “systematically and deliberately intrudes on someone’s personal environment with the intention to force the other to do something, not to do something or to tolerate something…” (translation adapted from Wikipedia)
  • Instead of requiring providers to anonymise or delete personal data that is no longer needed, the IMCO Committee would also allow providers to keep unnecessary personal data, as long as it is pseudonymised. This means that the data collected and used by the provider could, at any moment, be re-identified, creating unnecessary security, data protection and privacy risks for the individuals concerned.
  • Rather cleverly, the Committee has exploited child protection to make profiling of users easier for companies. By saying that electronic communications data “shall not be used for profiling or behaviourally targeted advertising purposes” for children, it implies that this is acceptable for adults.
  • An obligation to inform users about security risks that have been discovered and how to minimise risks has been simply deleted from the Opinion.
  • The committee did not amend the Commission’s proposal to allow people to be tracked through their mobile devices as they move around physically (in towns, malls, airports, and so on). Individuals are expected to opt out individually each time they enter an area with one or more tracking networks – on condition that they see and react to the one or more signs that indicate that tracking is happening.

The extremist anti-consumer position of the Committee on Internal Market and Consumer Protection was rightfully ignored in the adoption of the General Data Protection Regulation. We can only hope that its Opinion will be ignored this time as well.

IMCO Compromise Amendments to the proposal for a e-Privacy Regulation

e-Privacy revision: Document pool

Are we on the right track for a strong e-Privacy Regulation? (28.07.2017)

(Contribution by Joe McNamee, EDRi)


06 Sep 2017

Winter is here

By Heini Järvinen

This autumn announces itself as much colder and more threatening for our rights and freedoms than we had expected: the e-Privacy Regulation and copyright reform are the two main pieces of EU legislation that will keep the digital rights defenders of EDRi’s Brussels office busy. We will also continue our work on the implementation of the General Data Protection Regulation (GDPR), the Audiovisual Media Services Directive (AVMSD), encryption, cross-border access to electronic evidence, and intermediary liability, among other dossiers.



e-Privacy Regulation

In January 2017, the European Commission published its proposal for an e-Privacy Regulation (ePR), which will cover privacy and data protection issues specific to electronic communications. Our longer position paper and quick guide provide an introduction to the most important points of the proposal. The next steps with this dossier will be the key votes in the European Parliament (EP). Some Committees are scheduled to vote on an Opinion in late September, and the lead Committee (Civil Liberties, Justice and Home Affairs, LIBE) is likely to vote on its final Report in October. The good news is that the LIBE draft Report already contains a number of amendments to the original Commission text that are in line with our suggestions. After the LIBE vote, the text is likely to go through “trilogues”, which are informal negotiations between the Council of the European Union, the European Parliament and the Commission. The text will then be adopted in the Parliament’s Plenary session. This is likely to happen in spring 2018 at the earliest.

Copyright reform

In September 2016, the European Commission published its proposal for a new Copyright Directive that aims at modernising EU copyright rules. The proposal poses a number of threats to our online freedoms, of which the most distressing is the introduction of a “censorship machine” that would filter all uploads to the internet (Article 13 of the proposal), in contradiction with at least four European court rulings and existing EU secondary law. Another provision introduces the so-called “link tax” (Article 11), which has already been an expensive failure in Germany and Spain. In addition to our continuous efforts to convince politicians to abandon the most damaging points of the proposal, we are also meeting and exchanging with activists around Europe to increase cooperation. Our event, the “School of Rock(ing) Copyright”, will take place in September in Ljubljana, and in October in Budapest and Lisbon.

General Data Protection Regulation (GDPR)

The General Data Protection Regulation (GDPR), the main text of EU legislation dealing with the protection of personal data, was finalised in 2016. However, because of the numerous, unpredictable flexibilities in the legislation, our work is not over yet. We are working together with many EDRi members, the European Consumer Organisation (BEUC) and academics, to promote the best possible implementation of the GDPR. We will be working on a “compliance check list for users”, general research about the effects of the Regulation, and technical tools to help citizens to exercise their rights.

E-evidence and cybercrime

The European Commission is preparing to present plans on dealing with access to electronic evidence (“e-evidence”). In addition, an optional protocol to the Cybercrime Convention (also known as “the Budapest Convention”) is currently being prepared, to be finalised by the end of 2019. We will be following the process closely, and sending submissions to the Council of Europe (CoE) to ensure that our rights and freedoms are considered in the final protocol. The first meeting of the drafting group will take place on 19-20 September 2017.

Audiovisual Media Services Directive (AVMSD)

In May 2016, the European Commission proposed to reform the Audiovisual Media Services Directive (AVMSD). The current AVMS Directive regulates traditional TV broadcasters and on-demand services in the EU. The new proposal broadens the scope of the Directive to cover the regulation of video-sharing platforms and potentially even other social media companies. Our main concern is the lack of clarity and safeguards for respecting the rule of law and protecting fundamental rights. The trilogue negotiations on the proposal have now started, following a vote in favour by 17 (in the Committee that took the decision) of the 751 Members of the European Parliament (adopting the Parliament’s negotiating position) and none of the EU Member States (adopting the negotiating position of the Council of the European Union). A few policy-makers will continue with the aim of reaching a political agreement by the end of the year. EDRi will issue recommendations and try to obtain improvements in the opaque process.

In addition to the priorities listed above, we will also be working on other topics, such as Notice and Action, digital trade, a Fundamental Rights Review Project on surveillance instruments, and following developments on net neutrality and whistleblowing protection.

e-Privacy revision: Document pool

Copyright reform: Document pool

The School of Rock(ing) EU Copyright 2017

Proceed with caution: Flexibilities in the General Data Protection Regulation

Access to e-evidence: Inevitable sacrifice of our right to privacy?

Audiovisual Media Services Directive reform: Document pool



28 Jun 2017

Are we on the right track for a strong e-Privacy Regulation?

By Diego Naranjo

European legislation protecting your personal data (the General Data Protection Regulation and Law Enforcement Directive on Data Protection) was updated in 2016, but the battle to keep your personal data safe is not over yet. The European Union is revising its legislation on data protection, privacy and confidentiality of communications in the digital environment: the e-Privacy Directive. This piece of legislation contains specific rules related to your freedoms online.

In today’s interconnected societies, the way we frame technology determines whether we are able to ensure the privacy of our most intimate conversations and thoughts. If policy-makers fail to achieve this and end up with a vague text full of loopholes because of political “compromises”, it will have a far-reaching impact on our online freedoms for decades to come.


In January 2017, the European Commission launched the reform of the e-Privacy legislation by proposing a harmonised framework. The text needs improving. Tracking walls and offline tracking should be banned, and encryption should be ensured, among other issues. Despite the flaws of the proposal, there are also positive aspects to build on.

On 9 June 2017, the lead committee of the European Parliament in charge of the dossier for the e-Privacy reform, the Committee on Civil Liberties (LIBE), published its draft Report on the ePrivacy Regulation, including amendments to the Commission’s original proposal. Marju Lauristin, the Parliamentarian in charge of the file for LIBE, has shown great determination to improve the protection of citizens’ privacy by proposing numerous positive changes to the Commission’s text. The changes proposed by LIBE will help to ensure legal certainty by limiting the ways data can be used (strict grounds for processing), broadening the type of trackers that will be regulated (not only “cookies”), and reinforcing users’ rights by promoting end-to-end encryption without backdoors. Ms Lauristin also proposed introducing a household exception similar to the one in the General Data Protection Regulation (GDPR), in order to make certain that accessibility tools are not unintentionally restricted by the legislation. In addition, the draft Report broadens the scope to include the protection of employees from surveillance by their employers, and adds the possibility of collective redress for organisations. However, the text could have gone a step further: for example, the absence of stronger language opposing offline tracking in the proposed amendments is regrettable. It is difficult to imagine how consent in those situations (one’s movements being tracked through Bluetooth or WiFi as one wanders around a town or a shopping centre) can be informed, how the data could be meaningfully anonymised, and how an opt-out would work without excluding users from certain services.

The LIBE Committee put forward a stronger text than the original proposal. It remains to be seen, however, whether strong opponents of the e-Privacy Regulation, such as the Rapporteur of the Committee on Legal Affairs (JURI) Axel Voss, will succeed in undermining the key elements of the text. Only a few Member States seem to have a strong position on this dossier, which makes it even harder to guess what the final result of this reform will look like. Member States have been heavily lobbied by the most regressive parts of the online industry for years. This resulted in fourteen Member States calling for “the right balance between digital products and services and the fundamental rights of data subjects” – as ridiculous as it seems to demand a balance between “products” and fundamental rights.

A lot of work still needs to be done to keep the best parts of the proposals, and to avoid the amount of disharmony and “flexibility” we ended up with in the GDPR. The way we will communicate with others, and the way our interconnected devices will work, depends greatly on the outcome of this new Regulation. Will the EU set the standards of protection high enough? The next months will give us an answer to this question.

Draft Report on the e-Privacy Regulation of the Committee on Civil Liberties, Justice and Home Affairs (LIBE) (09.06.2017)

e-Privacy revision: Document pool

Your privacy, security and freedom online are in danger

EDRi’s proposal for amendments to proposal for the e-Privacy Regulation

(Contribution by Diego Naranjo, EDRi)



17 May 2017

Big Data for Big Impact – but not only a positive one

By Guest author

Technology has changed, and keeps dramatically changing, our everyday life, transforming human communities into advanced networked societies. To celebrate this digital revolution, 17 May is dedicated to the “World Telecommunication and Information Society Day” (WTISD-17).

The theme of this year’s celebration is “Big Data for Big Impact”. Not so surprisingly, the buzzword “big data” echoes through our daily lives online. The chosen theme focuses on harnessing the power of big data to turn complex and imperfect pieces of data into a meaningful and actionable source of information for social good.

Big data has a potential to improve society – much like electricity or antibiotics. From health care and education to urban planning and protecting the environment, the applications of big data are remarkable. However, big data comes with big negative impacts. Big data can be used – by both advertisers and government agencies – to violate privacy. The power of big data can be exploited to monitor every single detail of people’s activities globally.

With 29 million streaming customers, Netflix is one of the largest providers of commercial media in the world. It has also become a trove of data for advertisers as it collects data on users’ activities – what, when and where they are watching, what device they are using, when they fast-forward, pause or stop. Just imagine a representative of Netflix sitting behind your couch, looking over your shoulder and making notes whenever you turn on the service. This applies to many online services, such as Google, Amazon, Facebook or YouTube.

Mass surveillance initiatives by intelligence agencies such as the US National Security Agency (NSA) and the UK Government Communications Headquarters (GCHQ) take this power to the next level to knock down every bit of personal space. Without big data, the scale at which such profiling is done today would not be possible.

It is very tempting to use the benefits of big data for all sorts of purposes: hiring new employees based on their social media activities, granting insurance based on fitness tracker data, airport security checks and future crime predictions based on cell phone call logs, to mention a few. But there are some fundamental problems with applying big data to services.

The first problem is that, knowingly or unknowingly, we all have biases when making decisions. If decisions made by millions of employers, policemen or judges over a long period are collected together, they bring in all of those biases, on a bigger scale. Big data may just refer to a large chunk of unstructured data, but the insights deduced from it rely on machine learning – which accumulates all possible biases, such as those related to gender and race. Algorithmic decision-making could turn out to be more biased than ever before, which would have a terrible effect on society.
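This feedback loop can be illustrated with a minimal sketch. The “hiring” scenario and all numbers below are hypothetical: a naive model that simply learns per-group hire rates from biased historical records reproduces the past reviewers’ bias wholesale, even though both groups are equally qualified.

```python
import random

random.seed(0)

# Hypothetical historical hiring records: both groups are equally
# qualified, but past human reviewers favoured group "A".
def biased_history(n=10_000):
    records = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        qualified = random.random() < 0.5          # same qualification rate
        favour = 0.9 if group == "A" else 0.5      # human bias in past decisions
        hired = qualified and random.random() < favour
        records.append((group, hired))
    return records

# A naive "model" that learns the historical hire rate per group.
# Trained on biased decisions, it inherits the bias at scale.
def learn_hire_rates(records):
    return {
        g: sum(1 for grp, hired in records if grp == g and hired)
           / sum(1 for grp, hired in records if grp == g)
        for g in ("A", "B")
    }

rates = learn_hire_rates(biased_history())
print(rates)  # group A's learned hire rate is far higher than group B's
```

Real machine-learning systems are far more complex, but the mechanism is the same: a model optimised to reproduce historical decisions reproduces their biases too.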

The second problem is the error rate: a study on automatic face recognition software found that error rates can vary between 3% and 20%. This means that the next time you go to the airport, your face could match one in a database of potential terrorists, and you could be pulled out for questioning or get into even more trouble. This happens in international airport transit on a daily basis. It is not possible to create 100% accurate models, and whenever assumptions are made about missing data, errors are inevitable.
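The scale of the problem follows from simple base-rate arithmetic. In the sketch below, the passenger volume and watch-list size are assumed for illustration; only the 3% figure comes from the error-rate range cited above. Because innocent travellers vastly outnumber genuine suspects, even a “good” error rate produces a flood of false matches.

```python
# Base-rate arithmetic: a small false-positive rate applied to a large
# population of innocent travellers swamps the handful of real matches.
def expected_matches(travellers, actual_suspects, false_positive_rate, hit_rate):
    false_alarms = (travellers - actual_suspects) * false_positive_rate
    true_hits = actual_suspects * hit_rate
    return false_alarms, true_hits

false_alarms, true_hits = expected_matches(
    travellers=1_000_000,      # passengers screened per day (assumed)
    actual_suspects=10,        # genuine watch-list matches among them (assumed)
    false_positive_rate=0.03,  # lower bound of the 3-20% range cited above
    hit_rate=0.97,             # complement of the same error rate
)
print(f"{false_alarms:,.0f} innocent travellers flagged vs {true_hits:.1f} real matches")
```

Under these assumptions, tens of thousands of innocent travellers would be flagged for every handful of genuine matches, which is why such systems need strict human oversight.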

Therefore, when dealing with big data, it is crucial to be extremely cautious about the quality and sources of the data, as well as about who can access it, and to what extent. If a data set stemming from diverse sources is handled with special care and anonymised thoroughly to protect privacy rights, big data can be used to solve complex societal problems. But if it is left unregulated or improperly regulated, and not tested for its fairness and biases, it can pose a serious threat to our human rights and fundamental freedoms.

EDRi fought for the EU General Data Protection Regulation (GDPR) to regulate these practices. Now that EU Member States are implementing the GDPR, it is up to them not to abuse the weak points of the Regulation to undermine the protection of European citizens’ data.

Video by EDRi member Privacy International: Big Data

Creating a Big Impact with Big Data

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)


03 May 2017

EU data protection watchdogs support stronger ePrivacy legislation

By Guest author

On 10 January 2017, the European Commission (EC) published its long-awaited proposal for an e-Privacy Regulation (ePR) to replace the 2002 e-Privacy Directive (ePD). In April 2017, two Opinions were issued to provide comments and recommendations on how to better safeguard the right to privacy, confidentiality of communications, and the protection of personal data in the proposed ePR; one by the Article 29 Data Protection Working Party (WP29), and another one by the European Data Protection Supervisor (EDPS).

----------------------------------------------------------------- Support our work with a one-off donation! https://edri.org/donate/ -----------------------------------------------------------------

Both Opinions share the view that the EC took the right decision in proposing this legislation. As mentioned by the WP29 and the EDPS, the proposal has several positive elements. However, the Council of the European Union and the European Parliament now need to focus on fixing the negative aspects that undermine the level of protection accorded by the General Data Protection Regulation (GDPR). The most sensitive issues among the improvements identified by both Opinions are:

Keep definitions in Regulation: Both the EDPS and WP29 share the opinion that the definitions under the ePR could become “moving targets”, if they are imported from the still unfinished European Electronic Communications Code (EECC). WP29 is proposing alternatives, including additional clarifications in the ePR or a simultaneous adoption of both proposals. The EDPS is asking for independent terms, as the definitions created for purposes of economic (market) regulation cannot be expected to be adequate for the protection of fundamental rights.

Privacy by default and by design are essential and not optional: The principle of “privacy by default”, as provided in the GDPR, has been replaced with “privacy by option” in the ePR. This implies that end-users would be given the “option” to determine through software settings whether they allow third parties to access or store information on their devices. Given the inconsistency of this provision with Article 25 of the GDPR, both authorities are proposing to impose an obligation on hardware and software providers to implement default settings that protect end-users’ devices against any unauthorised access to or storage of information on their devices. The EDPS goes a step further and argues for a provision that would allow users not only to be informed about privacy settings during installation or first use of the software, but also at other moments when they make significant changes to their devices or software.

Tearing down “tracking walls”: Tracking walls deny users access to the websites that they are seeking to use, because they do not consent to being tracked across other sites by large numbers of companies. Both Opinions advise against continuing to allow tracking walls, with some nuances. While WP29 recommends a weaker solution, the EDPS is asking for a complete and explicit ban on tracking walls. The EDPS argues that according to the GDPR, giving consent has to be a genuinely free choice, and these digital walls cannot result in real consent.

Neither online nor offline tracking: WP29 addresses the issue of offline tracking and argues that data controllers should, only in a limited number of circumstances, “be allowed to process the information emitted by the terminal equipment for the purposes of tracking their physical movements without consent of the individual concerned”. The WP29 Opinion also suggests that device tracking should only be permitted if the personal data collected is anonymised. The EDPS goes further and recommends that the provisions allowing for device tracking be deleted and replaced by a simpler requirement of consent (by all end-users concerned).

Keep an eye on the restrictions: Under the current Directive and the proposed Regulation, non-targeted data retention measures are allowed. Both Opinions re-state that national data retention regimes have to comply with the requirements of the European Union Charter of Fundamental Rights and of the case law of the Court of Justice of the European Union (CJEU), both of which require strict safeguards for mass storage of data.

Give redress to both individuals and organisations: The EC’s proposal leaves the right to collective redress out of the ePR text, which is puzzling. The EDPS took note of this omission and made it clear that an explicit provision for collective redress and effective remedies (or, more simply, a reference to Article 80 of the GDPR) is needed. Including such a provision is essential to ensure consistency with the GDPR, and to allow individuals to access collective redress through, for example, consumer groups.


WP29: Opinion 01/2017 on the Proposed Regulation for the ePrivacy Regulation (2002/58/EC) (04.04.2017)

EDPS: Opinion 6/2017 on the Proposal for a Regulation on Privacy and Electronic Communications (ePrivacy Regulation) (24.04.2017)

New e-Privacy rules need improvements to help build trust (09.03.2017)

e-Privacy Directive revision: Document pool

(Contribution by Romina Lupseneanu, EDRi intern)



19 Apr 2017

Dangerous myths peddled about data subject access rights

By Guest author

Now that the date on which the General Data Protection Regulation (GDPR) becomes enforceable is rapidly approaching, the European Data Protection Authorities (DPAs) are in the process of clarifying what their shared positions will be on various topics, including profiling. This is done through stakeholder consultation meetings.


During the latest meeting, one of the more contentious issues surrounding profiling turned out to be the transparency requirements regarding the algorithms used for automated decision making and profiling. While industry representatives in general provided constructive input on the various topics, this issue was more challenging. Several industry representatives were pushing for a very narrow interpretation of the right to access regarding the logic in automated decision making.

The basic argument is that industry has a right to hide the precise details of the calculations used to make decisions that discriminate against individuals. Three points were made in support of claims that the right of information regarding the logic of processing should not extend to disclosing the actual algorithms used:

  1. they would be protected trade secrets;
  2. intellectual property rights would preclude such disclosure;
  3. it would create a moral hazard in the case of profiling applied to fraud prevention.

Regarding the protection of trade secrets, the situation is fairly simple. The Trade Secrets Directive (2016/943/EU), for all its flaws, mentions specifically in its recitals that it shall not affect, among other rights, the right to access for data subjects. Since this Directive has to be implemented by June 2018, there is only a window of a few weeks in which trade secrets protections in some Member States could, theoretically, prejudice data subject access to the logic used in automated decision making. So for all practical intents and purposes, trade secret legislation cannot be invoked to prevent disclosure of such underlying algorithms.

As far as intellectual property rights are involved, this is even more of a non-issue. The only so-called intellectual property rights that bear relevance here are copyright law and patent law.

Software copyright law does not cover underlying algorithms, a view reiterated in the ruling in SAS Institute Inc. v World Programming Ltd (C‑406/10), in which the Court of Justice of the European Union (CJEU) held that the functionality of a computer program is not protected by copyright under the Computer Programs Directive (91/250/EEC).

As far as patent law is involved, the European Patent Convention states that “schemes, rules and methods for performing mental acts, playing games or doing business, and programs for computers” shall not be regarded as patentable inventions (Article 52(2)(c)). It would be difficult to argue that the logic for automated decision making in profiling of personal data is not a method for doing business. Moreover, a requirement for patent protection is disclosure of the underlying technology, which makes it even less plausible to argue that patent law could prejudice disclosure of the logic in automated decision making. Given that none of the other intellectual property rights even come close to covering the logic of algorithms, it follows that there are no barriers in intellectual property laws to disclosure of logic for automated decision making.

Even if there were intellectual property rights covering the underlying logic of software algorithms, it would still not necessarily follow that these should override data protection legislation. The CJEU has repeatedly found competition law interests to outweigh intellectual property interests in cases where it had to balance the two.

The last argument, that of moral hazard, may or may not come into play in the context of fraud detection and insurance risk assessment. First, the European legislator has never made any exception for it in the GDPR; second, it can be addressed by disclosing the logic as applied to a specific data subject instead of the general logic applied to all data subjects affected.

The logical conclusion for DPAs enforcing the GDPR in the future is to treat the aforementioned arguments from parts of industry with a great deal of scepticism. They simply have no basis in EU law or in reality.

Rejections of data subject access requests to the underlying logic of automated decision making based on “trade secrets” or “intellectual property rights” should be treated by DPAs as violations of the GDPR and addressed accordingly.


The Trade Secrets Directive (2016/943/EU)

Ruling of the SAS Institute Inc. v World Programming Ltd case

European Patent Convention

Insurance: How a simple query could cost you a premium penalty (30.09.2013)

(Contribution by Walter van Holst, EDRi member Vrijschrift, the Netherlands)



06 Apr 2017

The European Parliament adopts another resolution critical of the Privacy Shield


On 6 April 2017, the European Parliament (EP) voted on a motion for a resolution on the adequacy of the protection afforded by the EU-US Privacy Shield. The scheme gives the United States a unique arrangement for the transfer of personal data from the European Union to the United States. The Privacy Shield replaced the Safe Harbour decision, which used to serve the same purpose, until the Court of Justice of the European Union (CJEU) invalidated it in the Schrems case in 2015.

The EU-US Privacy Shield has been showered with criticism from the moment the details of the new(ish) rules were published. However, the European Commission (EC) proposed and adopted it anyway.

The Article 29 Data Protection Working Party of national data protection authorities and the European Data Protection Supervisor (EDPS) issued opinions expressing numerous concerns regarding the level of protection offered by the Privacy Shield and its compliance with the right to the protection of personal data and the right to privacy. Moreover, the EP adopted a similar resolution in May 2016, when the draft decision on the Privacy Shield was adopted, but its recommendations seem to have been ignored.

Today, the EP has adopted a new resolution which regards many of the Privacy Shield’s provisions as inadequate. The resolution lists several problems in the agreement and calls on the Commission to thoroughly examine them in its first annual review in September 2017.

Among the issues listed in the resolution, the EP highlights the lack of specific rules on automated decisions and of a general right to object, the need for stricter guarantees on the independence and powers of the Ombudsperson mechanism, the current non-quorate status of the Privacy and Civil Liberties Oversight Board, as well as the lack of concrete assurances that US agencies have established safeguards against mass and indiscriminate collection of personal data (bulk collection). Another flaw mentioned in the Parliament’s criticism is that the Privacy Shield is based on voluntary self-certification and therefore applies only to US organisations which have voluntarily signed up to it, meaning that many companies are not covered by the scheme.

Furthermore, the resolution asks the Commission to seek (long overdue) clarification on the legal status of the “written assurances” provided by the US and to make sure the commitments taken under the new decision will be kept by the new US administration. In addition, the resolution calls on the European data protection authorities (DPAs) to monitor the functioning of the Privacy Shield and to exercise their powers to suspend or ban data transfers “if they consider that the fundamental rights to privacy and the protection of personal data of the Union’s data subjects are not ensured.”

Unsurprisingly, the Parliament notes “with concern” the dismantling of the privacy rules of the US Federal Communications Commission (FCC). Last but not least, the EP calls on the Commission to take all the necessary measures for the Privacy Shield to comply with the General Data Protection Regulation (GDPR) and with the Charter of Fundamental Rights of the European Union.

The Privacy Shield has already been brought before the CJEU by two advocacy groups: EDRi member Digital Rights Ireland (case number T-670/16) and EDRi observer La Quadrature du Net (case number T-738/16). If the CJEU applies the same reasoning as for the former Safe Harbour agreement, the Privacy Shield will need a replacement very soon. It is to be hoped that the EC is preparing a contingency plan to resolve this situation as soon as possible, and will not wait (again, as it did with Safe Harbour and the two Data Retention rulings) until it is forced to act by the Court of Justice. If the Commission does this, then maybe, finally, fundamental rights can be protected on both sides of the Atlantic, and both citizens and businesses can enjoy the benefits of increased trust in the online environment.

Civil society letter: Without reforms in US surveillance laws, the Privacy Shield must be suspended (02.03.2017)

Privacy Shield: Privacy Sham (12.07.2016)

European Parliament confirms that “Privacy Shield” is inadequate (26.05.2016)