02 May 2018

Facebook: Unanswered questions

By Joe McNamee

On 9 April 2018, EDRi received an invitation from Facebook to attend a meeting to discuss the loss of trust in Facebook, following the Cambridge Analytica scandal. The meeting was proposed for 26 April.

It struck us that, if Facebook wanted an honest exchange, it would be happy to answer some of the most obvious outstanding issues.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Encouragingly, Facebook said it would welcome the questions and said that they would still also like the meeting on 26 April.

The questions were sent on 16 April and… we never heard from Facebook again…

Here they are:

1. Facebook’s new policy is based on an opt-in for facial recognition, which is applied to inform Facebook users when their faces appear in photos uploaded by other users. Does this mean that Facebook will index all facial profiles in any photo uploaded, regardless of any consent by any person depicted? Please answer with “yes” or “no” and explain.

1b. More specifically, will Facebook refrain from analysing any photograph uploaded by any user for biometric data about persons depicted on those photos until it has received an opt-in by every person depicted on those photos? Please answer with “yes” or “no”.

2. You state the following: “Second, we’ll ask people who’ve previously chosen to share their political, religious, and “interested in” information in their profile to check that they want to continue to share it.”

Does the above mean that any of the above data will be deleted if Facebook does not receive an explicit consent to retain it? Please answer with “yes” or “no”.

If “yes”, what will be the cut-off date before Facebook starts deleting such data?

2b. If by “sharing” it is meant that the scope of the discontinuation is limited to sharing with other Facebook users and/or Facebook affiliates, how does Facebook consider that this complies with the requirements of Article 9 GDPR for processing these special categories of data?

3. Privacy International created a new Facebook profile to test default settings. By default, everyone can see your friends list & look you up using the phone number you provided. This is not what proactive privacy protection looks like. How does this protect users by design and by default?

4. According to your notification, a “small number of people who logged into ‘This Is Your Digital Life’ also shared their own News Feed, timeline, posts and messages which may have included posts and messages from you”. Why was this not notified to the appropriate national authorities immediately? Are other apps also able to share / receive messages from me?

5. If a similar situation to the one involving Cambridge Analytica were, despite your efforts, to arise again, who would be responsible, Facebook Inc or Facebook Ireland?

6. Why do privacy settings continue to focus only on what friends can & can’t see? If the recent FB scandal has shown one thing, it is that FB’s ad policies have far-reaching consequences for users’ privacy. When are you going to treat ad settings as privacy settings?

7. The GDPR includes new provisions on profiling and automated decision-making. How are you going to change your ad targeting practices to be compliant?

8. The Economist recently reported on how difficult it is for Europeans to download their personal data from Facebook, and Mark Zuckerberg’s testimony described your systems as more transparent than they actually are. How and when, if at all, do you plan to address these issues?

9. You claim to offer a way for users to download their data with one click. Can you confirm that the downloaded files contain all the data that Facebook holds on each user?

9b. You claim to offer a single place to control your privacy. This does not seem to include ways to opt out of ad targeting or to avoid being tracked outside Facebook. Will you offer a single place where users can control every privacy aspect of Facebook, even for people who have no Facebook account?

10. The GDPR gives individuals the right to access and verify their profiles, including marketing profiles based on so called derived data (data that were not disclosed by the user but interpreted from his/her behaviour). Is Facebook going to give its users full access to their marketing profiles? Please answer with “yes” or “no” and explain.

11. Speaking about derived data and marketing profiles, does Facebook process for marketing purposes any data that reveal (directly or indirectly) political opinions of its users? Please answer with “yes” or “no” and explain.

12. Do Facebook apps use smartphone microphones in any way, without this being made clear to the user? If this were to happen, would you consider that lawful?

13. Facebook has voluntary agreements with the Swedish intelligence services to share data. How do you reconcile that with the GDPR?

We are expecting Facebook’s answers any day now… maybe not today, maybe not tomorrow, but soon. If not, we’ll always have Cambridge.

(Contribution by Joe McNamee, EDRi)



18 Apr 2018

Fighting for migrants’ data protection rights in the UK

By Guest author

Since 2014, the United Kingdom (UK) government has steadily rolled out policies to make the country a “hostile environment” for migrants, in the words of Prime Minister Theresa May.


This has involved turning various ordinary institutions into border protection agencies. Banks have to collect and supply data to the Home Office (the UK’s interior ministry) on their customers’ immigration status. Landlords are required to check immigration documents before rental. Schools were checking pupils’ nationality and also sharing information with the Home Office, before a boycott campaign put an end to the practice in April 2018. Hospitals, too, must process immigration paperwork before they can deliver any non-urgent treatment. The police, in some regions, are piloting a handheld biometric ID device that instantly gives street officers access to an immigration database.

In the “hostile environment”, migrants are losing the right to live free of pervasive monitoring. They’re also losing the right to basic data protection. This is particularly evident in the case of a data-sharing agreement between the National Health Service (NHS), the Department of Health, and the Home Office. This agreement, established through a Memorandum of Understanding (MoU) in late 2016, without any consultation of professionals or the public, allows immigration enforcement officers to request patient data held by NHS Digital, the database manager for public health in the UK.

The Migrants’ Rights Network (MRN) has been at the forefront of civil society responses to this scheme. MRN, together with Doctors of the World UK, Docs not Cops (a group of professionals resisting the implementation of “hostile environment” measures in the health sector), and civil rights organisation Liberty, argues that sharing data between health services and immigration control officers violates migrants’ fundamental right to patient confidentiality. Such a breach of fundamental privacy rights is all the more worrying given that the Home Office has an error margin of 10 per cent in its decisions to target “immigration offenders” – meaning it would routinely request data on the wrong individuals.

Crucially, introducing the possibility that health services might hand over patient data to the Home Office will make many vulnerable migrants afraid to seek care. This is already a reality. During a parliamentary hearing in January 2018, elected representatives heard the tragic story of an undocumented domestic worker who avoided treatment out of fear that she could be deported, and died of otherwise preventable complications.

MRN argues that such a situation dismantles the very principles of public health, starting with duty of care and public trust in health providers. The Home Office and NHS Digital have denied this, and argue that data-sharing for immigration enforcement is “in the public interest.” Yet the only other reason NHS Digital normally supplies confidential patient data to the Home Office is in the case of serious crime, such as child abuse or murder. By putting immigration and serious crime on a similar level, this data-sharing arrangement contributes to the dramatic criminalisation of undocumented existence (already exemplified in everyday language by the expression “illegal migrant”).

The UK Parliament’s Health Committee and the British Medical Association have both asked for data-sharing to stop. The Home Office have responded by saying they need to gather more evidence of the scheme’s impact, which could take more than a year. MRN believe this is unacceptable, as lives are currently at risk. MRN is thus challenging the data-sharing agreement in court. The organisation has obtained permission for judicial review (after appeal), likely to take place during the summer 2018, and is currently raising funds to cover its potential court costs.

MRN’s legal challenge is rooted in a desire to protect public health principles and vulnerable lives, but it also has broader implications for data protection in the UK. It aims to send a clear signal that data rights cannot be stripped on the basis of nationality. This is absolutely crucial at a moment when the UK’s latest data protection law, currently being debated in Parliament, includes an exemption clause for immigration enforcement, which would prevent migrants from exercising their full rights under the EU General Data Protection Regulation (GDPR). MRN thus hopes to set a positive precedent for judicial activism on these matters, and make a strong case for non-discrimination as a pillar of data justice.

Against Borders for Children campaign: We won! DfE are ending the nationality school census!

Crowdjustice fundraiser: Stop data-sharing between the NHS and the Home Office

Making the NHS a ‘hostile environment’ for migrants demeans our country (24.10.2017)

‘Hostile environment’: the hardline Home Office policy tearing families apart (28.11.2017)

NHS accused of breaching doctor-patient confidentiality for helping Home Office target foreigners (09.11.2017)

Migrants’ Rights Network granted permission for judicial review of patient data-sharing agreement between NHS Digital and the Home Office (01.03.2018)

MRN legal challenge against NHS data-sharing deal (29.11.2017)

(Contribution by Fabien Cante, LSE Media & Communications / Migrants’ Rights Network, the United Kingdom)


18 Apr 2018

Privacy at ICANN: WHOIS winning?

By Guest author

The Internet Corporation for Assigned Names and Numbers (ICANN) has struggled over the publication of the name, address, phone number, and email address of domain name registrants since its inception in 1998. That registry is called WHOIS.


WHOIS might have worked well during the 1980s when only a few researchers had domain names, but now it exposes millions of individuals to harassment and spam. So far, neither the efforts of civil society who volunteer at this multi-stakeholder organisation (notably the Noncommercial Users Constituency), nor the repeated interventions of the Data Commissioners of the world have had a lot of impact. However, there is a huge struggle going on now over compliance with the European General Data Protection Regulation (GDPR). Registrars who collect registrant data and provide it according to their contracts with ICANN have obtained legal advice that indicates they are vulnerable to significant fines.
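To illustrate what is at stake, here is a minimal Python sketch (using an invented sample record, not live registry data) of how trivially the registrant’s personal details can be pulled out of the key–value format that a classic public WHOIS record uses:

```python
# Illustrative sketch only: the field names below mirror the registrant
# contact fields that a typical pre-GDPR WHOIS record exposed publicly
# for every domain name holder. The record itself is invented.
SAMPLE_WHOIS_RECORD = """\
Domain Name: EXAMPLE-REGISTRANT.ORG
Registrant Name: Jane Doe
Registrant Street: 12 Sample Street
Registrant City: Springfield
Registrant Phone: +1.5551234567
Registrant Email: jane.doe@example.org
"""

PERSONAL_FIELDS = (
    "Registrant Name", "Registrant Street", "Registrant City",
    "Registrant Phone", "Registrant Email",
)

def exposed_personal_data(record: str) -> dict:
    """Collect the registrant contact details published in a WHOIS record."""
    exposed = {}
    for line in record.splitlines():
        key, _, value = line.partition(":")
        if key.strip() in PERSONAL_FIELDS:
            exposed[key.strip()] = value.strip()
    return exposed

print(exposed_personal_data(SAMPLE_WHOIS_RECORD))
```

A few lines of parsing are enough to harvest a registrant’s name, address, phone number, and email – which is why bulk WHOIS scraping has long fed spam and harassment.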

ICANN continues to try to maintain a registrant directory that permits the continued access of many third parties, notably law enforcement agencies, trade mark and copyright holders, and private sector cybercrime investigators and reputational “blacklisters”. There has been a flurry of activity to address long-neglected privacy rights, and CEO Goran Marby has been asking for advice from the Article 29 Working Party. They answered on 11 April 2018 in a letter which was quite clear about ICANN’s failure to comply.

According to the Non-Commercial Stakeholder Group (NCSG), key issues that remain are:

  1. There is no multi-stakeholder process at the moment: the work that was going on in the WHOIS policy development process has been temporarily suspended. The CEO and the Board will make a decision, claiming it to be based on advice from the Article 29 Working Party and on “community input”. That interim policy is good for a year, during which time the community can propose changes through a normal policy development process. Once the year is over (and the process itself takes a couple of months to vote through a policy), the interim policy will become the final policy unless there is an agreed replacement. Given the recent history of the Registration Directory Services Policy Development Process (RDS PDP), it is highly unlikely that consensus to change the interim solution would be achieved in less than a year. This appears to be an abandonment of the multi-stakeholder process, and requires close scrutiny. A multi-stakeholder process needs to remain in place to reach some kind of consensus on the biggest policy debate that ICANN has confronted in its history.
  2. The purpose of the collection, use and disclosure of registrant data is being construed to include feeding the third party actors who have always had free access to the data (in the NCSG view, often illegally).
  3. The issue of public safety and consumer protection as a reason to permit widespread access to data is unsupported by recent accurate data.
  4. The risks to individuals and small organisations have never been measured.
  5. The proposed tiered access model depends for its efficacy on a serious accreditation process. Because there is no time to develop one before 25 May, the day the General Data Protection Regulation becomes applicable, an interim self-accreditation process is proposed. There may not be an appetite to work on proper standards that engage the data protection authorities, and the interim solution will simply expose individuals to marketing, domain expropriation, spam, and risk from political adversaries. Self-accreditation risks setting up an anti-competitive regime where registrant data is held by dominant players.
  6. ICANN is still not clear as to whether it regards itself as a data controller, although a long-serving member of the ICANN community challenged it publicly on this matter at the ICANN61 meeting in March 2018. It has also thus far refused to appoint a privacy officer for any registrant data related issues. What is clear to the NCSG is that ICANN is the only contracting party that has access to all escrowed data of registrants, and that it sets the terms for that escrow arrangement. It also sets the terms for the contracts with registries and registrars, and enforces their compliance through the Global Domains Division (compliance branch). It is worth noting that one of the recommendations of the business community proposal is that ICANN must retain access to all registrant data at all times, whatever the solution selected.
  7. For those not following the GDPR closely, the issue of who is the controller may be extremely important in terms of liability.
  8. NCSG is working on a standards development project led by a University of Toronto team, to develop proper accreditation standards for third parties to whom personal data is released by data controllers and processors. There must be strong management practices in place to ensure that the entities asking for the data are indeed who they say they are, and that their purported reasons to request the data are legitimate, limited, and proportionate. There should also be standards to ensure proper safeguarding and eventual destruction of the data, and access rights for individuals, as well as transparency except in exceptional circumstances. The Article 29 Working Party released a paper in February detailing their expectations and their own involvement in the accreditation of various processors under the GDPR; this standards proposal is working in the same vein, to explore what best management practices look like.
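The tiered access and accreditation ideas in points 5 and 8 can be sketched in a few lines of Python. This is a hypothetical illustration of the general model, not any proposal actually on the table at ICANN: all requester names, tiers, and purposes here are invented.

```python
# Hypothetical sketch of "tiered access": a gateway releases registrant data
# only to requesters holding a valid accreditation AND stating a purpose
# permitted for their tier. Everyone else gets redacted data.
from dataclasses import dataclass

# Invented accreditation registry (in a real scheme this would be managed
# by an independent accreditation body, not self-declared).
ACCREDITED = {
    "lea-example": {"tier": "law_enforcement"},
    "ip-holder-example": {"tier": "trademark"},
}

# Purposes each tier may invoke: limited and tier-specific by design.
ALLOWED_PURPOSES = {
    "law_enforcement": {"criminal_investigation"},
    "trademark": {"trademark_dispute"},
}

@dataclass
class Request:
    requester_id: str
    purpose: str

def may_disclose(req: Request) -> bool:
    """Disclose registrant data only to an accredited requester whose
    stated purpose is permitted for its accreditation tier."""
    cred = ACCREDITED.get(req.requester_id)
    return cred is not None and req.purpose in ALLOWED_PURPOSES[cred["tier"]]

print(may_disclose(Request("lea-example", "criminal_investigation")))  # True: accredited, valid purpose
print(may_disclose(Request("marketer", "lead_generation")))            # False: not accredited
```

The point of the NCSG critique is precisely that under self-accreditation, the `ACCREDITED` table above would in effect be filled in by the requesters themselves.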

Working Paper International Working Group on Data Protection in Telecommunications

Working Paper on Privacy and Data Protection Issues with Regard to Registrant data and the WHOIS Directory at ICANN (27-28.11.2017)

Non-Commercial Stakeholder Group (NCSG) Positions on Whois Compliance with GDPR (16.04.2018)

ICANN: Data Protection/Privacy – Latest Announcements, Updates & Blogs

ICANN Receives Data Protection/Privacy Guidance from Article 29 Working Party (12.04.2018)

(Contribution by Stephanie Perrin, University of Toronto, NCSG Councilor)



18 Apr 2018

Cambridge Analytica access to Facebook messages a privacy violation

By Gemma Shields

Less than one month after Cambridge Analytica whistleblower Christopher Wylie exposed the abuse of (so far) 87 million Facebook users’ data, Facebook Co-Founder, Chairman, and CEO Mark Zuckerberg testified before the US Congress.


On 10 and 11 April, Zuckerberg provided testimony in a joint hearing of the Senate Judiciary and the Senate Committee on Commerce, Science, and Transportation, and then to the House Energy and Commerce Committee. He faced questions on a number of democracy-disrupting and privacy-violating issues to which the social media giant has been a party, not least the composition – and use – of personally identifiable data as part of the Facebook-Cambridge Analytica scandal.

This scrutiny gave rise to uncertainty over what Facebook user data Cambridge Analytica had access to, and over just what this personal data comprised. What began as the personality app “This is Your Digital Life”, designed by researcher Aleksandr Kogan and installed by 270 000 Facebook users (which in turn provided access to the data of at least 87 million users), resulted in the data consulting firm Cambridge Analytica having access to the private inbox messages of users.

This revelation, whilst part of the unfolding exposé, was confirmed in the notifications that began appearing at the top of users’ News Feeds, which read: “a small number of people who logged in to ‘This is Your Digital Life’ also shared their own News Feed, timeline, posts, and messages which may have included posts and messages from you.”

With a global reach, the scandal has implications for users worldwide. In the European Union, such access to personal data would be prohibited by the proposed ePrivacy Regulation. Current ePrivacy rules on access to the content of communications do not cover Facebook, although this would change under the proposed ePrivacy Regulation.

So far, Facebook and its allies have successfully lobbied Member States in the EU Council to slow down the adoption of the new Regulation – and not even this scandal has been able to persuade EU Ministers (many of whom signed a letter arguing that our fundamental rights should be “balanced” against “digital products and services”) that Facebook’s access to private communications needs to be restricted.

On how such abuse could happen, a Facebook spokesperson said: “In 2014, Facebook’s platform policy allowed developers to request mailbox permissions but only if the person explicitly gave consent for this to happen. At the time when people provided access to their mailboxes – when Facebook messages were more of an inbox and less of a real-time messaging service – this enabled things like desktop apps that combined Facebook messages with messages from other services like SMS so that a person could access their messages all in one place. According to our records only a very small number of people explicitly opted into sharing this information. The feature was turned off in 2015.”

The conditions for consent – as per Article 7 of the General Data Protection Regulation (GDPR) – cannot have been met, however; in particular, the explicit consent of 87 million users to the access to and repurposing of their personal data was not obtained.

Users can check if their personal data was harvested and misused by Cambridge Analytica here: https://www.facebook.com/help/1873665312923476?helpref=search&sr=1&query=cambridge

Transcript of Zuckerberg’s appearance before the House committee (11.04.18)

Facebook scandal: I am being used as scapegoat – academic who mined data (21.03.18)

Revealed: Aleksandr Kogan collected Facebook users’ direct messages (13.04.18)

Cambridge Analytica Could Have Also Accessed Private Facebook Messages (04.10.18)

How can I tell if my info was shared with Cambridge Analytica?

(Contribution by Gemma Shields, EDRi intern)



07 Feb 2018

Data protection – time for action

By Anne-Morgane Devriendt

On 24 January 2018, the European Commission (EC) published a Communication on the implementation of the General Data Protection Regulation (GDPR), which becomes applicable on 25 May 2018: “Stronger protection, new opportunities”.


The Communication describes the preparatory works by the Commission to help with the implementation of the GDPR and what the Commission plans to help Member States and companies to comply with the new data protection framework.

Most of the work at the EU level has been done by the group of Data Protection Authorities (the so-called Article 29 Working Party). It has been preparing guidelines, on the basis of extensive consultations and workshops with a variety of stakeholders. More work still needs to be done in order to ensure the effective implementation of the new rules.

Although the GDPR is a Regulation and therefore applies “as is” in all Member States, some national legislation needs to be adapted to the new obligations set by the GDPR, especially regarding the flexibilities that the Regulation leaves to Member States’ discretion – on automated decision-making and transfers of personal data to third countries, among other things. Ironically, while industry demanded harmonisation at the start of the legislative process, it spent most of the decision-making process demanding national flexibilities and exceptions, leading to the opposite outcome to the one it initially asked for. Sometimes, one is left with the impression that lobbyists are working to create work for themselves.

At the moment of publication of this Communication, just four months before the GDPR becomes applicable, only two out of 28 Member States (Austria and Germany) have finished this legislative preparation. It also remains to be clarified how Member States will ensure that national Data Protection Authorities (DPAs) are given the means to fulfil their new functions as prescribed by the GDPR.

Finally, the Communication stresses that the core principles of data protection are not affected by the new Regulation. As a result, few changes are needed from organisations that already comply with the existing Data Protection Directive. However, the Commission notes that citizens and small and medium-sized companies are not well informed about the provisions of the GDPR. It has therefore published guidelines on the new rules for businesses and rights for citizens.

One cannot help but wonder why neither Member States nor companies seem to be prepared for new legislation that has been discussed since the adoption of the Commission’s initial Communication in November 2010 and in the four years of legislative discussion, that were shaped by an unprecedented lobbying campaign by parts of the industry. This ostensible lack of preparedness is also surprising bearing in mind that the Regulation does not change existing core principles that should already be respected by controllers through the transposition (and enforcement) of the Data Protection Directive into national law since 1998.

Communication from the Commission (24.01.2018)

Commission’s GDPR guidelines for citizens and small and medium companies

PROCEED WITH CAUTION: Flexibilities in the General Data Protection Regulation (05.07.2016)

General Data Protection Regulation: Document pool

(Contribution by Anne-Morgane Devriendt, EDRi intern)



07 Feb 2018

The Bulgarian EU Council presidency & the latest assault on ePrivacy

By Anne-Morgane Devriendt

In January 2018, the Bulgarian Presidency of the Council of the European Union (EU) picked up where the Estonian Presidency left off on the ePrivacy Regulation. It issued two examinations of the last Estonian “compromise” proposal and asked national delegations for guidance on some issues. Together, the documents cover most of the key points of the text. While the Bulgarian Presidency brings clarity on some points, its questions pave the way to undermine the text – and therefore threaten the protection of citizens’ privacy, the confidentiality of communications of both citizens and businesses, as well as the position of innovative EU companies and trust in the online economy.


One of the main lobbying devices used against the ePrivacy proposal is its alleged redundancy, due to the General Data Protection Regulation (GDPR) becoming applicable in May 2018: if the processing of personal data is already covered by the GDPR, why would we need an additional text? The Bulgarian Presidency addresses this question by clarifying the ePrivacy Regulation’s role as lex specialis to the GDPR. Effectively, the ePrivacy Regulation complements the GDPR, and where the two texts overlap, ePrivacy applies, as it provides for a higher level of protection of communications data, which are sensitive data.

On privacy settings, covered by Article 10, the Bulgarian Presidency proposes either to keep the choices presented by the Estonian Presidency, providing for privacy by default and an easy way to change the settings, or to require more granularity in the settings by blocking the storage or the processing of data by third parties. This offers users a degree of control over third-party activities on their devices.

After this welcome clarification on this (rather simple) issue and this relatively privacy-friendly proposal, the Bulgarian Presidency then follows up on the undermining of the text already initiated by the Estonian Presidency in December 2017.

In the second document that deals with the third Chapter of the proposal on the “rights to control electronic communications”, the Bulgarian Presidency mostly follows the Estonian proposal, except for publicly available directories. There, it proposes to either put obligations both on the providers of number-based communication services and on publicly available directories, or the harmonisation of the rules with opt-in or right to object. As for direct marketing, the Bulgarian Presidency asks the national delegations to give their opinion on the need for uniform rules on voice-to-voice calls.

The Bulgarian Presidency also asks the national delegations to choose between two proposals concerning permitted processing of communications data (covered in Article 6): a supposed middle ground that would allow further processing if it has no impact on privacy, or the inclusion of a “legitimate interest” ground for further processing of metadata. It is hard to understand what kind of further processing of communications data – or metadata – would have no impact on privacy (not least following the latest revelations of security breaches involving “non-personal” data), or how there could be a “legitimate interest” in the further processing of communications metadata, not least given the contrary positions already taken by the Court of Justice of the European Union in the Tele2 case.

On storage and erasure of electronic communications data, regarding data that is no longer needed to provide a service, the Bulgarian Presidency proposes to either delete the provisions on the deletion of data, or to keep them while deleting the provisions authorising recording or storage of the data by the end-user or a third-party entrusted by them. The first possibility would remove the protection of communication data at rest – ironically creating, at the request of industry lobbyists, the kind of incoherence between ePrivacy and the GDPR of which industry lobbyists have been warning. The second would keep the level of protection agreed upon by the European Parliament.

The worst attack of the Bulgarian Presidency on the text concerns the protection of terminal equipment (Article 8). In addition to the proposals put on the table by the Estonian Presidency, the Bulgarian Presidency proposes different exemptions to the need for consent for the processing of data from an individual device: for “non-privacy intrusive purposes”; based on a “harm based approach” that would consider the levels of impact of different techniques on privacy. It also proposes to couple together the addition of a “legitimate interest to deliver targeted advertisement” and the right to object; and even asks whether the text should cover the “access to services in the absence of consent to process information”. Again, it is hard to see how there could be a “legitimate interest to deliver targeted advertisement”, and how this would contribute to the protection of privacy. Such a convoluted legal construction would, in any event, be only usable by the largest targeted (or “surveillance”) advertising companies. If this approach is followed, the EU would end up with legislation (ePrivacy) that would make it easier to access data on a computer system, as well as legislation (attacks against computer systems – Directive 2013/40/EU) criminalising access to a computer system.

Although the Bulgarian Presidency did take a progressive stance on the links between the GDPR and ePrivacy, the rest of its proposals systematically undermine the text by lowering the level of protection of communications and privacy.

ePrivacy Regulation proposal – Examination (1) of the Presidency discussion paper (11.01.2018)

ePrivacy Regulation proposal – Examination of Articles 12 to 16 (25.01.2018)

Latest proposal by the Estonian Presidency (05.12.2017)

ePrivacy proposal undermined by EU Member States (10.01.2018)

(Contribution by Anne-Morgane Devriendt, EDRi intern)



10 Jan 2018

EU-Japan trade agreement not compatible with EU data protection

By Vrijschrift

The EU and Japan have announced the conclusion of the final discussions on a trade agreement, the EU-Japan Economic Partnership Agreement (EPA).


Regarding cross-border data flows and data protection, the European Commission’s press release states that recent reforms of their respective privacy legislation offer new opportunities to facilitate data exchanges, including through a simultaneous finding of an adequate level of protection by both sides.

But this is not the full story. Besides the possibility to adopt adequacy decisions, the EPA contains explicit data flow commitments in the financial section, implicit data flow commitments in the services chapter, and a review clause. The implicit data flow commitments, in particular, do not seem compatible with the fundamental right to the protection of personal data.

In addition, a form of investor-to-state dispute settlement (ISDS/ICS) may be added later. All published trade agreement texts are subject to legal scrubbing (adjustment by legal services).

Adequacy decisions

Allowing cross-border data flows through an adequacy decision is, in principle, the correct way to approach this issue. Such decisions are EU decisions (from the EU point of view). If data protection in Japan deteriorates, or if the EU rejects mass surveillance in Japan, the EU can revoke the adequacy status – in principle.

The approach may not work out in practice. It remains to be seen whether the European Commission would really revoke the adequacy status. But should the EU award Japan adequacy status at all? Graham Greenleaf argues that Japan has serious issues to overcome: a weak definition of personal information; a carve-out for “anonymously processed information”; a cross-border privacy rules back-door for onward transfers to the US; no record of enforcement; trivial or missing remedies; carve-outs for big data; and a carve-out for de-identification. (Update: article)

Will the EU take an independent adequacy decision? Take this formulation in the press release:

“This offers new opportunities to facilitate data exchanges, including through a simultaneous finding of an adequate level of protection by both sides.”

The formulation “simultaneous finding of an adequate level” suggests a negotiated compromise where fundamental rights are traded against economic rights.

Implicit cross-border data flow commitments

Many people overlook implicit data flow commitments. Chapter 8, Section C (Cross-Border Trade in Services) contains National Treatment and Most-Favoured-Nation Treatment clauses.

Cross-border services imply cross-border data flows. (1) A safeguard can be found in Section G (Exceptions), Article X1 (General exceptions), paragraph 2. It is an exception of the GATS Article XIV type, with many conditions. Such safeguards are insufficient; see Kristina Irion, Svetlana Yakovleva, and Marija Bartl.

The implicit cross-border data flow commitments do not have sufficient safeguards. This is not compatible with the EU fundamental rights framework.

Explicit data flow commitment

Chapter 8 Section E Sub-section 5 Financial Services, article 6, Transfers of Information and Processing of Information, contains a cross-border data transfer commitment regarding financial data. Paragraph 2 contains a safeguard:

“2. Nothing in paragraph 1 restricts the right of a Party to protect personal data, personal privacy and the confidentiality of individual records and accounts so long as such right is not used to circumvent the provisions of this Article.”

The strength of the exception is limited by a condition (“so long as …”). (2) The safeguard seems stronger than the one used in the financial sections in the agreements with Korea, Singapore, Vietnam and Ukraine. It is also stronger than the general exception in Section G Exceptions, mentioned above. The safeguard is based on the 1994 Understanding on Commitments in Financial Services (article B. 8).

Marija Bartl and Kristina Irion noted:

“The formulation used in CETA is likely more prudent compared to the language proposed in the Agreement with Japan. From the outset, it lays down a better division of labor between trade law and domestic data protection law. Given that the EU trade negotiators tend to work on blueprints of their earlier agreements reverting to the language of the 1994 Understanding on Financial Services and the text of the earlier EU – Singapore Free Trade Agreement 16 would mean a regressive development for the safeguards on data privacy.” (3)

It would seem open to debate which one is stronger, as the formulation in the EU-Canada CETA has weaknesses as well. More importantly, neither safeguard respects the European Parliament’s demands for TTIP and TiSA. (4)

Review clause

The draft EPA contains a review clause. Chapter 8 Section F Electronic Commerce Article 12 Free Flow of Data reads:

“The Parties shall reassess the need for inclusion of an article on the free flow of data within three years of the entry into force of this Agreement.”

There is a lot of discussion on free flow of data commitments – while the draft EPA already contains (often overlooked) implicit data flow commitments. The review clause may act as a distraction.


ISDS / ICS

The Commission’s press release notes that negotiations continue on investment protection standards and investment protection dispute resolution. Adding ISDS/ICS or an investment court could have a negative impact on data protection. See here and here.

After writing the first version of this blog, someone alerted me to this analysis, from October 2017: Marija Bartl and Kristina Irion, The Japan EU Economic Partnership Agreement: Flows of Personal Data to the Land of the Rising Sun. Recommended reading.

This article was originally published at https://blog.ffii.org/eu-japan-trade-agreement-not-compatible-with-eu-data-protection/

(Contribution by Ante Wessels, EDRi member Vrijschrift, the Netherlands)




(1) See page 1 (after the Roman numerals) Kristina Irion, Svetlana Yakovleva, and Marija Bartl.

(2) See, in general (leaving aside, for now, the part on ISDS), here.

(3) Note there is an important difference between the 1994 Understanding and the EU-Singapore FTA texts. Singapore, 8.54 (2): “Each Party shall, adopt or maintain appropriate safeguards to protect privacy and personal data, including individual records and accounts, as long as these safeguards is not used to circumvent the provisions of this Agreement.” 1994 Understanding, article B.8: “(…) Nothing in this paragraph restricts the right of a Member to protect personal data, personal privacy and the confidentiality of individual records and accounts so long as such right is not used to circumvent the provisions of the Agreement.” The first is a commitment, the second an exception to a commitment.

(4) TTIP 2 (b) (xii); TiSA 1 (c) (iii) reads: “(…) to incorporate a comprehensive, unambiguous, horizontal, self-standing and legally binding provision based on GATS Article XIV which fully exempts the existing and future EU legal framework for the protection of personal data from the scope of this agreement, without any conditions that it must be consistent with other parts of the TiSA; (…)” The articles in the agreements are not unambiguous, and do not fully exempt (…) from the scope of the agreement.

10 Jan 2018

ePrivacy proposal undermined by EU Member States


The discussions on the ePrivacy Regulation continue in the European Union (EU) legislative process. They were on hold for a few weeks because of ongoing negotiations on the European Electronic Communications Code (EECC) – another big “telecoms” file that the Council of the European Union is working on.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

On 5 December 2017, the Estonian Presidency of the Council proposed new compromises on key articles. This latest proposal for amendments is related to Articles 6, 7 and 8 of the draft ePrivacy Regulation, which concern permitted processing (Art. 6), storage and erasure of communications data (Art. 7) and the protection of stored communications in users’ devices (Art. 8).

Permitted processing

The provisions on permitted processing cover the circumstances under which electronic communications data may be processed.

The Estonian Presidency text suggests a few adaptations to align the proposal with the General Data Protection Regulation (GDPR), by including the legal ground of vital interest in Article 6(2)(d) and a new Recital 17a, as well as provisions for accessibility in Article 6(3)(aa) and a new Recital 19a. As currently drafted, these additions should not create new privacy risks.

Much more concerning is the addition, in Article 6(2)(e) and a new Recital 17b, of a legal ground for scientific research and statistical purposes, similar to the one in Article 9(2)(j) of the GDPR (although the research, unlike archiving, need not be “in the public interest”). The recital and the Article state that this “type of processing should be subject to further safeguards to ensure privacy of the end-users by employing appropriate security measures such as encryption and pseudonymisation.” The use of “such as” means that these measures are mere possibilities, not requirements. On top of that, a great deal of flexibility would be left to Member States, since the measures must be “based on Union or Member State law, which shall be proportionate to the aim pursued and provide for specific measures”. This creates risks for privacy and security, and forfeits the economic benefits that a more predictable, harmonising measure would generate.
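For illustration, a minimal sketch of what keyed pseudonymisation of communications data could look like in practice (a hypothetical example: the key, function name and identifiers are invented, not drawn from the proposal):

```python
import hmac
import hashlib

# Hypothetical provider-side secret; in practice it would be stored
# separately from the pseudonymised data set.
SECRET_KEY = b"provider-held-secret"

def pseudonymise(identifier: str) -> str:
    """Replace an identifier with a stable HMAC-SHA256 pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always yields the same pseudonym, so records can still
# be linked for research or statistics -- which is also why
# pseudonymisation is a weaker safeguard than anonymisation: whoever
# holds the key can re-create the mapping.
p_alice = pseudonymise("alice@example.com")
p_bob = pseudonymise("bob@example.com")
assert p_alice == pseudonymise("alice@example.com")
assert p_alice != p_bob
```

Because the text frames such measures only as “such as” possibilities, nothing in the proposed wording would oblige a provider to apply even this much.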

Storage and erasure

The provisions on storage and erasure cover what protection should apply to different types of data and the deletion of data that is no longer needed to perform a service.

On storage and erasure, the Estonian Presidency “invites delegations to reflect on the need for” Article 7(1), which ensures the protection of communications data at rest (i.e. stored in the provider’s network). Not including the protection of communications data at rest in the ePrivacy Regulation would mean that an e-mail was sensitive data, subject to the standards of the ePrivacy Regulation, while in transit, and then suddenly, upon arrival in the (online) mailbox of the provider, subject only to the General Data Protection Regulation. This would open the door to processing of the content as non-sensitive data under the GDPR’s “legitimate interest” ground, facilitating invasive surveillance of content of the kind previously carried out by Gmail. Bizarrely, businesses lobby both for clear, predictable rules and for unclear, unpredictable rules like this.

Protection of terminal equipment

The provisions on protection of terminal equipment cover the rule for installing or using data on an individual’s communications device.

As regards terminal equipment, recital 21 adds precision on the use of cookies. Cookies can be used for both tracking and non-tracking purposes. The text recognises “authentication session cookies used to verify the identity of end-users engaged in online transactions” as legitimate, as well as some audience measuring. However, Articles 8(1)(d) and 8(2)(c) authorise audience measuring “by a third party on behalf of the provider of the information society service” and statistical counting without making pseudonymisation mandatory. This would facilitate the kind of cross-platform data processing done by, for example, Google Analytics.

Recital 21 and Article 8(1)(e) also allow the installation of security updates without the consent of the end-user, provided they are necessary, that the user is aware of them, and that the user can delay them. While security updates are particularly important to protect users from attacks or breaches, consent should remain the sole legal basis for any processing linked to accessing terminal equipment. That way, instead of merely learning that a “security update” is being installed on your phone, computer or other connected device, you would benefit from software providers having an incentive to be more transparent and to give you more information about the update and what it is for.

Although not every proposed amendment threatens fundamental rights, the Estonian Presidency proposed to broaden the scope of exceptions in significant ways: authorising some processing that goes beyond what is strictly necessary, not keeping consent as the sole legal basis, and not putting in place strong safeguards to limit the impact of this broadening on privacy. This weakening of protections and predictability brings us closer to the kind of security and privacy chaos that the United States is experiencing. It would without doubt create the “chill on discourse and economic activity” that the failure to implement privacy and security measures has caused in the US. But at least Facebook and Google will be happy.

Presidency text leaked by Austrian government (05.12.2017)

Presidency text (05.12.2017)

e-Privacy: what happened and what happens next (29.11.2017)

e-Privacy revision: Document pool

(Contribution by Anne-Morgane Devriendt and Diego Naranjo, EDRi)



06 Oct 2017

ePrivacy: Frequently Asked Questions


Original version here (English)

What is the ePrivacy Regulation?

The Regulation on privacy and electronic communications, or ePrivacy, covers specific privacy and data protection issues in the field of communications. It was adopted in 2002 and revised in 2009. The official text of the current version can be found here.


Why do we need this instrument?

The ePrivacy rules were created to guarantee privacy and protect personal data in the field of electronic communications, by “complementing and particularising” the matters covered by the main legal instrument, i.e. the Data Protection Directive, now replaced by the General Data Protection Regulation (GDPR). For example, ePrivacy protects the confidentiality of the content of communications, and of information stored on, and accessed from, an individual’s device. The GDPR does not specifically cover this.

The confidentiality of communications is a very complex matter. It covers not only your right to privacy and data protection, but also your freedom of expression and communication. Without legislation that clearly defines the meaning of these fundamental rights in this complex environment, the protection of the confidentiality and security of communications would be less predictable and harder to enforce. A lack of clear rules also makes it more difficult for companies to develop innovative new services.

Isn’t the General Data Protection Regulation (GDPR) enough?

Even though the GDPR covers many topics related to data protection, it does not directly and precisely cover the right to privacy and, more particularly, the right to freedom of communication, which are two distinct fundamental rights. ePrivacy therefore provides the level of detail needed to ensure effective and predictable protection of rights that the GDPR does not cover with sufficient precision. In addition, ePrivacy also covers activities in which the processing of personal data is not the main issue, such as the sending of unsolicited messages (for example spam or direct marketing). It also provides a basis for the protection of information stored on an individual’s device. It is important to remember that the purpose of ePrivacy is not to create new rights, but to complement existing rules, for the benefit of individuals and companies alike.

The need for legislation on privacy and the security of personal data in the field of electronic communications is growing. Online tracking and the surveillance of e-mails for advertising purposes are increasingly common practices, while telecoms companies try to copy online companies by monetising the masses of customer data they hold (including location data). In addition, ePrivacy needs to be updated to keep pace with the latest technological developments, such as the use of instant messaging (chat) applications instead of SMS or e-mail.

Which fundamental rights are affected by the ePrivacy Regulation?

  • The fundamental right to confidentiality of communications, enshrined in Article 7 of the Charter of Fundamental Rights of the European Union. The new instrument that will replace or revise ePrivacy should make unambiguously clear that this principle fully applies to data about online activities and communications, including traffic and location data, as currently defined in the ePrivacy rules. Moreover, it should also apply to any similar data created or used online, such as location, browsing, e-book usage, mobile application usage and search data, and to any new data derived from them. The new instrument must also provide clarity on privacy by design and privacy by default in this context.
  • The fundamental rights to the protection of personal data and to freedom of expression, as enshrined in Article 8 of the Charter mentioned above. For most people in the EU, the easiest way to access information is via the internet. To protect this, the revised instrument should ban making access conditional on accepting the tracking of one’s activities, and the profiling and automated decision-making that follow from it (for example, having to accept cookies before being able to access a website). This is particularly important when accessing information on topics linked to sensitive data, or when accessing public sector services.

Which activities are covered by ePrivacy?

  • the confidentiality and security of communications;
  • traffic and location data generated by personal devices;
  • the tracking of users, including through their personal devices (for example for behavioural advertising);
  • cookies;
  • security measures for personal devices;
  • itemised billing;
  • calling line identification;
  • public and private directories;
  • spam and unsolicited marketing calls;
  • data breach notifications (further specified in EU Regulation 611/2013).

Which elements need to be updated?

Everything in ePrivacy that relates to online activities (such as the confidentiality and security of communications and of personal devices, as well as the tracking of users) needs to be updated to keep pace with current and future technological developments. The rules on itemised billing, user directories and unsolicited communications need to be reassessed, to check whether they are consistent with the GDPR. Some aspects, such as how data breaches should be handled, do not require specific legislation and can therefore be removed; this could be resolved by referring to the GDPR, in order to avoid redundancy.

I am tired of seeing banners asking me to accept cookies. Will this add even more of them?

ePrivacy currently tries to give users some control over online tracking. However, it does so in a rather blunt way. Lessons learned from experience and from technological developments suggest that the provision regulating cookies in ePrivacy should be improved, to allow for more user-friendly consent mechanisms.

As we explained in a previous article, cookies are one of the ways you leave digital traces behind you when you browse. They are pieces of information that are automatically stored on your device when you visit websites. The revised cookie rules in ePrivacy should make browsing more pleasant by removing the consent requirement for cookies that do not involve the collection and processing of personal data (such as the tracking of users and devices by third-party services). This would apply, for example, to statistics counting which pages of a website are visited most. Such statistics, collected by the owner of a website (“first-party cookies”), do not involve unnecessary processing of personal data. On this point, we refer generally to the cookie guidance of the Article 29 Data Protection Working Party.
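To make the first-party/third-party distinction concrete, here is a small sketch of how a site might set a first-party statistics cookie scoped to its own domain (a hypothetical example using Python’s standard library; the cookie name and domain are invented):

```python
from http.cookies import SimpleCookie

# A first-party analytics cookie: scoped to the site's own domain and
# never sent in cross-site requests, so it cannot be used by
# third-party services to track the user across the web.
cookie = SimpleCookie()
cookie["page_views"] = "3"
cookie["page_views"]["domain"] = "example.org"  # the site's own domain
cookie["page_views"]["path"] = "/"
cookie["page_views"]["samesite"] = "Strict"     # blocks cross-site sending
cookie["page_views"]["httponly"] = True         # not readable by page scripts

header = cookie.output(header="Set-Cookie:")
print(header)
```

A third-party tracking cookie, by contrast, would be set for a tracker’s domain and sent along with requests from every site embedding that tracker, which is what triggers the consent requirement.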

What is the link with protection against mass surveillance?

Without a doubt, we can expect a growing use of personal electronic devices (smartphones, tablets, computers), as well as of related internet-connected technologies (as in the Internet of Things). These developments create new opportunities for online communication, but also carry risks for confidentiality and other fundamental rights. Online communication often involves many parties across national borders, without users being fully aware of it.

We agree with the European Data Protection Supervisor (EDPS) that the number and frequency of government requests made to internet services (Twitter, Gmail and others) should be made public, so as to give individuals a clearer picture of how these invasive government powers are used in practice. If the public is aware of the government’s conduct, it will be in a better position to hold it to account. In this context, more transparency could help restore people’s trust in the electronic communications sector.

What is the link with the security of my electronic devices, such as my smartphone?

The GDPR includes security obligations for the processing of personal data, while ePrivacy allows for security obligations that are more specifically tailored to our online communications. These security obligations should not only apply to providers of electronic communications (telecoms companies), but should also cover application developers and manufacturers of electronic devices, for example. The companies behind applications and devices are not always the primary parties with legal responsibility. Nevertheless, given their important role in protecting the security and confidentiality of personal communications, they too should be subject to security standards. We refer in particular to the recommendations on security and privacy standards for operating system providers, device manufacturers and other key actors made by the Article 29 Data Protection Working Party in its Opinion 8/2014 on the Internet of Things.

This FAQ was prepared by the EDRi Brussels office and EDRi members Open Rights Group, FIPR, Bits of Freedom, Access Now, Panoptykon and Privacy International.

Translation by volunteers Pierre, Florian and Gilles.

28 Sep 2017

European Parliament Consumer Protection Committee chooses Google ahead of citizens – again

By Joe McNamee

On 28 September, the European Parliament Committee on Internal Market and Consumer Protection (IMCO) adopted its Opinion on the proposed e-Privacy Regulation. Just as it did when reviewing the General Data Protection Regulation (GDPR), it is fighting hard to minimise the privacy and security of the European citizens it is meant to defend.

Currently, the surveillance-based online advertising market is dominated by Facebook and Google. It was estimated that, in the US, 99% of growth in the sector is being absorbed by those two companies. Most of the amendments adopted in IMCO serve the purpose of defending this anti-competitive, anti-choice, anti-privacy, anti-innovation framework.

Some of the many egregious efforts to water down the proposal include:

  • The Opinion adds a loophole to the text to reduce protection of communications. It suggests that the confidentiality of emails and other electronic communications should be protected only when they are “in transit”. This contradicts the entire logic of the legislation and, crucially, will allow companies such as Google to monitor the content of communications – when not “in transit”, but stored in their servers – to intensify their profiling of users.
  • It supported the European Commission’s position that devices and software should not prevent unauthorised access by default. Instead, there should simply be an option – possibly hidden somewhere in the settings, as is typical – to set security levels. Ironically, this position completely contradicts other EU legislation, which criminalises unauthorised trespassing on computer systems. It also contradicts the GDPR, which foresees data protection by design and by default.
  • The Opinion suggests that “further processing” of metadata – information about our location, the times we communicate, the people we communicate with and so on – should not require consent. The activity this permits is not vastly different from the definition of stalking in Dutch law: “systematically and deliberately intrudes into someone’s personal environment with the intention to force the other to do something, not to do something or to tolerate something…” (translation adapted from Wikipedia)
  • Instead of requiring providers to anonymise or delete personal data that is no longer needed, the IMCO Committee would also allow them to keep unnecessary personal data, as long as they use pseudonymisation. This means that the data collected and used by the provider could, at any moment, be re-identified, creating unnecessary security, data protection and privacy risks for the individuals concerned.
  • Rather cleverly, the Committee has exploited child protection to make profiling of users easier for companies. By saying that electronic communications data “shall not be used for profiling or behaviourally targeted advertising purposes” for children, it implies that this is acceptable for adults.
  • An obligation to inform users about security risks that have been discovered and how to minimise risks has been simply deleted from the Opinion.
  • The committee did not amend the Commission’s proposal to allow people to be tracked through their mobile devices as they move around physically (in towns, malls, airports, and so on). Individuals are expected to opt out individually each time they enter an area with one or more tracking networks – on condition that they see and react to the one or more signs that indicate that tracking is happening.
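The pseudonymisation concern above can be illustrated in a few lines: as long as the provider keeps the mapping (or the key) used to pseudonymise the data, re-identification is trivial (a hypothetical example; the identifiers and records are invented):

```python
# Mapping kept by the provider alongside the "pseudonymised" data set.
pseudonym_map = {"user-7f3a": "alice@example.com"}

# A "pseudonymised" browsing record, as a provider might retain it.
records = [
    {"user": "user-7f3a", "visited": "health-clinic-page"},
]

# Re-identification is a single dictionary lookup.
for record in records:
    real_identity = pseudonym_map[record["user"]]
    print(real_identity, "visited", record["visited"])
```

This is why pseudonymised data remains personal data: the safeguard depends entirely on who controls the mapping, not on any property of the data itself.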

The extremist anti-consumer position of the Committee on Internal Market and Consumer Protection was rightfully ignored in the adoption of the General Data Protection Regulation. We can only hope that its Opinion will be ignored this time as well.

IMCO Compromise Amendments to the proposal for a e-Privacy Regulation

e-Privacy revision: Document pool

Are we on the right track for a strong e-Privacy Regulation? (28.07.2017)

(Contribution by Joe McNamee, EDRi)