24 May 2018

Press Release: GDPR: A new philosophy of respect

By EDRi

The General Data Protection Regulation (GDPR) comes into effect tomorrow, on 25 May 2018, strengthening and harmonising individuals’ rights regarding personal data. A much-celebrated success for privacy advocates, the GDPR is more than just a law.

GDPR is a new philosophy that promotes a culture of trust and security and that enables an environment of Respect-by-Default

said Joe McNamee, Executive Director of European Digital Rights.

The Directive adopted in 1995 was characterised by a tendency towards bureaucratic compliance with little enforcement. The GDPR represents a recalibration of focus, establishing a new balance between companies, people and data. The framework not only protects personal data, but also changes perceptions of it. On the one hand, the GDPR protects individuals from companies and governments abusing their personal data and makes privacy the standard. On the other, it gives businesses the chance to develop processes with privacy-by-default in mind, ensuring both individuals’ trust and legal compliance. The GDPR minimises the risk of some companies’ bad behaviour undermining trust in all actors.

The GDPR is capable of setting the highest regional standards for the protection of personal data; once well implemented, we need updated global rules

said Diego Naranjo, Senior Policy Advisor of European Digital Rights.

While not perfect, because no legislation is perfect, the GDPR is probably the best possible outcome in the current political context. We will now have to rely on each EU Member State’s Data Protection Authority (DPA) to do their jobs correctly and on governments to ensure enough resources have been allocated to allow this to happen.

To promote educational efforts around the GDPR, we have developed an online resource that helps everyone better understand their new rights and responsibilities: the “GDPR Explained” campaign, which will be launched shortly.

Read more:

The four year battle for the protection of your data (24.05.2018)
https://edri.org/four-year-battle-protection-of-your-data-gdpr/

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)
https://edri.org/eu-data-protection-package-lacking-ambition-but-saving-the-basics/

24 May 2018

The four year battle for the protection of your data

By Bits of Freedom

In 2012, what would become a four-year process started: the creation of new European data protection rules. The General Data Protection Regulation would replace the existing European Data Protection Directive adopted in 1995 and enhance and harmonise data protection levels across Europe. The result is an influential piece of legislation that touches the lives of 500 million people and creates the highest regional standard for data protection.

A lobbyist feeding frenzy

With so much at stake, civil society was preparing for strong push-back from companies. But we could never have dreamed just how dead set corporate lobbyists were on undermining citizens’ rights – or the lengths to which they would go to achieve their goals. Former European Commissioner Viviane Reding said it was the most aggressive lobbying campaign she had ever encountered. The European Parliament was flooded with the largest lobby offensive in its political history.

Civil society fights back

The European Digital Rights network worked together and continued to fight back. Among other things, we had to explain that data leaks are dangerous and need to be reported, and that it is not acceptable to track and profile people without their consent. We were up against the combined resources of the largest multinational corporations and data-hungry governments, but we also had two things in our favour: the rapporteur Jan Philipp Albrecht and his team were adamant about safeguarding civil rights, and in 2013 the Snowden revelations made politicians more keen on doing the same. Against all odds, we prevailed!

GDPR isn’t perfect, but it is a way forward

The General Data Protection Regulation, adopted in 2016 and applicable from 25 May, is far from perfect. As we pointed out in 2015, however, we did manage to save “the essential elements of data protection in Europe”, and we now have a tool with which to hold companies and governments that use your data to account. We are committed to doing just that. We will continue to fight for your privacy, speak out when and where necessary, and help you do the same.

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)
https://edri.org/eu-data-protection-package-lacking-ambition-but-saving-the-basics/ 

EDRi GDPR document pool
https://edri.org/gdpr-document-pool/

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands)

02 May 2018

Are GDPR certification schemes the next data transfer disaster?

By Foundation for Information Policy Research

The General Data Protection Regulation (GDPR) encourages the establishment of data protection certification mechanisms, “in particular at [EU] level” (Art. 42(1)). But the GDPR also envisages various types of national schemes, and allows for the approval (“accreditation”) of schemes that are only very indirectly linked to the national data protection authority.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

On 6 February 2018, the Article 29 Working Party (WP29) adopted Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261). On 16 February, it issued a call asking for comments on these draft guidelines. Why can this seemingly technical issue have major implications, in particular in relation to transfers of personal data to third countries without “adequate” data protection (such as the USA)?

The GDPR stipulates that, in relation to several requirements (consent, data subject rights, etc.), a data protection seal (issued at national or EU level) can be used as “an element by which to demonstrate” the relevant matters. This makes such seals useful and valuable, but still allows the data protection authorities to assess whether a product or service for which a seal has been issued really does conform to the GDPR.

However, in one context this is different: in relation to transfers of personal data to third countries without adequate data protection. Such transfers are in principle prohibited, subject to a limited number of exceptions, including where “appropriate safeguards” are provided by the controller or processor (Art. 46). In this regard, the GDPR stipulates that such appropriate safeguards “may be provided for” inter alia by:
an approved certification mechanism pursuant to Article 42 together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights (Art. 46(2)(f)).

In other words, in relation to transfers of personal data to countries without adequate data protection, certifications are conclusive: they provide, in and by themselves, the required safeguards. Indeed, the article adds that certifications can achieve this “without requiring any specific authorisation from a supervisory authority” (leading sentence to Article 46(2)).

In the highly sensitive context of data transfers, it is therefore crucial that certification schemes ensure that certifications are only issued in cases in which they really provide cast-iron safeguards, “essentially equivalent” to those provided within the European Union and the European Economic Area (EEA) by the GDPR. Otherwise, the very same problems and challenges will arise as arose in relation to the discredited “Safe Harbor” scheme and the not-much-less contestable (and currently contested) “Privacy Shield”.

Unfortunately, the GDPR does not directly guarantee that certification schemes must be demanding and set high standards. Rather, member states can choose from three types of arrangement: the relevant national data protection authority (DPA) issuing seals; the national DPA accrediting other bodies to issue seals; or leaving it to national accreditation bodies to accredit other bodies to issue seals. In the last case, the seal-issuing bodies are therefore two arms-lengths removed from the DPAs. Moreover, national accreditation bodies normally accredit technical standards bodies, for example, for medical devices or toys – they are unsuited to approve mechanisms supposed to uphold fundamental rights. This could lead to low-standard seal schemes, in particular in countries that have always been lax in terms of data protection rules and enforcement, such as the UK and Ireland.

The only safeguard against the creation of weak certification schemes lies in the criteria for accreditation of certification schemes, applied by the relevant accrediting body (which, as just mentioned, need not be the country’s DPA): those criteria must be approved by the relevant national DPA, subject to the consistency mechanism of the GDPR (which means that the new European Data Protection Board, created by the GDPR as the successor to the Article 29 Working Party, will ultimately have the final say on those criteria). But this is still rather far removed from the actual awarding of certifications.

Surprisingly, the Draft Guidelines on the accreditation of certification bodies, released by the WP29, do not include the very annex that is to contain the accreditation criteria.

To the extent that the WP29 says anything about the criteria, it plays them down: it states that the as-yet-unpublished guidelines in the not-yet-available annex will “not constitute a procedural manual for the accreditation process performed by the national accreditation body or the supervisory authority”, but will only “provide […] guidance on structure and methodology and thus a toolbox to the supervisory authorities to identify the additional requirements for accreditation” (p. 12).

As pointed out in a letter to the WP29, “the WP29 Draft Guidelines therefore fail to address the most important issues concerning certification”. The letter calls on the WP29 to:

urgently provide an opinion on the ways in which it can be assured that certification schemes will really only lead to certifications at the highest level, and in particular to ensure that certifications will not be used to undermine the strict regime for transfers of personal data from the EU/EEA to third countries that do not provide “adequate” (that is: “essentially equivalent”) data protection to that provided by the GDPR –

[and to]

urgently move towards the accreditation of (a) pan-EU/EEA certification scheme(s) at the highest level, and adopt a policy that would require controllers and processors involved in cross-border processing operations within the EU/EEA and/or data transfers to third countries without adequate data protection to seek such pan-EU/EEA certifications for such cross-border operations, rather than certifications issued by national schemes.

Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261)
http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=614486

Letter to the Article 29 Working Party
https://edri.org/files/EDRi_comments_on_WP261_re-accreditation.pdf

General Data Protection Regulation (GDPR)
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2016.119.01.0001.01.ENG&toc=OJ:L:2016:119:TOC

(Contribution by Douwe Korff, EDRi member Foundation for Information Policy Research – FIPR, United Kingdom)

02 May 2018

Facebook: Unanswered questions

By Joe McNamee

On 9 April 2018, EDRi received an invitation from Facebook to attend a meeting to discuss the loss of trust in Facebook, following the Cambridge Analytica scandal. The meeting was proposed for 26 April.

It struck us that, if Facebook wanted an honest exchange, it would be happy to answer some of the most obvious outstanding issues.

Encouragingly, Facebook said it would welcome the questions and that it would still like to hold the meeting on 26 April.

The questions were sent on 16 April and… we never heard from Facebook again…

Here they are:

1. Facebook’s new policy is based on opt-in for facial recognition being applied to inform Facebook users of their faces appearing on photos uploaded by other users. Does this mean that Facebook will index all facial profiles on any photo uploaded, regardless of any consent by any person depicted? Please answer with “yes” or “no” and explain.

1b. More specifically, will Facebook refrain from analysing any photograph uploaded by any user for biometric data about persons depicted on those photos until it has received an opt-in by every person depicted on those photos? Please answer with “yes” or “no”.

2. You state the following: “Second, we’ll ask people who’ve previously chosen to share their political, religious, and “interested in” information in their profile to check that they want to continue to share it.”

Does the above mean that any of the above data will be deleted if Facebook does not receive an explicit consent to retain it? Please answer with “yes” or “no”.

If “yes”, what will be the cut-off date before Facebook starts deleting such data?

2.b If by “sharing” it is meant that the scope of the discontinuation is limited to sharing with other Facebook users and/or Facebook affiliates, how does Facebook consider that this complies with the requirements of art. 9 GDPR for processing these special categories of data?

3. Privacy International created a new Facebook profile to test default settings. By default, everyone can see your friends list & look you up using the phone number you provided. This is not what proactive privacy protection looks like. How does this protect users by design and by default?

4. According to your notification, a “small number of people who logged into ‘This Is Your Digital Life’ also shared their own News Feed, timeline, posts and messages which may have included posts and messages from you”. Why was this not notified to the appropriate national authorities immediately? Are other apps also able to share / receive messages from me?

5. If a similar situation to the one involving Cambridge Analytica were, despite your efforts, to arise again, who would be responsible, Facebook Inc or Facebook Ireland?

6. Why do privacy settings continue to only focus on what friends can & can’t see? If the recent FB scandal has showed one thing, it is that FB’s ad policies have far-reaching consequences for users’ privacy. When are you going to treat ad settings as privacy settings?

7. The GDPR includes new provisions on profiling and automated decision-making. How are you going to change your ad targeting practices to be compliant?

8. The Economist recently reported on how difficult it is for Europeans to download their personal data from Facebook, and Mark Zuckerberg’s testimony described your systems as more transparent than they actually are. How and when, if at all, do you plan to address these issues?

9. You claim to offer a way for users to download their data with one click. Can you confirm that the downloaded files contain all the data that Facebook holds on each user?

You claim to offer a single place to control your privacy. This does not seem to include ways to opt out of ad targeting or to avoid being tracked outside Facebook. Will you offer a single place where users can control every privacy aspect of Facebook, even for people who have no Facebook account?

10. The GDPR gives individuals the right to access and verify their profiles, including marketing profiles based on so called derived data (data that were not disclosed by the user but interpreted from his/her behaviour). Is Facebook going to give its users full access to their marketing profiles? Please answer with “yes” or “no” and explain.

11. Speaking about derived data and marketing profiles, does Facebook process for marketing purposes any data that reveal (directly or indirectly) political opinions of its users? Please answer with “yes” or “no” and explain.

12. Do Facebook apps use smartphone microphones in any way, without this being made clear to the user? If this were to happen, would you consider that lawful?

13. Facebook has voluntary agreements with the Swedish intelligence services to share data. How do you reconcile that with the GDPR?

We are expecting Facebook’s answers any day now…maybe not today, maybe not tomorrow, but soon. If not, we’ll always have Cambridge.

(Contribution by Joe McNamee, EDRi)

18 Apr 2018

Fighting for migrants’ data protection rights in the UK

By Guest author

Since 2014, the United Kingdom (UK) government has steadily rolled out policies to make the country a “hostile environment” for migrants, in the words of Prime Minister Theresa May.

This has involved turning various ordinary institutions into border protection agencies. Banks have to collect and supply data to the Home Office (the UK’s interior ministry) on their customers’ immigration status. Landlords are required to check immigration documents before rental. Schools were checking pupils’ nationality and also sharing information with the Home Office, before a boycott campaign put an end to the practice in April 2018. Hospitals, too, must process immigration paperwork before they can deliver any non-urgent treatment. The police, in some regions, are piloting a handheld biometric ID device that instantly gives street officers access to an immigration database.

In the “hostile environment”, migrants are losing the right to live free of pervasive monitoring. They’re also losing the right to basic data protection. This is particularly evident in the case of a data-sharing agreement between the National Health Service (NHS), the Department of Health, and the Home Office. This agreement, established through a Memorandum of Understanding (MoU) in late 2016, without any consultation of professionals or the public, allows immigration enforcement officers to request patient data held by NHS Digital, the database manager for public health in the UK.

The Migrants’ Rights Network (MRN) has been at the forefront of civil society responses to this scheme. MRN, together with Doctors of the World UK, Docs not Cops (a group of professionals resisting the implementation of “hostile environment” measures in the health sector), and civil rights organisation Liberty, argues that sharing data between health services and immigration control officers violates migrants’ fundamental right to patient confidentiality. Such a breach of fundamental privacy rights is all the more worrying given that the Home Office has error margins of 10 percent in its decisions to target “immigration offenders” – meaning it would routinely request data for the wrong individuals.

Crucially, introducing the possibility that health services might hand over patient data to the Home Office will make many vulnerable migrants afraid to seek care. This is already a reality. During a parliamentary hearing in January 2018, elected representatives heard the tragic story of an undocumented domestic worker who avoided treatment out of fear that she could be deported, and died of otherwise preventable complications.

MRN argues that such a situation dismantles the very principles of public health, starting with duty of care and public trust in health providers. The Home Office and NHS Digital have denied this, and argue that data-sharing for immigration enforcement is “in the public interest.” Yet the only other reason NHS Digital normally supplies confidential patient data to the Home Office is in the case of serious crime, such as child abuse or murder. By putting immigration and serious crime on a similar level, this data-sharing arrangement contributes to the dramatic criminalisation of undocumented existence (already exemplified in everyday language by the expression “illegal migrant”).

The UK Parliament’s Health Committee and the British Medical Association have both asked for data-sharing to stop. The Home Office have responded by saying they need to gather more evidence of the scheme’s impact, which could take more than a year. MRN believe this is unacceptable, as lives are currently at risk. MRN is thus challenging the data-sharing agreement in court. The organisation has obtained permission for judicial review (after appeal), likely to take place during the summer 2018, and is currently raising funds to cover its potential court costs.

MRN’s legal challenge is rooted in a desire to protect public health principles and vulnerable lives, but it also has broader implications for data protection in the UK. It aims to send a clear signal that data rights cannot be stripped on the basis of nationality. This is absolutely crucial at a moment when the UK’s latest data protection law, currently being debated in Parliament, includes an exemption clause for immigration enforcement, which would prevent migrants from exercising their full rights under the EU General Data Protection Regulation (GDPR). MRN thus hopes to set a positive precedent for judicial activism on these matters, and make a strong case for non-discrimination as a pillar of data justice.

Against Borders for Children campaign: We won! DfE are ending the nationality school census!
https://www.schoolsabc.net/2018/04/we-won/

Crowdjustice fundraiser: Stop data-sharing between the NHS and the Home Office
https://www.crowdjustice.com/case/stopnhsdatasharing/

Making the NHS a ‘hostile environment’ for migrants demeans our country (24.10.2017)
https://www.opendemocracy.net/ournhs/erin-dexter/making-nhs-hostile-environment-for-migrants-demeans-our-country

‘Hostile environment’: the hardline Home Office policy tearing families apart (28.11.2017)
https://www.theguardian.com/uk-news/2017/nov/28/hostile-environment-the-hardline-home-office-policy-tearing-families-apart

NHS accused of breaching doctor-patient confidentiality for helping Home Office target foreigners (09.11.2017)
https://www.independent.co.uk/news/health/home-office-nhs-data-sharing-patients-human-rights-court-challenge-a8045011.html

Migrants’ Rights Network granted permission for judicial review of patient data-sharing agreement between NHS Digital and the Home Office (01.03.2018)
https://www.matrixlaw.co.uk/news/migrants-rights-network-granted-permission-legally-challenge-data-sharing-agreement-nhs-digital-home-

MRN legal challenge against NHS data-sharing deal (29.11.2017)
https://migrantsrights.org.uk/blog/2017/11/09/mrn-legal-challenge-nhs-data-sharing-deal-press-release/

(Contribution by Fabien Cante, LSE Media & Communications / Migrants’ Rights Network, the United Kingdom)

18 Apr 2018

Privacy at ICANN: WHOIS winning?

By Guest author

The Internet Corporation for Assigned Names and Numbers (ICANN) has struggled over the publication of the name, address, phone number, and email address of domain name registrants since its inception in 1998. That registry is called WHOIS.

WHOIS might have worked well during the 1980s when only a few researchers had domain names, but now it exposes millions of individuals to harassment and spam. So far, neither the efforts of civil society who volunteer at this multi-stakeholder organisation (notably the Noncommercial Users Constituency), nor the repeated interventions of the Data Commissioners of the world have had a lot of impact. However, there is a huge struggle going on now over compliance with the European General Data Protection Regulation (GDPR). Registrars who collect registrant data and provide it according to their contracts with ICANN have obtained legal advice that indicates they are vulnerable to significant fines.
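Part of the problem is how trivially the WHOIS directory can be queried in bulk: the underlying protocol (RFC 3912) is simply a TCP connection to port 43 of a WHOIS server, sending the domain name followed by CRLF, and reading back a plain-text reply with no authentication or rate control of its own. The following minimal Python sketch illustrates this; the helper names are ours, and whois.iana.org is used only as an example referral server.

```python
import socket

def build_whois_query(domain: str) -> bytes:
    """A WHOIS query is just the object name followed by CRLF (RFC 3912).
    IDNA encoding handles internationalised domain names."""
    return domain.encode("idna") + b"\r\n"

def whois_lookup(domain: str, server: str = "whois.iana.org",
                 timeout: float = 10.0) -> str:
    """Connect to port 43, send the query, and read until the server closes
    the connection. The reply is free-form text; registrant contact details,
    where published, appear in this unauthenticated response."""
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall(build_whois_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")
```

Because the response is unstructured text served to anyone who connects, any access restrictions have to be imposed by policy rather than by the protocol itself – which is what the tiered access debate described below is about.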

ICANN continues to try to maintain a registrant directory that permits the continued access of many third parties, notably law enforcement agencies, trade mark and copyright holders, and private sector cybercrime investigators and reputational “blacklisters”. There has been a flurry of activity to address long-neglected privacy rights, and CEO Goran Marby has been asking for advice from the Article 29 Working Party. They answered on 11 April 2018 in a letter which was quite clear about ICANN’s failure to comply.

According to the Non-Commercial Stakeholder Group (NCSG), key issues that remain are:

  1. There is no multistakeholder process at the moment: in recognition of this, the work which was going on in the WHOIS policy development process has been temporarily suspended. The CEO and the Board will make a decision, claiming it to be based on advice from the Article 29 Working Party and on “community input”. That interim policy is good for a year, during which time the community can propose changes through a normal policy development process. Once the year is over (and the process takes a couple of months in itself to vote through a policy) the interim policy will become the final policy unless there is an agreed replacement. Given the recent history of the Registration Directory Services Policy Development Process (RDS PDP), it is highly unlikely that consensus to change the interim solution would be achieved in less than a year. This appears to be an abandonment of the multi-stakeholder process, and requires close scrutiny. A multi-stakeholder process needs to remain in place to reach some kind of consensus on the biggest policy debate that ICANN has confronted in its history.
  2. The purpose of the collection, use and disclosure of registrant data is being construed to include feeding the third party actors who have always had free access to the data (in the NCSG view, often illegally).
  3. The issue of public safety and consumer protection as a reason to permit widespread access to data is unsupported by recent accurate data.
  4. The risks to individuals and small organisations have never been measured.
  5. The proposed tiered access model depends for its efficacy on a serious accreditation process. Because there is no time to develop one before 25 May, the day the General Data Protection Regulation becomes applicable, an interim self-accreditation process is proposed. There may then be no appetite to work on proper standards that engage the data protection authorities, and the interim solution will simply expose individuals to marketing, domain expropriation, spam, and risk from political adversaries. Self-accreditation risks setting up an anti-competitive regime where registrant data is held by dominant players.
  6. ICANN is still not clear as to whether it regards itself as a data controller, although a long-serving member of the ICANN community challenged it publicly on this matter at the ICANN61 meeting in March 2018. It has also thus far refused to appoint a privacy officer for any registrant data related issues. What is clear to the NCSG is that ICANN is the only contracting party who has access to all escrowed data of registrants, and that it sets the terms for that escrow arrangement. It also sets the terms for the contracts with registries and registrars, and enforces compliance through the Global Domains Division (compliance branch). It is worth noting that one of the recommendations of the business community proposal is that ICANN must retain access to all registrant data at all times, whatever the solution selected.
  7. For those not following the GDPR closely, the issue of who is the controller may be extremely important in terms of liability.
  8. NCSG is working on a standards development project led by a University of Toronto team, to develop proper accreditation standards for third parties to whom personal data is released by data controllers and processors. There must be strong management practices in place to ensure that the entities asking for the data are indeed who they say they are, and that their purported reasons to request the data are legitimate, limited, and proportionate. There should also be standards to ensure proper safeguarding and eventual destruction of the data, and access rights for individuals, as well as transparency except in exceptional circumstances. The Article 29 Working Party released a paper in February detailing their expectations and their own involvement in the accreditation of various processors under the GDPR; this standards proposal is working in the same vein, to explore what best management practices look like.

Working Paper International Working Group on Data Protection in Telecommunications
https://www.datenschutz-berlin.de/working-paper.html

Working Paper on Privacy and Data Protection Issues with Regard to Registrant data and the WHOIS Directory at ICANN (27-28.11.2017)
https://www.datenschutz-berlin.de/pdf/publikationen/working-paper/2017/2017-IWGDPT_Working_Paper_WHOIS_ICANN-en.pdf

Non-Commercial Stakeholder Group (NCSG) Positions on Whois Compliance with GDPR (16.04.2018)
https://community.icann.org/x/aoL3B

ICANN: Data Protection/Privacy – Latest Announcements, Updates & Blogs
https://www.icann.org/resources/pages/data-protection-announcements-2017-12-08-en

ICANN Receives Data Protection/Privacy Guidance from Article 29 Working Party (12.04.2018)
https://www.icann.org/news/announcement-2018-04-12-en

(Contribution by Stephanie Perrin, University of Toronto, NCSG Councilor)

18 Apr 2018

Cambridge Analytica access to Facebook messages a privacy violation

By Gemma Shields

Less than one month after Cambridge Analytica whistleblower Christopher Wylie exposed the abuse of (so far) 87 million Facebook users’ data, Facebook co-founder, Chairman, and CEO Mark Zuckerberg testified before the US Congress.

On 10 and 11 April, Zuckerberg provided testimony in a joint hearing of the Senate Judiciary and the Senate Committee on Commerce, Science, and Transportation, and then to the House Energy and Commerce Committee. He faced questions on a number of democracy-disrupting and privacy-violating issues to which the social media giant has been a party, not least the composition – and use – of personally identifiable data as part of the Facebook-Cambridge Analytica scandal.

This scrutiny gave rise to uncertainty over what Facebook user data Cambridge Analytica had access to, and just what this personal data comprised. What began as the personality app “This is Your Digital Life”, designed by researcher Aleksandr Kogan and installed by 270 000 Facebook users (which in turn provided access to the data of at least 87 million users), resulted in data consulting firm Cambridge Analytica having access to the private inbox messages of users.

This revelation, whilst part of the unfolding exposé, was confirmed in the notifications that began appearing at the top of users’ News Feeds, which read: “a small number of people who logged in to ‘This is Your Digital Life’ also shared their own News Feed, timeline, posts, and messages which may have included posts and messages from you.”

With a global reach, the scandal has implications for users worldwide. In the European Union, such access to personal data would be prohibited by the proposed ePrivacy Regulation. Current ePrivacy rules on access to the content of communications do not cover Facebook, although this would change under the proposed ePrivacy Regulation.

So far, lobbyists from Facebook and its allies have successfully lobbied Member States in the EU Council to slow down the adoption of the new Regulation – and not even this scandal has been able to persuade EU Ministers (many of whom signed a letter arguing that our fundamental rights should be “balanced” against “digital products and services”) that Facebook’s access to private communications needs to be restricted.

On how such abuse could happen, a Facebook spokesperson said: “In 2014, Facebook’s platform policy allowed developers to request mailbox permissions but only if the person explicitly gave consent for this to happen. At the time when people provided access to their mailboxes – when Facebook messages were more of an inbox and less of a real-time messaging service – this enabled things like desktop apps that combined Facebook messages with messages from other services like SMS so that a person could access their messages all in one place. According to our records only a very small number of people explicitly opted into sharing this information. The feature was turned off in 2015.”

The conditions for consent – as per Article 7 of the General Data Protection Regulation (GDPR) – cannot have been met, however; in particular, the explicit consent of 87 million users to the access to and repurposing of their personal data was never obtained.

Users can check if their personal data was harvested and misused by Cambridge Analytica here: https://www.facebook.com/help/1873665312923476?helpref=search&sr=1&query=cambridge

Transcript of Zuckerberg’s appearance before the House committee (11.04.18)
https://www.washingtonpost.com/news/the-switch/wp/2018/04/11/transcript-of-zuckerbergs-appearance-before-house-committee/

Facebook scandal: I am being used as scapegoat – academic who mined data (21.03.18)
https://www.theguardian.com/uk-news/2018/mar/21/facebook-row-i-am-being-used-as-scapegoat-says-academic-aleksandr-kogan-cambridge-analytica

Revealed: Aleksandr Kogan collected Facebook users’ direct messages (13.04.18)
https://www.theguardian.com/uk-news/2018/apr/13/revealed-aleksandr-kogan-collected-facebook-users-direct-messages

Cambridge Analytica Could Have Also Accessed Private Facebook Messages (10.04.18)
https://www.wired.com/story/cambridge-analytica-private-facebook-messages/

How can I tell if my info was shared with Cambridge Analytica?
https://www.facebook.com/help/1873665312923476?helpref=search&sr=1&query=cambridge

(Contribution by Gemma Shields, EDRi intern)

07 Feb 2018

Data protection – time for action

By Anne-Morgane Devriendt

On 24 January 2018, the European Commission (EC) published a Communication on the implementation of the General Data Protection Regulation (GDPR), which will apply from 25 May 2018: “Stronger protection, new opportunities”.


The Communication describes the preparatory work done by the Commission to help with the implementation of the GDPR, and what the Commission plans to do to help Member States and companies comply with the new data protection framework.

Most of the work at the EU level has been done by the group of Data Protection Authorities (the so-called Article 29 Working Party). It has been preparing guidelines, on the basis of extensive consultations and workshops with a variety of stakeholders. More work still needs to be done in order to ensure the effective implementation of the new rules.

Although the GDPR is a Regulation and therefore applies “as is” in all Member States, some national legislation needs to be adapted to the new obligations set by the GDPR, especially regarding the flexibilities that the Regulation leaves to Member States’ discretion, on automated decision-making and the transfer of personal data to third countries, among other things. Ironically, while industry demanded harmonisation at the start of the legislative process, it spent most of the decision-making process demanding national flexibilities and exceptions, leading to the opposite outcome to the one it initially asked for. Sometimes, one is left with the impression that lobbyists are working to create work for themselves.

At the moment of publication of this Communication, just four months before the GDPR becomes applicable, only two out of 28 Member States (Austria and Germany) have finished this legislative preparation. It also remains to be clarified how Member States will ensure that national Data Protection Authorities (DPAs) are given the means to fulfil their new functions as prescribed by the GDPR.

Finally, the Communication stresses that the core principles of data protection are not affected by the new Regulation. As a result, few changes are needed from organisations that already comply with the existing Data Protection Directive. However, the Commission notes that citizens and small and medium-sized companies are not well informed about the provisions of the GDPR. It has therefore published guidelines on the new rules for businesses and on the rights of citizens.

One cannot help but wonder why neither Member States nor companies seem to be prepared for new legislation that has been discussed since the adoption of the Commission’s initial Communication in November 2010 and in the four years of legislative discussion, that were shaped by an unprecedented lobbying campaign by parts of the industry. This ostensible lack of preparedness is also surprising bearing in mind that the Regulation does not change existing core principles that should already be respected by controllers through the transposition (and enforcement) of the Data Protection Directive into national law since 1998.

Communication from the Commission (24.01.2018)
https://ec.europa.eu/commission/sites/beta-political/files/data-protection-communication-com.2018.43.3_en.pdf

Commission’s GDPR guidelines for citizens and small and medium companies
https://ec.europa.eu/commission/priorities/justice-and-fundamental-rights/data-protection/2018-reform-eu-data-protection-rules_en

PROCEED WITH CAUTION: Flexibilities in the General Data Protection Regulation (05.07.2016)
https://edri.org/analysis-flexibilities-gdpr/

General Data Protection Regulation: Document pool
https://edri.org/gdpr-document-pool/

(Contribution by Anne-Morgane Devriendt, EDRi intern)

07 Feb 2018

The Bulgarian EU Council presidency & the latest assault on ePrivacy

By Anne-Morgane Devriendt

In January 2018, the Bulgarian Presidency of the Council of the European Union (EU) picked up where the Estonian Presidency left off on the ePrivacy Regulation. It issued two examinations of the last Estonian “compromise” proposal and asked national delegations for guidance on some issues. Together, the documents cover most of the key points of the text. While the Bulgarian Presidency brings clarity on some points, its questions pave the way to undermining the text – and therefore threaten the protection of citizens’ privacy, the confidentiality of communications of both citizens and businesses, as well as the position of innovative EU companies and trust in the online economy.


One of the main lobbying devices used against the ePrivacy proposal is its alleged redundancy, given that the General Data Protection Regulation (GDPR) becomes applicable in May 2018: if the processing of personal data is already covered by the GDPR, why would an additional text be needed? The Bulgarian Presidency addresses this question by clarifying the ePrivacy Regulation’s role as lex specialis of the GDPR. Effectively, the ePrivacy Regulation complements the GDPR, and where the two texts overlap, ePrivacy applies, as it provides for a higher level of protection of communications data, which are sensitive data.

On privacy settings, covered by Article 10, the Bulgarian Presidency proposes either to keep the choices presented by the Estonian Presidency, providing for privacy by default and an easy way to change the settings, or to require more granularity in the settings by blocking the storage or processing of data by third parties. This offers users a degree of control over third-party activities on their devices.

After this welcome clarification on this (rather simple) issue and this relatively privacy-friendly proposal, the Bulgarian Presidency then follows up on the undermining of the text already initiated by the Estonian Presidency in December 2017.

In the second document, which deals with the third Chapter of the proposal on the “rights to control electronic communications”, the Bulgarian Presidency mostly follows the Estonian proposal, except for publicly available directories. There, it proposes either to put obligations on both the providers of number-based communication services and publicly available directories, or to harmonise the rules with either an opt-in or a right to object. As for direct marketing, the Bulgarian Presidency asks the national delegations to give their opinion on the need for uniform rules on voice-to-voice calls.

The Bulgarian Presidency also asks the national delegations to choose between two proposals concerning permitted processing of communications data (provided for in Article 6): a supposed middle ground that would allow further processing if it has no impact on privacy; or the inclusion of a “legitimate interest” ground for further processing of metadata. It is hard to understand what kind of further processing of communications data – or metadata – would not impact privacy (not least following the latest revelations of security breaches due to “non-personal” data), or how there could be a “legitimate interest” for the further processing of communications metadata, not least due to contrary positions already taken by the Court of Justice of the European Union in the Tele2 case.

On storage and erasure of electronic communications data, regarding data that is no longer needed to provide a service, the Bulgarian Presidency proposes to either delete the provisions on the deletion of data, or to keep them while deleting the provisions authorising recording or storage of the data by the end-user or a third-party entrusted by them. The first possibility would remove the protection of communication data at rest – ironically creating, at the request of industry lobbyists, the kind of incoherence between ePrivacy and the GDPR of which industry lobbyists have been warning. The second would keep the level of protection agreed upon by the European Parliament.

The worst attack by the Bulgarian Presidency on the text concerns the protection of terminal equipment (Article 8). In addition to the proposals put on the table by the Estonian Presidency, the Bulgarian Presidency proposes different exemptions from the need for consent for the processing of data from an individual device: for “non-privacy intrusive purposes”, or based on a “harm-based approach” that would consider the levels of impact of different techniques on privacy. It also proposes to couple the addition of a “legitimate interest to deliver targeted advertisement” with a right to object, and even asks whether the text should cover “access to services in the absence of consent to process information”. Again, it is hard to see how there could be a “legitimate interest to deliver targeted advertisement”, and how this would contribute to the protection of privacy. Such a convoluted legal construction would, in any event, be usable only by the largest targeted (or “surveillance”) advertising companies. If this approach is followed, the EU would end up with one piece of legislation (ePrivacy) making it easier to access data on a computer system, and another (the Directive on attacks against computer systems, 2013/40/EU) criminalising access to a computer system.

Although the Bulgarian Presidency did take a progressive stance on the links between the GDPR and ePrivacy, the rest of its proposals systematically undermine the text by lowering the level of protection of communications and privacy.

ePrivacy Regulation proposal – Examination (1) of the Presidency discussion paper (11.01.2018)
http://data.consilium.europa.eu/doc/document/ST-5165-2018-INIT/en/pdf

ePrivacy Regulation proposal – Examination of Articles 12 to 16 (25.01.2018)
http://data.consilium.europa.eu/doc/document/ST-5569-2018-INIT/en/pdf

Latest proposal by the Estonian Presidency (05.12.2017)
http://data.consilium.europa.eu/doc/document/ST-15333-2017-INIT/en/pdf

ePrivacy proposal undermined by EU Member States (10.01.2018)
https://edri.org/eu-member-states-undermine-e-privacy-proposal/

(Contribution by Anne-Morgane Devriendt, EDRi intern)

10 Jan 2018

ePrivacy proposal undermined by EU Member States

By EDRi

The discussions on the ePrivacy Regulation continue in the European Union (EU) legislative process. They were on hold for a few weeks because of ongoing negotiations on the European Electronic Communications Code (EECC) – another big “telecoms” file that the Council of the European Union is working on.


On 5 December 2017, the Estonian Presidency of the Council proposed new compromises on key articles. This latest proposal for amendments is related to Articles 6, 7 and 8 of the draft ePrivacy Regulation, which concern permitted processing (Art. 6), storage and erasure of communications data (Art. 7) and the protection of stored communications in users’ devices (Art. 8).

Permitted processing

The provisions on permitted processing cover the circumstances under which electronic communications data may be processed.

The Estonian Presidency text suggests a few adaptations to bring it in line with the General Data Protection Regulation (GDPR), by including the legal ground of vital interest in Article 6(2)(d) and a new recital 17a, as well as provisions for accessibility in Article 6(3)(aa) and the new recital 19a. As currently designed, these additions should not create any additional risks for privacy.

Much more concerning is the addition, in Article 6(2)(e) and a new recital 17b, of a legal ground for scientific research and statistical purposes, similar to the one in Article 9(2)(j) of the GDPR (although, unlike archiving, the research need not be “in the public interest”). The text of the recital and the Article states that this “type of processing should be subject to further safeguards to ensure privacy of the end-users by employing appropriate security measures such as encryption and pseudonymisation.” The use of “such as” means that these are mere possibilities, not requirements. On top of that, a lot of flexibility would be given to Member States, since these measures must be “based on Union or Member State law, which shall be proportionate to the aim pursued and provide for specific measures”. This creates risks for privacy, for security, and for the economic benefits generated by a more predictable, harmonised measure.

Storage and erasure

The provisions on storage and erasure cover what protection should apply to different types of data and the deletion of data that is no longer needed to perform a service.

On storage and erasure, the Estonian Presidency “invites delegations to reflect on the need for” Article 7(1), which ensures protection of communications data when it is at rest (i.e. stored in the provider’s network). Not including the protection of communications data at rest in the ePrivacy Regulation would mean that an e-mail is sensitive data subject to the standards of the ePrivacy Regulation while being transmitted but, upon arrival in the (online) mailbox of the provider, suddenly becomes subject only to the General Data Protection Regulation. This would create the option of processing the content as non-sensitive data under the “legitimate interest” exception in the GDPR, facilitating invasive surveillance of content of the kind previously carried out by Gmail. Bizarrely, businesses lobby both for clear, predictable rules and for unclear, unpredictable rules like this one.

Protection of terminal equipment

The provisions on the protection of terminal equipment cover the rules for storing data on, or using data from, an individual’s communications device.

As regards terminal equipment, recital 21 adds precision on the use of cookies. Cookies can be used for both tracking and non-tracking purposes. The text recognises “authentication session cookies used to verify the identity of end-users engaged in online transactions” as legitimate, as well as some audience measurement. However, Articles 8(1)(d) and 8(2)(c) authorise audience measurement “by a third party on behalf of the provider of the information society service” and statistical counting without making pseudonymisation mandatory. This would facilitate the kind of cross-platform data processing done by, for example, Google Analytics.

Recital 21 and Article 8(1)(e) also allow for the installation of security updates without the consent of the end-user, provided that they are necessary, that the user is aware of them, and that the user can delay them. While security updates are particularly important to protect users from attacks or breaches, consent should remain the sole legal basis for any processing linked to accessing terminal equipment. That way, instead of merely being told that a “security update” is being installed on your phone, computer or other connected device, the software provider would have an incentive to be more transparent and give you more information on the update and what it is for.

Although not every proposed amendment threatens fundamental rights, the Estonian Presidency proposed to broaden the scope of the exceptions in significant ways: it suggested authorising some processing that goes beyond what is strictly necessary, not keeping consent as the sole legal basis, and not putting in place strong safeguards to limit the impact of this broadening on privacy. This weakening of protections and predictability brings us closer to the kind of security and privacy chaos that the United States is experiencing. It would without doubt create the “chill on discourse and economic activity” that the failure to implement privacy and security measures has caused in the US. But at least Facebook and Google will be happy.

Presidency text leaked by Austrian government (05.12.2017)
https://www.parlament.gv.at/PAKT/EU/XXVI/EU/00/43/EU_04355/imfname_10770009.pdf

Presidency text (05.12.2017)
http://data.consilium.europa.eu/doc/document/ST-15333-2017-INIT/en/pdf

e-Privacy: what happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

e-Privacy revision: Document pool
https://edri.org/eprivacy-directive-document-pool/

(Contribution by Anne-Morgane Devriendt and Diego Naranjo, EDRi)
