13 Jun 2018

ePrivacy for Children: What is Data Protection Culture?

By Alternatif Bilisim

The General Data Protection Regulation (GDPR) attracted widespread attention and comment in recent weeks when it came into force on 25 May 2018. Having taken several years to get from being proposed by the European Commission to entering into force, the GDPR has been designed as a concerted, holistic and unifying effort to regulate personal data protection in the digital age.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

At a time when many public, private and third sector organisations have only recently ‘gone digital’ and when data has very rapidly come to be seen as ‘a new currency’, the scope of application of the GDPR is vast. Serious fines can be applied to firms that do not abide by the new rules. This is no coincidence, of course; the recent Cambridge Analytica and Facebook violations of privacy forced the public debate to grow, and with it awareness of what is at stake.

It is not only the scandals on the surface that have piqued the interest of the average user, though; the capital and energy spent on the data-gathering fetish of social media platforms is also a key determinant of the process. The right to erasure is also more easily applicable from now on, signifying more meaningful control over data and the erosion of the post-capitalist surveillance society. However, in the decade of tl;dr (too long; didn’t read) and post-truth, this type of detailed regulation might be a little too complicated to understand for internet users of all ages.

Through the lens of a researcher-mother, one is quickly struck by the image of the hyper-socialised millennial generation on massive platforms like Facebook and Instagram. The GDPR brings special conditions for children’s data. Yet living in Turkey with your child right beside you is no comfort; you are still spending 16+ hours of your day connected to the internet.

The GDPR makes some specific requirements in respect of children’s data, for reasons set out in recital 38: “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counseling services offered directly to a child.”

While this statement has much merit, it is only an explanatory recital, guiding implementation of the GDPR but lacking the legal force of an article. In a recent London School of Economics Media Policy Project roundtable, it became clear that there is considerable scope for interpretation, if not confusion, regarding the legal basis for processing (including, crucially, when processing should be based on consent), the definition of an information society service (ISS) and the meaning of the phrase “directly offered to a child” in Article 8 (which specifies a so-called “digital age of consent” for children), the rules on profiling children, how parental consent is to be verified (for children younger than the age of consent), and when and how risk-based impact assessments should be conducted (including how they should cover intended or actual child users). It is also unclear in practice just how children will be enabled to claim their rights or seek redress when their privacy is infringed.

Already there are some surprises. WhatsApp, currently used by 24% of UK 12-15 year olds, announced it will restrict its services to those aged 16+, regardless of the fact that in many countries in Europe the digital age of consent is set at 13. Instagram is now asking its users if they are under or over 18 years old, perhaps because this is the age of majority in the United Nations Convention on the Rights of the Child (UNCRC)? We will see how things will unfold in the coming months.

In the meantime, Sonia Livingstone of the London School of Economics has made a few suggestions in the light of a new project exploring how children themselves understand how their personal data is used, and how their data literacy develops between the ages of 11 and 16: (1) conducting focus group research with children; (2) organising child deliberation panels to formulate child-inclusive policy and educational/awareness-raising recommendations; and (3) creating an online toolkit to support and promote children’s digital privacy skills and awareness. The young generation reminds us once again of the responsibility for creating a commons data culture at grassroots level.

Do such changes mean effective age verification will now be introduced (leading to social media collecting even more personal data?), or will the GDPR become an unintended encouragement for children to lie about their age to gain access to beneficial services, as part of their right to participate? How will this protect them better? And what does this increasingly complex landscape mean for media literacy education, given that schools are often expected to overcome regulatory failures by teaching children how to engage with the internet critically? In the case of Turkey, teachers’ digital literacy skills need a serious and rapid boost, and, even more fundamentally, policies regarding internet governance and community education must be redrafted.

Translated from the original text by Asli Telli Aydemir, Alternative Informatics (Alternatif Bilisim)
You can read the original text in Turkish here.

Read more:

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)

EDRi General Director Joe McNamee live interview on TRTWorld

GDPR Explained Campaign

Time to Disagree Campaign

EDRi’s privacy for kids booklet

(Contribution by Alternatif Bilisim, EDRi member, Turkey)



31 May 2018

Xnet: Opposing guarded access to institutional information

By Xnet

In their fight for free access to information and data protection, Spanish EDRi member Xnet contacted the Spanish Data Protection Agency (AEPD). As the AEPD is the institution responsible for the implementation of the General Data Protection Regulation (GDPR) in Spain, Xnet brought up questions about the compliance of the agency’s work with the new regulation.

You can read Xnet’s letter below:

“To whom it may concern,

We have two questions:

1. In order to offer information that should be publicly accessible, you ask for all the personal details of those requesting said information. Does this not clash with article 5 of the GDPR, which states that only the data necessary for performance of the task should be obtained? It is our understanding that for the task of offering information on topics in the public domain, you do not need any data.

As a specific example, in order to ask you this very question, we had to write to you using our electronic certificate, which means you have access to our personal data. Since, as we understand it, your task is to answer these questions for whomever asks, it should have been possible to ask them without you needing to know who we are. Is this not the case? Yet you offer no information by email or by telephone, only in response to communications using electronic certificates.

If our understanding is not correct, we would kindly ask you to send us the legal articles that corroborate your interpretation.

2. We do not understand why the Spanish Data Protection Agency, which as previously mentioned is highly demanding with individuals, does not use HTTPS (Hypertext Transfer Protocol Secure) by default in its digital spaces. This leaves the data of those who access your websites vulnerable. We would like to know the reason for this.

Thank you for your attention.”



(Contribution by Xnet, EDRi member)

Read more:

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)

30 May 2018

Your ePrivacy is nobody else’s business

By Maria Roson

The right to privacy is a fundamental right for every individual, enshrined in international human rights treaties. This right is particularly threatened by political and economic interests, which have a deep impact on freedom of expression, democratic participation and personal security. The recent Facebook-Cambridge Analytica scandal is a perfect example of the risks that privacy breaches pose to individuals’ rights.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

Under the excuse of providing customers with “a better service”, companies often ask, unnecessarily, to exploit users’ communications data and to track them online. In practice, these “requests” often leave users without any real possibility of refusing, as refusal would mean not being allowed to use the service. This is what EDRi member Bits of Freedom calls “tracking walls”. To protect citizens from this and other abusive practices, EU-level rules have been developed, namely the ePrivacy Directive. This Directive was adopted in 2002 and revised in 2009. Now, a new proposal for an ePrivacy Regulation is on the table.

The protection of the right to privacy online in the ePrivacy Regulation should be at the centre of the EU’s priorities. For this reason, it is important to be aware of the most sensitive issues concerning ePrivacy, in order to be able to identify when citizens’ rights could be at risk:


Consent

Consent is one of the ways to allow your data to be used legally. Through free and informed consent, users agree that a company may access specific personal information for a specific purpose. Consent drives the trust that is needed for new services, but it needs to be meaningful: it must be freely given, specific, informed and explicit, and not merely the only choice available. For example, accepting abusive permissions “required” by an app, when the only alternative is not using the app at all, is not a valid form of consent.

Legitimate interest

“Legitimate interest” means that under exceptional circumstances it would be legal to access personal data without the user’s consent. Communications data – your emails, calls over the internet, chats, and so on – must be treated as sensitive data, as stated by the Court of Justice of the European Union (CJEU). The “legitimate interest” exception allows only the use of non-sensitive data – such as an email address or a telephone number – therefore communications data cannot, logically or legally, be processed under this exception. For this reason, companies should under no circumstances be allowed to monetise or otherwise exploit sensitive communications without specific permission.

Given that the scope of the ePrivacy Regulation deals with sensitive data, the legitimate interest exception has no place in it. Any such exception would fatally undermine users’ control over this information. Moreover, it would affect freedom of expression, as users would fear having their communications monitored by companies without their consent.

Offline tracking

Offline tracking is a highly intrusive technology which involves tracking individuals through their electronic devices. The location of your device can be used for unlawful purposes involving sensitive data, revealing personal information about users, particularly when they are in the vicinity of – or inside – various services or institutions. The European Commission has proposed to allow this offline tracking as long as the individual is notified. However, obtaining this information by tracking individual citizens poses severe privacy risks and possibilities for abuse, including the risk of mass surveillance by commercial or law enforcement entities. For these reasons, any update of the ePrivacy rules must consider less intrusive ways to obtain location-based information.

Privacy by design and by default

In the same way that you expect to use a microwave oven without having to think about the risk of starting a fire in your house, your connected devices should protect your privacy by design and by default. Privacy by design is the principle by which a high level of user privacy protection is incorporated at all stages of a device’s creation. Privacy by default means that our devices are set to protect our data, with options to change this if we wish to do so. As the ePrivacy Regulation will be the main framework protecting your communications online, it is important that hardware and software (not only browsers) are designed, at all stages, to protect the privacy of individuals by default, and not as an option.

The ePrivacy Regulation is currently being revised in the Council of the European Union, and there is an aggressive lobbying campaign to influence the Regulation to allow big business to exploit personal data more easily. If successful, this would make the Regulation less favourable to protecting citizens and their privacy online – the very purpose of the Regulation. Among the arguments promoted by the lobbyists are that ePrivacy is bad for democracy and for media pluralism, and that it prevents the fight against illegal content. (None of these arguments is actually linked to protecting privacy.) We have busted these myths, as well as the rest of the most common misconceptions related to ePrivacy. You can read more here: edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

Being aware of what is at risk is the best way to fight lobby campaigns threatening citizens’ rights.

(Contribution by Maria Roson, EDRi Intern)

Read more:

Mythbusting – Killing the lobby myths that are polluting the preparation of the e-Privacy Regulation

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)

Cambridge Analytica access to Facebook messages a privacy violation (18.04.2018)




29 May 2018

A Digestible Guide to Individual’s Rights under GDPR

By Guest author

The General Data Protection Regulation went into effect on May 25th and Privacy Policy updates have been flooding inboxes. GDPR enhances everyone’s rights, regardless of nationality, gender, economic status and so on. Unfortunately, the majority of individuals know very little about these rights and GDPR at large. The following guide is part of the GDPRexplained campaign and provides a digestible explanation of individuals’ rights and basic concepts in the EU’s new data protection regulation.

What are my rights under the GDPR?
1. You have the right to information.
  • Companies and organisations are now required to communicate to you, in plain and accessible language, what personal data they process and how they use it. (“Processing” includes anything related to the collection, aggregation, mining or sharing of data.)
  • If a company or organisation builds a profile on you (e.g. from data matched up from different sources), you have the right to know what’s in this profile.
2. You have the right to secure handling.

The GDPR regulates that personal data should be stored and processed securely.

3. You have the right to access the personal data a company/organisation holds on you, at any time.
  • If the data is inaccurate, you can change or complete it.
  • If the data is no longer necessary, you can ask the company/organisation to delete it.
  • If you initially gave the company/organisation more data than was necessary for receiving the service (e.g. for marketing purposes), but no longer want them to have this data, you can ask them to delete it.
4. You have the right to use a service without giving away additional data.

If a company/organisation wants to process personal data that is not strictly necessary for the provision of a particular service (e.g. a transport app that wants access to your phone’s contact list), they need to get your explicit consent to process that data. Note that even if a company believes that certain data is in their interest to process, this does not always mean that it is necessary. If you have already consented to the processing of additional data, you can always withdraw this consent.

5. With automated decisions, you have the right to explanation and human intervention.
  • If a decision has been made about you through automatic mechanisms, you have the right to know how the decision was made (i.e. you are entitled to an explanation of the logic behind the mechanism used).
  • When it comes to automated decision-making, you have a right to human intervention, and the right to contest any decision made.
6. How will these rights be enforced?

Each country will have an independent public Data Protection Authority (DPA) to ensure that companies are in compliance with the regulation. You have the right to lodge a complaint with your DPA or to go to court if you feel that your rights have been violated.

7. Do I need to do anything?

No. It’s up to companies and organisations to make sure that your personal data is protected. There are, however, still decisions you’ll need to make.

  • For new services you want to use: If the company is asking you to give them data, do you really want to agree? (If the service only processes necessary data, they are required to inform you but do not need to ask for special consent to do so. They do, however, need to ask for explicit consent when they want data that’s not necessary).
  • For the services you’re using at the moment: Are you still comfortable with the way the company/ organisation collects, analyses and shares your personal data? If you no longer agree, you can simply say “no”.
  • Finally: if you think your rights are not being upheld, you can decide to report it to your DPA, or even challenge the company in court.
8. Does it mean I can “delete” myself?

Not quite. You can’t delete all your personal data whenever you want to. But you can ask to have your data deleted in a few specific situations – for example, if a company/organisation no longer needs it in order to provide the service you are using, or if you decide to withdraw your consent. However, even in such cases, companies may still have viable reasons to keep your data, for example for tax purposes or to protect themselves from possible future claims.

9. Can I talk to companies about their use of my data?

Absolutely! The GDPR requires that companies and organisations respond to questions about personal data. This includes whether or not they process your personal data in the first place, and if so for what purpose, how long it will be stored, and with whom it is shared. And if you ever change your mind about what you have consented to or accepted, companies and organisations are also required not only to make it easy for you to communicate this choice, but also to act upon it.

10. What can I do if a company is using my personal data against my will?
  • It may be useful to contact the company itself first. Regardless of whether you do that, however, you can also file a complaint with your national Data Protection Authority – even if the company does not have an office in your country. And if you’re not satisfied with the DPA’s decision, you can take the company to court.
  • You can also skip the DPA and go directly to court if you feel your rights have been violated.
  • If as a result of a violation you have suffered material or non-material damage, you can seek financial compensation.
  • Third parties, such as consumer protection agencies, digital rights foundations or other interest groups, could also litigate on behalf of you and others.
11. Why are some companies critical of the GDPR?

Many companies have become used to treating your data as a ‘free resource’ – something they could take without asking permission and exploit for their own financial gain; something they could collect without limit, without protecting it. The GDPR is a powerful tool to force companies to re-evaluate the risks involved – not just to the individuals whose data they process, but also to themselves, in terms of fines and loss of customer trust – and to treat your data with the common-sense care and respect that should really have been in place from the beginning.

12. Does the GDPR apply to the data my employer has on me?

Yes. Your employer, like any other organisation that processes data, has to conform to the GDPR. However each EU member state can adopt more specific rules when it comes to the employment relationship. If you’re interested in this, you should look for more information on your national Data Protection Authority’s website.

13. Does the GDPR apply to US companies?

Yes. As soon as a company monitors or tracks the behaviour of internet users on EU territory, the regulation will kick in – no matter where the company is based.

Read more:

GDPRexplained: a social campaign launched today reminds the new regulation is there to protect our rights (25.05.2018)

Press Release: GDPR: A new philosophy of respect (25.05.2018)

The four year battle for the protection of your data (24.05.2018)


29 May 2018

GDPRexplained Campaign: the new regulation is here to protect our rights

By Panoptykon Foundation

Hundreds of e-mails informing users about changes to companies’ privacy policies were sent out across the EU in the name of the GDPR. Both users and companies are confused by the variety of – sometimes contradictory – explanations and interpretations. The #GDPRexplained / #TimeToDisagree campaign launched by Panoptykon together with European Digital Rights and Bits of Freedom reminds everyone that the GDPR is – above all – a new tool to protect our rights.

The new data protection rules re-emphasise that what we are protecting is living people, not meaningless sets of digits. A person can easily fall victim to wrongdoing concerning personal data. For instance, consumers may be negatively impacted if an insurance company increases a fee, a bank rejects a loan application based on unclear criteria, or an online platform manipulates their political and consumer decisions by streaming a “tailored” newsfeed on their wall, without explaining the logic behind the choice.

The point of the new regulation – to regain control over who knows what about us and what they do with this information – is buried under discussions about how companies are failing to meet their obligations, and under the search for simple yes-or-no answers to particular dilemmas. What really matters, though, is people and their rights.

Have you ever received a call from an unknown company where the person on the other end of the line called you by your first name? The GDPR will make it easier to find out where a company obtained your data and to ask them to erase it. It will challenge the common problem of bullying users into giving their consent for data processing. The fuss around the GDPR alone makes many people think: perhaps I don’t have to agree to all of this? A strong data protection authority and the prospect of real financial sanctions should discourage everyone from taking unnecessary risks associated with violating the rights of their customers.

Our GDPR Explained campaign aims to educate individuals and organisations about the new rights granted to us and the changes to be made when dealing with personal data. We have put together answers to many important questions we have received and built a FAQ for anyone to access.

Visit the campaign at https://gdprexplained.eu.

Read more:

Press Release: GDPR: A new philosophy of respect (25.05.2018)

The four year battle for the protection of your data (24.05.2018)

GDPRexplained: a social campaign launched today reminds the new regulation is there to protect our rights (25.05.2018)

(Contribution by Panoptykon Foundation, EDRi member)


24 May 2018

Press Release: GDPR: A new philosophy of respect


The General Data Protection Regulation (GDPR) goes into effect tomorrow, on 25 May 2018, strengthening and harmonising individuals’ rights with regard to personal data. A much-celebrated success for all privacy advocates, the GDPR is more than just a law.

GDPR is a new philosophy that promotes a culture of trust and security and that enables an environment of Respect-by-Default

said Joe McNamee, Executive Director of European Digital Rights.

The Directive adopted in 1995 was characterised by a tendency towards bureaucratic compliance with little enforcement. The GDPR represents a recalibration of focus, establishing a new balance between companies, people and data. The framework not only protects, but also changes, perceptions of personal data. On the one hand, the GDPR protects individuals from companies and governments abusing their personal data and promotes privacy as a standard. On the other, it gives businesses the chance to develop processes with privacy-by-default in mind, ensuring both individuals’ trust and legal compliance. The GDPR minimises the risk of some companies’ bad behaviour undermining trust in all actors.

The GDPR is capable of setting the highest regional standards for the protection of personal data; once well implemented, we need updated global rules

said Diego Naranjo, Senior Policy Advisor of European Digital Rights.

While not perfect, because no legislation is perfect, the GDPR is probably the best possible outcome in the current political context. We will now have to rely on each EU Member State’s Data Protection Authority (DPA) to do their jobs correctly and on governments to ensure enough resources have been allocated to allow this to happen.

To promote educational efforts around the GDPR, we have developed an online resource to help everyone better understand their new rights and responsibilities: the “GDPR Explained” campaign, which will be launched shortly.

Read more:

The four year battle for the protection of your data (24.05.2018)

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)


24 May 2018

The four year battle for the protection of your data

By Bits of Freedom

In 2012, what would become a four-year process started: the creation of new European data protection rules. The General Data Protection Regulation would replace the existing European Data Protection Directive adopted in 1995 and enhance and harmonise data protection levels across Europe. The result is an influential piece of legislation that touches on the lives of 500 million people and creates the highest regional standard for data protection.

A lobbyist feeding frenzy

With so much at stake, civil society was preparing for strong push-back from companies. But we could never have dreamed just how dead set corporate lobbyists were on undermining citizens’ rights – or the lengths to which they would go to achieve their goals. Former European Commissioner Viviane Reding said it was the most aggressive lobbying campaign she had ever encountered. The European Parliament was flooded with the largest lobby offensive in its political history.

Civil society fights back

The European Digital Rights network worked together and continued to fight back. Among other things, we had to explain that data leaks are dangerous and need to be reported, and that it is not acceptable to track and profile people without their consent. We were up against the combined resources of the largest multinational corporations and data-hungry governments, but we also had two things in our favour: the rapporteur Jan Philipp Albrecht and his team were adamant about safeguarding civil rights, and in 2013 the Snowden revelations made politicians more keen on doing the same. Against all odds, we prevailed!

GDPR isn’t perfect, but it is a way forward

The General Data Protection Regulation that was adopted in 2016, and will be enforced starting May 25th, is far from perfect. As we pointed out in 2015, we did however manage to save “the essential elements of data protection in Europe”, and now have a tool with which to hold companies and governments using your data to account. We are committed to doing just that. We will continue to fight for your privacy, speak out when and where it is necessary and help you do the same.

Read more:

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)

EDRi GDPR document pool

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands)


02 May 2018

Are GDPR certification schemes the next data transfer disaster?

By Foundation for Information Policy Research

The General Data Protection Regulation (GDPR) encourages the establishment of data protection certification mechanisms, “in particular at [EU] level” (Art. 42(1)). But the GDPR also envisages various types of national schemes, and allows for the approval (“accreditation”) of schemes that are only very indirectly linked to the national data protection authority.


On 6 February 2018, the Article 29 Working Party (WP29) adopted Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261). On 16 February, it issued a call asking for comments on these draft guidelines. Why can this seemingly technical issue have major implications, in particular in relation to transfers of personal data to third countries without “adequate” data protection (such as the USA)?

The GDPR stipulates that, in relation to several requirements (consent, data subject rights, etc.), a data protection seal (issued at national or EU level) can be used as “an element by which to demonstrate” the relevant matters. This makes such seals useful and valuable, but still allows the data protection authorities to assess whether a product or service for which a seal has been issued really does conform to the GDPR.

However, in one context this is different: in relation to transfers of personal data to third countries without adequate data protection. Such transfers are in principle prohibited, subject to a limited number of exceptions, including where “appropriate safeguards” are provided by the controller or processor (Art. 46). In this regard, the GDPR stipulates that such appropriate safeguards “may be provided for” inter alia by:
an approved certification mechanism pursuant to Article 42 together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights (Art. 46(2)(f)).

In other words, in relation to transfers of personal data to countries without adequate data protection, certifications are conclusive: they provide, in and by themselves, the required safeguards. Indeed, the article adds that certifications can achieve this “without requiring any specific authorisation from a supervisory authority” (leading sentence to Article 46(2)).

In the highly sensitive context of data transfers, it is therefore crucial that certification schemes will ensure that certifications can and will only be issued in cases in which they really provide cast-iron safeguards, “essentially equivalent” to those provided within the European Union and the European Economic Area (EEA) by the GDPR. Otherwise, the very same problems and challenges will arise as arose in relation to the discredited “Safe Harbor” scheme and the not-much-less contestable (and currently contested) “Privacy Shield”.

Unfortunately, the GDPR does not directly guarantee that certification schemes must be demanding and set high standards. Rather, member states can choose from three types of arrangement: the relevant national data protection authority (DPA) issuing seals; the national DPA accrediting other bodies to issue seals; or leaving it to national accreditation bodies to accredit other bodies to issue seals. In the last case, the seal-issuing bodies are therefore two arm's lengths removed from the DPAs. Moreover, national accreditation bodies normally accredit technical standards bodies, for example, for medical devices or toys – they are unsuited to approve mechanisms supposed to uphold fundamental rights. This could lead to low-standard seal schemes, in particular in countries that have always been lax in terms of data protection rules and enforcement, such as the UK and Ireland.

The only safeguard against the creation of weak certification schemes lies in the criteria for accreditation of certification schemes, applied by the relevant accrediting body (which, as just mentioned, need not be the country’s DPA): those criteria must be approved by the relevant national DPA, subject to the consistency mechanism of the GDPR (which means that ultimately the new European Data Protection Board, created by the GDPR as the successor to the Article 29 Working Party, will have the final say on those criteria). But this is still rather far removed from the actual awarding of certifications.

Surprisingly, the Draft Guidelines on the accreditation of certification bodies, released by the WP29, do not include the very annex that is to contain the accreditation criteria.

To the extent that the WP29 say anything about them, they play them down: the WP29 says that the as-yet-unpublished guidelines in the not-yet-available annex will “not constitute a procedural manual for the accreditation process performed by the national accreditation body or the supervisory authority”, but rather will only “provide […] guidance on structure and methodology and thus a toolbox to the supervisory authorities to identify the additional requirements for accreditation” (p. 12).

As pointed out in a letter to the WP29, “the WP29 Draft Guidelines therefore fail to address the most important issues concerning certification”. The letter calls on the WP29 to:

urgently provide an opinion on the ways in which it can be assured that certification schemes will really only lead to certifications at the highest level, and in particular to ensure that certifications will not be used to undermine the strict regime for transfers of personal data from the EU/EEA to third countries that do not provide “adequate” (that is: “essentially equivalent”) data protection to that provided by the GDPR –

[and to]

urgently move towards the accreditation of (a) pan-EU/EEA certification scheme(s) at the highest level, and adopt a policy that would require controllers and processors involved in cross-border processing operations within the EU/EEA and/or data transfers to third countries without adequate data protection to seek such pan-EU/EEA certifications for such cross-border operations, rather than certifications issued by national schemes.

Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261)

Letter to the Article 29 Working Party

General Data Protection Regulation (GDPR)

(Contribution by Douwe Korff, EDRi member Foundation for Information Policy Research – FIPR, United Kingdom)



02 May 2018

Facebook: Unanswered questions

By Joe McNamee

On 9 April 2018, EDRi received an invitation from Facebook to attend a meeting to discuss the loss of trust in Facebook, following the Cambridge Analytica scandal. The meeting was proposed for 26 April.

It struck us that, if Facebook wanted an honest exchange, it would be happy to answer some of the most obvious outstanding issues.


Encouragingly, Facebook said it would welcome the questions and said that they would still also like the meeting on 26 April.

The questions were sent on 16 April and… we never heard from Facebook again…

Here they are:

1. Facebook’s new policy is based on opt-in for facial recognition being applied to inform Facebook users of their faces appearing on photos uploaded by other users. Does this mean that Facebook will index all facial profiles on any photo uploaded, regardless of any consent by any person depicted? Please answer with “yes” or “no” and explain.

1b. More specifically, will Facebook refrain from analysing any photograph uploaded by any user for biometric data about persons depicted on those photos until it has received an opt-in by every person depicted on those photos? Please answer with “yes” or “no”.

2. You state the following: “Second, we’ll ask people who’ve previously chosen to share their political, religious, and “interested in” information in their profile to check that they want to continue to share it.”

Does the above mean that any of the above data will be deleted if Facebook does not receive an explicit consent to retain it? Please answer with “yes” or “no”.

If “yes”, what will be the cut-off date before Facebook starts deleting such data?

2b. If by “sharing” it is meant that the scope of the discontinuation is limited to sharing with other Facebook users and/or Facebook affiliates, how does Facebook consider that this complies with the requirements of art. 9 GDPR for processing these special categories of data?

3. Privacy International created a new Facebook profile to test default settings. By default, everyone can see your friends list and look you up using the phone number you provided. This is not what proactive privacy protection looks like. How does this protect users by design and by default?

4. According to your notification, a “small number of people who logged into ‘This Is Your Digital Life’ also shared their own News Feed, timeline, posts and messages which may have included posts and messages from you”. Why was this not notified to the appropriate national authorities immediately? Are other apps also able to share / receive messages from me?

5. If a similar situation to the one involving Cambridge Analytica were, despite your efforts, to arise again, who would be responsible, Facebook Inc or Facebook Ireland?

6. Why do privacy settings continue to only focus on what friends can and cannot see? If the recent Facebook scandal has shown one thing, it is that Facebook’s ad policies have far-reaching consequences for users’ privacy. When are you going to treat ad settings as privacy settings?

7. The GDPR includes new provisions on profiling and automated decision-making. How are you going to change your ad targeting practices to be compliant?

8. The Economist recently reported on how difficult it is for Europeans to download their personal data from Facebook, and Mark Zuckerberg’s testimony described your systems as more transparent than they actually are. How and when, if at all, do you plan to address these issues?

9. You claim to offer a way for users to download their data with one click. Can you confirm that the downloaded files contain all the data that Facebook holds on each user?

You claim to offer a single place to control your privacy. This does not seem to include ways to opt out of ad targeting or to avoid being tracked outside Facebook. Will you offer a single place where users can control every privacy aspect of Facebook, even for people who have no Facebook account?

10. The GDPR gives individuals the right to access and verify their profiles, including marketing profiles based on so called derived data (data that were not disclosed by the user but interpreted from his/her behaviour). Is Facebook going to give its users full access to their marketing profiles? Please answer with “yes” or “no” and explain.

11. Speaking about derived data and marketing profiles, does Facebook process for marketing purposes any data that reveal (directly or indirectly) political opinions of its users? Please answer with “yes” or “no” and explain.

12. Do Facebook apps use smartphone microphones in any way, without this being made clear to the user? If this were to happen, would you consider that lawful?

13. Facebook has voluntary agreements with the Swedish intelligence services to share data. How do you reconcile that with the GDPR?

We are expecting Facebook’s answers any day now… maybe not today, maybe not tomorrow, but soon. If not, we’ll always have Cambridge.

(Contribution by Joe McNamee, EDRi)



18 Apr 2018

Fighting for migrants’ data protection rights in the UK

By Guest author

Since 2014, the United Kingdom (UK) government has steadily rolled out policies to make the country a “hostile environment” for migrants, in the words of Prime Minister Theresa May.


This has involved turning various ordinary institutions into border protection agencies. Banks have to collect and supply data to the Home Office (the UK’s interior ministry) on their customers’ immigration status. Landlords are required to check immigration documents before rental. Schools were checking pupils’ nationality and also sharing information with the Home Office, before a boycott campaign put an end to the practice in April 2018. Hospitals, too, must process immigration paperwork before they can deliver any non-urgent treatment. The police, in some regions, are piloting a handheld biometric ID device that instantly gives street officers access to an immigration database.

In the “hostile environment”, migrants are losing the right to live free of pervasive monitoring. They’re also losing the right to basic data protection. This is particularly evident in the case of a data-sharing agreement between the National Health Service (NHS), the Department of Health, and the Home Office. This agreement, established through a Memorandum of Understanding (MoU) in late 2016, without any consultation of professionals or the public, allows immigration enforcement officers to request patient data held by NHS Digital, the database manager for public health in the UK.

The Migrants’ Rights Network (MRN) has been at the forefront of civil society responses to this scheme. MRN, together with Doctors of the World UK, Docs not Cops (a group of professionals resisting the implementation of “hostile environment” measures in the health sector), and civil rights organisation Liberty, argues that sharing data between health services and immigration control officers violates migrants’ fundamental right to patient confidentiality. Such a breach of fundamental privacy rights is all the more worrying given that the Home Office has error margins of 10 per cent in its decisions to target “immigration offenders” – meaning it would routinely request data for the wrong individuals.

Crucially, introducing the possibility that health services might hand over patient data to the Home Office will make many vulnerable migrants afraid to seek care. This is already a reality. During a parliamentary hearing in January 2018, elected representatives heard the tragic story of an undocumented domestic worker who avoided treatment out of fear that she could be deported, and died of otherwise preventable complications.

MRN argues that such a situation dismantles the very principles of public health, starting with duty of care and public trust in health providers. The Home Office and NHS Digital have denied this, and argue that data-sharing for immigration enforcement is “in the public interest.” Yet the only other reason NHS Digital normally supplies confidential patient data to the Home Office is in the case of serious crime, such as child abuse or murder. By putting immigration and serious crime on a similar level, this data-sharing arrangement contributes to the dramatic criminalisation of undocumented existence (already exemplified in everyday language by the expression “illegal migrant”).

The UK Parliament’s Health Committee and the British Medical Association have both asked for data-sharing to stop. The Home Office have responded by saying they need to gather more evidence of the scheme’s impact, which could take more than a year. MRN believe this is unacceptable, as lives are currently at risk. MRN is thus challenging the data-sharing agreement in court. The organisation has obtained permission for judicial review (after appeal), likely to take place during the summer 2018, and is currently raising funds to cover its potential court costs.

MRN’s legal challenge is rooted in a desire to protect public health principles and vulnerable lives, but it also has broader implications for data protection in the UK. It aims to send a clear signal that data rights cannot be stripped on the basis of nationality. This is absolutely crucial at a moment when the UK’s latest data protection law, currently being debated in Parliament, includes an exemption clause for immigration enforcement, which would prevent migrants from exercising their full rights under the EU General Data Protection Regulation (GDPR). MRN thus hopes to set a positive precedent for judicial activism on these matters, and make a strong case for non-discrimination as a pillar of data justice.

Against Borders for Children campaign: We won! DfE are ending the nationality school census!

Crowdjustice fundraiser: Stop data-sharing between the NHS and the Home Office

Making the NHS a ‘hostile environment’ for migrants demeans our country (24.10.2017)

‘Hostile environment’: the hardline Home Office policy tearing families apart (28.11.2017)

NHS accused of breaching doctor-patient confidentiality for helping Home Office target foreigners (09.11.2017)

Migrants’ Rights Network granted permission for judicial review of patient data-sharing agreement between NHS Digital and the Home Office (01.03.2018)

MRN legal challenge against NHS data-sharing deal (29.11.2017)

(Contribution by Fabien Cante, LSE Media & Communications / Migrants’ Rights Network, the United Kingdom)