25 Jul 2018

EU Council considers undermining ePrivacy

By IT-Pol

On 19 October 2017, the European Parliament’s LIBE Committee adopted its report on the ePrivacy Regulation. The amendments improve the original proposal by strengthening confidentiality requirements for electronic communications services and by adding a ban on tracking walls, legally binding signals for giving or refusing consent to online tracking, and privacy by design requirements for web browsers and apps.


Before trialogue negotiations can start, the Council of the European Union (the Member States’ governments) must adopt its general approach. This process is still ongoing with no immediate end in sight. An analysis of the proposed amendments in Council documents so far shows that the Council is planning to significantly weaken the ePrivacy text compared to the Commission proposal and, especially, the LIBE report.

Metadata for electronic communications should be regarded as sensitive personal data, similar to the categories listed in Article 9 of the General Data Protection Regulation (GDPR). Under the ePrivacy Directive (current legal framework), necessary metadata may be processed for purposes of subscriber billing and interconnection payments, and, with consent of the user, for value added services. Apart from data retention requirements in national law, no other processing is allowed. In the ePrivacy Regulation, the Commission proposal and the LIBE text both uphold the principle of only allowing processing of electronic communications metadata for specific purposes laid down in law or with consent of the end-user. As a new specific purpose, processing for monitoring quality of service requirements and maintaining the availability of electronic communications networks can be done without consent.

The Council proposals significantly expand the permitted processing of metadata without consent by the electronic communications service (ECS) provider. The billing/interconnection purpose is extended to include processing when it is necessary “for the performance of a contract to which the end-user is party”. This will allow the ECS provider to process metadata not directly related to billing through provisions in the contract with the end-user. Service offerings by ECS providers are generally moving towards simpler products with increased reliance on flat rate tariffs, which should reduce the processing and storage of metadata necessary for billing purposes. These privacy benefits will be lost with the Council text.

In December 2017, the Council proposed further processing of metadata without consent for scientific research or statistical purposes based on Union or Member State law. Despite the mandatory safeguards, which include encryption and pseudonymisation, this is a very problematic amendment since a potentially large amount of metadata, which would otherwise be deleted or anonymised, will be retained and stored in identifiable form. Data breaches and law enforcement access are two very specific data protection risks created by this amendment.

The latest text from the Austrian Presidency (Council document 10975/18) goes even further than this by proposing a new general provision for further processing of metadata for compatible purposes inspired by Article 6(4) of the GDPR. This comes very close to introducing “legitimate interest” as a legal basis for processing metadata by the ECS provider, something that has previously been ruled out because metadata for electronic communications is comparable to sensitive personal data under the case law of the Court of Justice of the European Union (CJEU). GDPR Article 9 does not permit the processing of sensitive personal data with legitimate interest as the legal basis. In March 2018, the former Bulgarian Presidency specifically noted that it is highly doubtful whether a non-specific provision for permitted processing would, given the sensitive nature of the data involved, be in line with the case-law of the CJEU.

The LIBE Committee adopted amendments to ensure that electronic communications content was protected under the ePrivacy Regulation during transmission and if the content is subsequently stored by the ECS provider. This is important because storage of electronic communications content is an integral part of many modern electronic communications services, such as webmail and messenger services. However, the Council amendments limit the protection under the ePrivacy Regulation to the transmission of the communication, a period which may be a fraction of a second. After the receipt of the message, the processing falls under the GDPR which could allow processing of personal data in electronic communications content (such as scanning email messages) based on legitimate interest rather than consent of the end-user. As suggested by the Council recital, the end-user can avoid this by deleting the message after receipt, but this would entirely defeat the purpose of many modern electronic communications services.

In Article 8 of the draft ePrivacy Regulation, the LIBE Committee adopted a general ban on tracking walls. This refers to the practice of making access to a website dependent on end-user consent to processing of personal data through tracking cookies (or device fingerprinting) that is not necessary for the provision of the website service requested by the end-user. This practice is currently widespread since many websites display cookie consent banners where it is only possible to click ‘accept’ or ‘OK’.

The Council text goes in the opposite direction with proposed wording in a recital which authorises tracking walls, in particular if a payment option is available that does not involve access to the terminal equipment (e.g. tracking cookies). This amounts to a monetisation of fundamental rights, as EU citizens will be forced to decide whether to pay for access to websites with money or by being profiled, tracked and abandoning their fundamental right to protection of personal data. This inherently contradicts the GDPR since consent to processing of personal data can become the counter-performance for access to a website, contrary to the aim of Article 7(4) of the GDPR.

Finally, the latest text from the Austrian Presidency proposes to completely delete Article 10 on privacy settings. Article 10 requires web browsers and other software permitting electronic communications to offer privacy settings which prevent third parties from accessing and storing information in the terminal equipment, and to inform the end-user of these privacy settings when installing the software. An example of this could be an option to block third party cookies in web browsers. Such privacy settings are absolutely critical for preventing leakage of personal data to unwanted third parties and for protecting end-user privacy when consent to tracking is coerced through tracking walls. The recent Cambridge Analytica scandal should remind everyone, including EU Member States’ governments, of the often highly undesirable consequences of data disclosures to unknown third parties.

If Article 10 is deleted, it will be possible to offer software products that are set to track and invade individuals’ confidential communications by design and by default, with no possibility for the individual to change this by selecting a privacy-friendly option that blocks data access by third parties. This goes in the complete opposite direction of the LIBE report, which contains amendments to strengthen the principle of privacy by design by requiring that access by third parties is prevented by default, and that upon installation the end-user is asked to either confirm this or select another, possibly less privacy-friendly, option.

The rationale given by the Austrian Presidency for deleting Article 10 is the burden on software vendors and consent fatigue for end-users. The latter is somewhat ironic since technical solutions, such as genuine privacy by design requirements and innovative ways to give or refuse consent, like a mandatory Do Not Track (DNT) standard, are needed to reduce the number of consent requests in the online environment. The Council amendments for Articles 8 and 10 would aggravate the current situation, where end-users on countless websites are forced to give essentially meaningless consent to tracking because the cookie banner only provides the option of clicking ‘accept’.
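To make the mechanism concrete: a binding DNT-style signal is simply a standard request header that services would be legally required to honour. Below is a minimal sketch, assuming a Python/Flask web service and the “DNT: 1” header from the draft W3C Do Not Track standard; it illustrates the general idea, not any implementation prescribed by the Regulation.

# Minimal sketch of a server honouring the DNT refusal signal.
# Assumes the third-party Flask library; illustrative only.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Browsers send "DNT: 1" when the user has opted out of tracking.
    if request.headers.get("DNT") == "1":
        # Respect the refusal signal: serve the page with no tracking cookie.
        return "Hello - no tracking cookies set."
    # No refusal signal present; under the LIBE text, valid consent would
    # still be required before setting a tracking identifier like this one.
    response = app.make_response("Hello.")
    response.set_cookie("tracking_id", "example", max_age=3600)
    return response

if __name__ == "__main__":
    app.run()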

If the ePrivacy amendments in 10975/18 and earlier Council documents are adopted as the general approach, the Council will enter trialogue negotiations with a position that completely undermines the ePrivacy Regulation by watering down all provisions which provide stronger protection than the GDPR. This would put a lot of pressure on the European Parliament negotiators to defend the privacy rights of European citizens. For telecommunications services, which presently enjoy the strong protection of the ePrivacy Directive, the drop to a lower level of protection will be particularly severe, even before considering the dark horse of mandatory data retention that EU Member States are trying to uphold, in part through amendments to the ePrivacy Regulation.

EDRi, along with EDRi members Access Now, Privacy International and IT-Pol Denmark, has communicated its concerns about the proposed Council amendments through letters to WP TELE, as well as at a civil society meeting with Council representatives on 31 May 2018, organised by the Dutch Permanent Representation and the Bulgarian Council Presidency.

Read more:

e-Privacy: What happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)
https://edri.org/eu-member-states-fight-to-retain-data-retention-in-place-despite-cjeu-rulings/

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)
https://edri.org/eprivacy-civil-society-letter-calls-to-ensure-privacy-and-reject-data-retention/

Civil society calls for protection of communications confidentiality (13.06.2018)
https://edri.org/civil-society-calls-for-protection-of-communications-confidentiality/

Civil society letter to WP TELE on the ePrivacy amendments in Council document 10975/18 (13.07.2018)
https://edri.org/civil-society-calls-for-protection-of-privacy-in-eprivacy/

(Contribution by Jesper Lund, EDRi-member IT-Pol)

27 Jun 2018

NCC publishes a report on tech companies’ use of “dark patterns”

By Maria Roson

Today, the Norwegian Consumer Council (NCC), a consumer group active in the field of digital rights, has published a report on how default settings and “dark patterns” are used by tech companies such as Facebook, Google and Microsoft to nudge users towards privacy-intrusive options.


The term “dark patterns” refers to practices used to deliberately mislead users through exploitative nudging. The NCC describes them as “features of interface design crafted to trick users into doing things that they might not want to do, but which benefit the business in question, or in short, nudges that may be against the user’s own interest”.

The General Data Protection Regulation (GDPR) requires services to be developed according to the principles of data protection by design and data protection by default, and obliges companies to make lawful use of their users’ data. With the entry into application of the GDPR last May, the three companies had to update the conditions of use of their services, which they did by using a wide variety of “dark patterns”. The report focuses on five of these patterns, which overlap with each other and together form the big picture of how companies mislead users into “choosing” invasive options instead of data protection-friendly ones. This is done by putting in place the following mechanisms:

1. Default settings

Facebook and Google hide and obscure the privacy-friendly settings, making the most intrusive options much easier and more visible for the user to accept.

2. Taking users by the hand to mislead them

Usually, the services push users to accept unnecessary data collection through a combination of positioning and visual cues. Facebook and Google go a step further by requiring many more steps to limit data collection, in order to discourage users from protecting themselves.

3. Invasive options go first

All three companies presented the settings that maximise data collection as the positive option, creating doubts in the user’s mind and even ethical dilemmas. The companies do not explain the full consequences of each choice, but frame their messages around the theoretical positive sides of allowing wider data collection, such as an improved user experience.

4. Rewards and punishments

A typical nudging strategy is to use incentives to reward the “right” choice, and punish choices that the service provider deems undesirable. The reward is often described as “extra functionality” or a “better service” (without making clear what this means in practice), while the punishment might be the loss of functionality or the deletion of the account if users decline – which has been the strategy of Facebook and Google.

5. Time pressure

When it came to completing the settings review, all three services put pressure on the user to complete it at a time determined by the service provider. This was done without a clear option for the user to postpone the review, and without making clear whether the user could still use the service in the meantime.

The report concludes that these service providers are merely giving users the “illusion of control”, while nudging them towards the options most desirable for the companies.

Read more:

DECEIVED BY DESIGN: How tech companies use dark patterns to discourage us from exercising our rights to privacy (27.06.2018)
https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf

GDPR: noyb.eu filed four complaints over “forced consent” against Google, Instagram, WhatsApp and Facebook (25.05.2018)
https://noyb.eu/wp-content/uploads/2018/05/pa_forcedconsent_en.pdf

GDPR explained
https://gdprexplained.eu/

(Contribution by Maria Roson, EDRi intern)

13 Jun 2018

ePrivacy for Children: What is Data Protection Culture?

By Alternatif Bilisim

The General Data Protection Regulation (GDPR) attracted widespread attention and comment in recent weeks when it came into force on 25 May 2018. Having taken several years to get from being proposed by the European Commission to entering into force, the GDPR has been designed as a concerted, holistic and unifying effort to regulate personal data protection in the digital age.


At a time when many public, private and third sector organisations have only recently ‘gone digital’ and when data has very rapidly come to be seen as ‘a new currency’, the scope of application of the GDPR is vast. Serious fines can be applied to firms that do not abide by the new rules. This is no coincidence, of course; the recent Cambridge Analytica and Facebook violations of privacy forced the public debate to grow, and with it awareness of what is at stake.

It is not only the scandals on the surface that have piqued the interest of the average user, though; the capital and energy spent on the data-gathering fetish of social media platforms is also a key determinant of the process. The right to erasure is also more easily applicable from now on, signifying more meaningful control over data and the erosion of the post-capitalist surveillance society. However, in the decade of tl;dr (too long; didn’t read) and post-truth, this type of detailed regulation might be a little too complicated to understand for internet users of all ages.

Through the lens of a researcher-mother, one is quickly struck by the image of a hyper-socialised millennial generation on massive platforms like Facebook and Instagram. The GDPR brings special conditions for children’s data. Well, living in Turkey with your child right beside you is no comfort; you are still spending 16+ hours of your day connected to inter-networks.

The GDPR makes some specific requirements in respect of children’s data, for reasons set out in recital 38: “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counseling services offered directly to a child.”

While this statement has much merit, it is only an explanatory recital, guiding implementation of the GDPR but lacking the legal force of an article. In a recent London School of Economics Media Policy Project roundtable, it became clear that there is considerable scope for interpretation, if not confusion, regarding the legal basis for processing (including, crucially, when processing should be based on consent), the definition of an information society service (ISS) and the meaning of the phrase “directly offered to a child” in Article 8 (which specifies a so-called “digital age of consent” for children), the rules on profiling children, how parental consent is to be verified (for children younger than the age of consent), and when and how risk-based impact assessments should be conducted (including how they should cover intended or actual child users). It is also unclear in practice just how children will be enabled to claim their rights or seek redress when their privacy is infringed.

Already there are some surprises. WhatsApp, currently used by 24% of UK 12-15 year olds, announced it will restrict its services to those aged 16+, regardless of the fact that in many countries in Europe the digital age of consent is set at 13. Instagram is now asking its users if they are under or over 18 years old, perhaps because this is the age of majority in the United Nations Convention on the Rights of the Child (UNCRC)? We will see how things will unfold in the coming months.

In the meantime, Sonia Livingstone of the London School of Economics makes a few suggestions in the light of a new project exploring how children themselves understand how their personal data is used and how their data literacy develops between the ages of 11 and 16: (1) conducting focus group research with children; (2) organising child deliberation panels for formulating child-inclusive policy and educational/awareness-raising recommendations; and (3) creating an online toolkit to support and promote children’s digital privacy skills and awareness. The young generation reminds us once again of the responsibility for creating a commons data culture at grassroots level.

Do such changes mean effective age verification will now be introduced (leading to social media collecting even more personal data?), or will the GDPR become an unintended encouragement for children to lie about their age to gain access to beneficial services, as part of their right to participate? How will this protect them better? And what does this increasingly complex landscape mean for media literacy education, given that schools are often expected to overcome regulatory failures by teaching children how to engage with the internet critically? As in the case of Turkey, teachers’ digital literacy skills need a serious and rapid boost and, even more importantly, policies regarding internet governance and community education must be redrafted.

Translated from the original text by Asli Telli Aydemir, Alternative Informatics (Alternatif Bilisim).
You can read the original text in Turkish here.

Read more:

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)
https://edri.org/a-guide-individuals-rights-under-gdpr/

EDRi Executive Director Joe McNamee live interview on TRTWorld
https://www.youtube.com/watch?v=pKvcZz8TKuE

GDPR Explained Campaign
https://gdprexplained.eu

Time to Disagree Campaign
https://timetodisagree.eu

EDRi’s privacy for kids booklet
https://edri.org/privacy-for-kids-digital-defenders/

(Contribution by Alternatif Bilisim, EDRi member, Turkey)

31 May 2018

Xnet: Opposing guarded access to institutional information

By Xnet

In their fight for free access to information and data protection, Spanish EDRi member Xnet contacted the Spanish Data Protection Agency (AEPD). As the AEPD is the institution responsible for the implementation of the General Data Protection Regulation (GDPR) in Spain, Xnet brought up questions about the compliance of the agency’s work with the new regulation.

You can read Xnet’s letter below:

“To whom it may concern,

We have two questions:

1. In order to offer information that should be publicly accessible, you ask for all the personal details of those requesting said information. Does this not clash with article 5 of the GDPR, which states that only the data necessary for performance of the task should be obtained? It is our understanding that for the task of offering information on topics in the public domain, you do not need any data.

As a specific example, in order to ask you this very question, we had to write to you with our electronic certificate, which means you have access to our personal data. Since, as we understand it, your task is to answer these questions for whomever asks, it should have been possible to ask them without you needing to know who we are. Is this not the case? However, you offer no information by email or by telephone, only in response to communications using electronic certificates.

If our understanding is not correct, we would kindly ask you to send us the legal articles that corroborate your interpretation.

2. We do not understand why the Spanish Data Protection Agency, which as previously mentioned is highly demanding with individuals, does not use HTTPS (Hypertext Transfer Protocol Secure) by default in its digital spaces. This leaves the data of those who access your websites vulnerable. We would like to know the reason for this.

Thank you for your attention.

Regards,

Xnet”
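
For context on Xnet’s second question: whether a site actually ends up serving its pages over HTTPS can be verified with a few lines of code. Below is a minimal sketch, assuming the third-party Python requests library and a placeholder URL (not the AEPD’s actual address):

# Minimal sketch: check whether a site redirects plain HTTP to HTTPS.
import requests

def redirects_to_https(url: str) -> bool:
    # Follow any redirects and inspect the final URL that is reached.
    response = requests.get(url, timeout=10, allow_redirects=True)
    return response.url.startswith("https://")

if __name__ == "__main__":
    site = "http://example.org/"  # placeholder domain, for illustration only
    print(f"{site} ends up on HTTPS: {redirects_to_https(site)}")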

(Contribution by Xnet, EDRi member)

Read more:

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)
https://edri.org/a-guide-individuals-rights-under-gdpr/

30 May 2018

Your ePrivacy is nobody else’s business

By Maria Roson

The right to privacy is a fundamental right of every individual, enshrined in international human rights treaties. This right is particularly threatened by political and economic interests, which have a deep impact on freedom of expression, democratic participation and personal security. The recent Facebook-Cambridge Analytica scandal is a perfect example of the risks that privacy breaches pose to individuals’ rights.


Under the excuse of providing customers with “a better service”, companies often ask for unnecessary access to exploit communications data and to track users online. In practice, these “requests” often leave users without any real possibility of refusing, as refusal would mean not being allowed to use the service. This is what EDRi member Bits of Freedom calls “tracking walls”. To protect citizens from this and other abusive practices, EU-level rules have been developed, namely the ePrivacy Directive. This Directive was adopted in 2002 and revised in 2009. Now, a new proposal for an ePrivacy Regulation is on the table.

The protection of the right to privacy online in the ePrivacy Regulation should be at the centre of the EU’s priorities. For this reason, it is important to be aware of the most sensitive issues concerning ePrivacy, to be able to identify when citizens’ rights could be at risk:

Consent

Consent is one of the ways to allow your data to be used legally. Through free and informed consent, the user agrees that a company may access specific personal information for a specific purpose. Consent drives the trust that is needed for new services, but it needs to be meaningful. It must be freely given, specific, informed and explicit – not the only choice that is available. For example, accepting abusive permissions “required” by an app, when the only alternative is not using the app at all, is not a valid way of obtaining consent.

Legitimate interest

“Legitimate interest” means that under exceptional circumstances it would be legal to access personal data without the user’s consent. Communications data – your emails, calls over the internet, chats, and so on – must be treated as sensitive data, as stated by the Court of Justice of the European Union (CJEU). The “legitimate interest” exception allows only the use of non-sensitive data – such as an email address or a telephone number – so communications data cannot, logically and legally, be processed under this exception. For this reason, companies should under no circumstances be allowed to monetise or otherwise exploit sensitive communications without specific permission.

Given that the scope of the ePrivacy Regulation deals with sensitive data, the legitimate interest exception has no place in it. Any such exception would fatally undermine users’ control over such information. Moreover, it would affect freedom of expression, as users would fear having their communications monitored by companies without consent.

Offline tracking

Offline tracking is a highly intrusive practice in which you are tracked through your electronic device. The location of your device can be used for unlawful purposes involving the use of sensitive data, revealing personal information about users, particularly when they are in the vicinity of – or in – various services or institutions. The European Commission has proposed to allow this offline tracking as long as the individual is notified. However, obtaining this information by tracking individual citizens poses severe privacy risks and possibilities for abuse, including the risk of mass surveillance by commercial or law enforcement entities. For these reasons, every update of the ePrivacy rules must consider less intrusive ways to obtain location-based information.

Privacy by design and by default

In the same way that you expect to use a microwave oven without having to think about the risk of starting a fire in your house, your connected devices should protect your privacy by design and by default. Privacy by design is the principle by which a high level of user privacy protection is incorporated in all stages of a device’s creation. Privacy by default means that our devices are set to protect our data, with options to change this if we wish to do so. As the ePrivacy Regulation will be the main framework to protect your communications online, it is important that hardware and software (not only browsers) be designed, at all stages, to protect the privacy of individuals by default, and not as an option.

The ePrivacy Regulation is currently being revised in the Council of the European Union, and there is an aggressive lobbying campaign to influence the Regulation to allow big business to exploit personal data more easily. If this succeeds, the Regulation will become less favourable for protecting citizens and their privacy online – the very purpose of the Regulation. Some of the arguments promoted by the lobbyists are that ePrivacy is bad for democracy and for media pluralism, and that it prevents the fight against illegal content. (None of these arguments is actually linked with protecting privacy.) We have busted these myths, as well as the rest of the most common misconceptions related to ePrivacy. You can read more about it here: edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

Being aware of what is at risk is the best way to fight against lobby campaigns threatening citizens’ rights.

Read more:

Mythbusting – Killing the lobby myths that are polluting the preparation of the e-Privacy Regulation
https://edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)
https://edri.org/eu-member-states-fight-to-retain-data-retention-in-place-despite-cjeu-rulings/

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)
https://edri.org/eprivacy-civil-society-letter-calls-to-ensure-privacy-and-reject-data-retention/

Cambridge Analytica access to Facebook messages a privacy violation (18.04.2018)
https://edri.org/cambridge-analytica-access-to-facebook-messages-a-privacy-violation/

(Contribution by Maria Roson, EDRi Intern)

29 May 2018

A Digestible Guide to Individual’s Rights under GDPR

By Guest author

The General Data Protection Regulation went into effect on May 25th and Privacy Policy updates have been flooding inboxes. GDPR enhances everyone’s rights, regardless of nationality, gender, economic status and so on. Unfortunately, the majority of individuals know very little about these rights and GDPR at large. The following guide is part of the GDPRexplained campaign and provides a digestible explanation of individuals’ rights and basic concepts in the EU’s new data protection regulation.

What are my rights under the GDPR?
1. You have the right to information.
  • Companies and organisations are now required to communicate to you, in plain and accessible language, what personal data they process and how they use it. (“Processing” includes anything related to the collection, aggregation, mining or sharing of data.)
  • If a company or organisation builds a profile on you (e.g. from data matched up from different sources), you have the right to know what’s in this profile.
2. You have the right to secure handling.

The GDPR requires that personal data be stored and processed securely.

3. You have the right to access the personal data a company/organisation holds on you, at any time.
  • If the data is inaccurate, you can change or complete it.
  • If the data is no longer necessary, you can ask the company/organisation to delete it.
  • If you initially gave the company/organisation more data than was necessary for receiving the service (e.g. for marketing purposes), but no longer want them to have this data, you can ask them to delete it.
4. You have the right to use a service without giving away additional data.

If a company/organisation wants to process personal data that is not strictly necessary for the provision of a particular service (e.g. a transport app that wants access to your phone’s contact list), they need to get your explicit consent to process that data. Note that even if a company believes that certain data is in their interest to process, this does not always mean that it is necessary. If you have already consented to the processing of additional data, you can always withdraw this consent.

5. With automated decisions, you have the right to explanation and human intervention.
  • If a decision has been made about you through automatic mechanisms, you have the right to know how the decision was made (i.e. you are entitled to an explanation of the logic behind the mechanism used).
  • When it comes to automated decision-making, you have a right to human intervention, and the right to contest any decision made.
6. How will these rights be enforced?

Each country will have an independent public Data Protection Authority (DPA) to ensure that companies are in compliance with the regulation. You have the right to lodge a complaint with your DPA or to go to court if you feel that your rights have been violated.

7. Do I need to do anything?

No. It’s up to companies and organisations to make sure that your personal data is protected. There are, however, still decisions you’ll need to make.

  • For new services you want to use: If the company is asking you to give them data, do you really want to agree? (If the service only processes necessary data, they are required to inform you but do not need to ask for special consent to do so. They do, however, need to ask for explicit consent when they want data that’s not necessary).
  • For the services you’re using at the moment: Are you still comfortable with the way the company/organisation collects, analyses and shares your personal data? If you no longer agree, you can simply say “no”.
  • Finally: if you think your rights are not being upheld, you can decide to report it to your DPA, or even challenge the company in court.
8. Does it mean I can “delete” myself?

Not quite. You can’t delete all your personal data whenever you want to. But you can ask to have your data deleted in a few specific situations – for example if a company/organisation no longer needs it in order to provide the service you are using, or if you decide to withdraw your consent. However, even in such cases, companies may still have viable reasons to keep your data, for example for tax purposes or to protect themselves from possible future claims.

9. Can I talk to companies about their use of my data?

Absolutely! The GDPR requires that companies and organisations respond to questions about personal data. This includes whether or not they process your personal data in the first place, and if so for what purpose, how long it will be stored, and with whom it is shared. And if you ever change your mind about what you have consented to or accepted, companies and organisations are also required not only to make it easy for you to communicate this choice, but also to act upon it.

10. What can I do if a company is using my personal data against my will?
  • It may be useful to contact the company itself first. Regardless of whether you do that, however, you can also file a complaint with your national Data Protection Authority – even if the company does not have an office in your country. And if you’re not satisfied with the DPA’s decision, you can take the company to court.
  • You can also skip the DPA and go directly to court if you feel your rights have been violated.
  • If as a result of a violation you have suffered material or non-material damage, you can seek financial compensation.
  • Third parties, such as consumer protection agencies, digital rights foundations or other interest groups, could also litigate on behalf of you and others.
11. Why are some companies critical of the GDPR?

Many companies have become used to treating your data as a ‘free resource’ – something they could take without asking permission and exploit for their own financial gain; something they could collect without limit, without protecting it. The GDPR is a powerful tool to force companies to re-evaluate the risks involved – not just to the individuals whose data they process, but also to themselves, in terms of fines and loss of customer trust – and to treat your data with the common-sense care and respect that should really have been in place from the beginning.

12. Does the GDPR apply to the data my employer has on me?

Yes. Your employer, like any other organisation that processes data, has to conform to the GDPR. However, each EU member state can adopt more specific rules when it comes to the employment relationship. If you’re interested in this, you should look for more information on your national Data Protection Authority’s website.

13. Does the GDPR apply to US companies?

Yes. As soon as a company monitors or tracks the behaviour of internet users on EU territory, the regulation will kick in – no matter where the company is based.

Read more:

GDPRexplained: a social campaign launched today reminds the new regulation is there to protect our rights (25.05.2018)
https://en.panoptykon.org/articles/gdprexplained-social-campaign-launched-today-reminds-new-regulation-there-protect-our

Press Release: GDPR: A new philosophy of respect (25.05.2018)
https://edri.org/press-release-gdpr-philosophy-respect/

The four year battle for the protection of your data (24.05.2018)
https://edri.org/four-year-battle-protection-of-your-data-gdpr/

29 May 2018

GDPRexplained Campaign: the new regulation is here to protect our rights

By Panoptykon Foundation

Hundreds of e-mails informing users about changes to companies’ privacy policies were sent out across the EU in the name of the GDPR. Both users and companies are confused by the variety of – sometimes contradictory – explanations and interpretations. The #GDPRexplained / #TimeToDisagree campaign launched by Panoptykon together with European Digital Rights and Bits of Freedom reminds everyone that the GDPR is – above all – a new tool to protect our rights.

The new data protection rules re-emphasise that what we are protecting is living people, not meaningless sets of digits. A person can easily fall victim to wrongdoing involving personal data. For instance, consumers may be negatively impacted if an insurance company increases a fee or a bank rejects a loan application based on unclear criteria, or if an online platform manipulates their political and consumer decisions by streaming a “tailored” newsfeed on their wall, without explaining the logic behind the selection.

The point of the new regulation – to regain control over who knows what about us and what they do with this information – is buried under discussions about how companies are failing to meet their obligations, and under the search for simple yes-or-no answers to particular dilemmas. What really matters, though, is people and their rights.

Have you ever received a call from an unknown company where the person on the other end of the line called you by your first name? The GDPR will make it easier to find out where the company obtained your data and to ask them to erase it. It will challenge the common problem of bullying users into giving consent for data processing. The fuss around the GDPR alone makes many people think: perhaps I don’t have to agree to all of this? A strong data protection authority and the prospect of real financial sanctions should discourage everyone from taking unnecessary risks associated with violating the rights of their customers.

Our GDPR Explained campaign aims at educating individuals and organisations about the new rights granted to us and the changes to be made when dealing with personal data. We have put together answers to many important questions we have received and built a FAQ for anyone to access.

Visit the campaign at https://gdprexplained.eu.

Read more:

Press Release: GDPR: A new philosophy of respect (25.05.2018)
https://edri.org/press-release-gdpr-philosophy-respect/

The four year battle for the protection of your data (24.05.2018)
https://edri.org/four-year-battle-protection-of-your-data-gdpr/

GDPRexplained: a social campaign launched today reminds the new regulation is there to protect our rights (25.05.2018)
https://en.panoptykon.org/articles/gdprexplained-social-campaign-launched-today-reminds-new-regulation-there-protect-our

(Contribution by Panoptykon Foundation, EDRi member)

24 May 2018

Press Release: GDPR: A new philosophy of respect

By EDRi

The General Data Protection Regulation (GDPR) goes into effect tomorrow, on 25 May 2018, strengthening and harmonising individuals’ rights with regard to personal data. A much-celebrated success for all privacy advocates, the GDPR is more than just a law.

GDPR is a new philosophy that promotes a culture of trust and security and that enables an environment of Respect-by-Default

said Joe McNamee, Executive Director of European Digital Rights.

The Directive adopted in 1995 was characterised by a tendency towards bureaucratic compliance with little enforcement. The GDPR represents a recalibration of focus, establishing a new balance between companies, people and data. The framework not only protects personal data, but also changes perceptions of it. On the one hand, the GDPR protects individuals from companies and governments abusing their personal data and promotes privacy as a standard. On the other, it gives businesses the chance to develop processes with privacy-by-default in mind, ensuring in this way both individuals’ trust and legal compliance. The GDPR minimises the risk of some companies’ bad behaviour undermining trust in all actors.

The GDPR is capable of setting the highest regional standards for the protection of personal data; once well implemented, we need updated global rules

said Diego Naranjo, Senior Policy Advisor of European Digital Rights.

While not perfect, because no legislation is perfect, the GDPR is probably the best possible outcome in the current political context. We will now have to rely on each EU Member State’s Data Protection Authority (DPA) to do their jobs correctly and on governments to ensure enough resources have been allocated to allow this to happen.

To promote educational efforts around the GDPR, we have developed an online resource that helps everyone better understand their new rights and responsibilities: the “GDPR Explained” campaign, which will be launched shortly.

Read more:

The four year battle for the protection of your data (24.05.2018)
https://edri.org/four-year-battle-protection-of-your-data-gdpr/

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)
https://edri.org/eu-data-protection-package-lacking-ambition-but-saving-the-basics/

24 May 2018

The four year battle for the protection of your data

By Bits of Freedom

In 2012, what would become a four-year process started: the creation of new European data protection rules. The General Data Protection Regulation would replace the existing European Data Protection Directive adopted in 1995 and enhance and harmonise data protection levels across Europe. The result is an influential piece of legislation that touches on the lives of 500 million people and creates the highest regional standard for data protection.

A lobbyist feeding frenzy

With so much at stake, civil society was preparing for strong push-back from companies. But we could never have dreamed just how dead set corporate lobbyists were on undermining citizens’ rights – or the lengths to which they would go to achieve their goals. Former European Commissioner Viviane Reding said it was the most aggressive lobbying campaign she had ever encountered. The European Parliament was flooded with the largest lobby offensive in its political history.

Civil society fights back

The European Digital Rights network worked together and continued to fight back. Among other things, we had to explain that data leaks are dangerous and need to be reported, and that it is not acceptable to track and profile people without their consent. We were up against the combined resources of the largest multinational corporations and data-hungry governments, but we also had two things in our favour: the rapporteur Jan Philipp Albrecht and his team were adamant about safeguarding civil rights, and in 2013 the Snowden revelations made politicians more keen on doing the same. Against all odds, we prevailed!

GDPR isn’t perfect, but it is a way forward

The General Data Protection Regulation that was adopted in 2016, and will be enforced starting May 25th, is far from perfect. As we pointed out in 2015, we did however manage to save “the essential elements of data protection in Europe”, and now have a tool with which to hold companies and governments using your data to account. We are committed to doing just that. We will continue to fight for your privacy, speak out when and where it is necessary and help you do the same.

Read more:

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)
https://edri.org/eu-data-protection-package-lacking-ambition-but-saving-the-basics/ 

EDRi GDPR document pool
https://edri.org/gdpr-document-pool/

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands)

02 May 2018

Are GDPR certification schemes the next data transfer disaster?

By Foundation for Information Policy Research

The General Data Protection Regulation (GDPR) encourages the establishment of data protection certification mechanisms, “in particular at [EU] level” (Art. 42(1)). But the GDPR also envisages various types of national schemes, and allows for the approval (“accreditation”) of schemes that are only very indirectly linked to the national data protection authority.


On 6 February 2018, the Article 29 Working Party (WP29) adopted Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261). On 16 February, it issued a call asking for comments on these draft guidelines. Why can this seemingly technical issue have major implications, in particular in relation to transfers of personal data to third countries without “adequate” data protection (such as the USA)?

The GDPR stipulates that, in relation to several requirements (consent, data subject rights, etc.), a data protection seal (issued at national or EU level) can be used as “an element by which to demonstrate” the relevant matters. This makes such seals useful and valuable, but still allows the data protection authorities to assess whether a product or service for which a seal has been issued really does conform to the GDPR.

However, in one context this is different: in relation to transfers of personal data to third countries without adequate data protection. Such transfers are in principle prohibited, subject to a limited number of exceptions, including where “appropriate safeguards” are provided by the controller or processor (Art. 46). In this regard, the GDPR stipulates that such appropriate safeguards “may be provided for” inter alia by:
an approved certification mechanism pursuant to Article 42 together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights (Art. 46(2)(f)).

In other words, in relation to transfers of personal data to countries without adequate data protection, certifications are conclusive: they provide, in and by themselves, the required safeguards. Indeed, the article adds that certifications can achieve this “without requiring any specific authorisation from a supervisory authority” (leading sentence to Article 46(2)).

In the highly sensitive context of data transfers, it is therefore crucial that certification schemes will ensure that certifications can and will only be issued in cases in which they really provide cast-iron safeguards, “essentially equivalent” to those provided within the European Union and the European Economic Area (EEA) by the GDPR. Otherwise, the very same problems and challenges will arise as arose in relation to the discredited “Safe Harbor” scheme and the not-much-less contestable (and currently contested) “Privacy Shield”.

Unfortunately, the GDPR does not directly guarantee that certification schemes must be demanding and set high standards. Rather, member states can choose from three types of arrangement: the relevant national data protection authority (DPA) issuing seals; the national DPA accrediting other bodies to issue seals; or leaving it to national accreditation bodies to accredit other bodies to issue seals. In the last case, the seal-issuing bodies are therefore two arms-lengths removed from the DPAs. Moreover, national accreditation bodies normally accredit technical standards bodies, for example, for medical devices or toys – they are unsuited to approve mechanisms supposed to uphold fundamental rights. This could lead to low-standard seal schemes, in particular in countries that have always been lax in terms of data protection rules and enforcement, such as the UK and Ireland.

The only safeguard against the creation of weak certification schemes lies in the criteria for accreditation of certification schemes, applied by the relevant accrediting body (which, as just mentioned, need not be the country’s DPA): those criteria must be approved by the relevant national DPA, subject to the consistency mechanism of the GDPR (which means that ultimately the new European Data Protection Board, created by the GDPR as the successor to the Article 29 Working Party, will have the final say on those criteria). But this is still rather far removed from the actual awarding of certifications.

Surprisingly, the Draft Guidelines on the accreditation of certification bodies, released by the WP29, do not include the very annex that is to contain the accreditation criteria.

To the extent that the WP29 says anything about them, it plays them down: the WP29 says that the as-yet-unpublished guidelines in the not-yet-available annex will “not constitute a procedural manual for the accreditation process performed by the national accreditation body or the supervisory authority”, but rather will only “provide […] guidance on structure and methodology and thus a toolbox to the supervisory authorities to identify the additional requirements for accreditation” (p. 12).

As pointed out in a letter to the WP29, “the WP29 Draft Guidelines therefore fail to address the most important issues concerning certification”. The letter calls on the WP29 to:

urgently provide an opinion on the ways in which it can be assured that certification schemes will really only lead to certifications at the highest level, and in particular to ensure that certifications will not be used to undermine the strict regime for transfers of personal data from the EU/EEA to third countries that do not provide “adequate” (that is: “essentially equivalent”) data protection to that provided by the GDPR –

[and to]

urgently move towards the accreditation of (a) pan-EU/EEA certification scheme(s) at the highest level, and adopt a policy that would require controllers and processors involved in cross-border processing operations within the EU/EEA and/or data transfers to third countries without adequate data protection to seek such pan-EU/EEA certifications for such cross-border operations, rather than certifications issued by national schemes.

Read more:

Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261)
http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=614486

Letter to the Article 29 Working Party
https://edri.org/files/EDRi_comments_on_WP261_re-accreditation.pdf

General Data Protection Regulation (GDPR)
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2016.119.01.0001.01.ENG&toc=OJ:L:2016:119:TOC

(Contribution by Douwe Korff, EDRi member Foundation for Information Policy Research – FIPR, United Kingdom)
