07 Nov 2018

My Data Done Right launched: check your data!

By Bits of Freedom

On 25 October 2018 EDRi member Bits of Freedom launched My Data Done Right – a website that gives you more control over your data. From now on you can easily ask organisations what data they have about you, and ask them to correct, delete or transfer your data.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Use your rights

On 25 May 2018, new privacy rules entered into force in Europe. Based on these rules you have several rights that help you to get more control over your data. However, these rights can only have effect if people can easily exercise them. That is why Bits of Freedom developed My Data Done Right.

Generate and keep track of your requests

With My Data Done Right, you can easily create an access, correction or removal request, or a request to transfer your data. You no longer have to search for contact details in privacy statements: this information has already been collected on the website for more than 1,000 organisations. Nor do you have to draft the request yourself – it is automatically generated based on your input. You only have to send it.
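The mechanic described above is essentially a lookup plus a template fill. A minimal sketch of how such a generator could work, assuming an illustrative organisation database and request template (neither is My Data Done Right's actual data):

```python
from datetime import date

# Illustrative contact database; the real service covers 1,000+ organisations.
ORGANISATIONS = {
    "ExampleCorp": "privacy@examplecorp.example",
}

# Hypothetical template loosely based on GDPR Article 15 (right of access).
TEMPLATE = (
    "To: {email}\n"
    "Subject: GDPR Article 15 access request\n\n"
    "Dear {org},\n\n"
    "Under Article 15 of the GDPR, I request access to all personal data\n"
    "you hold about me, the purposes of the processing, and the recipients\n"
    "with whom the data has been shared.\n\n"
    "Name: {name}\nDate: {date}\n"
)

def generate_access_request(org: str, name: str) -> str:
    """Look up the organisation's contact address and fill in the template."""
    email = ORGANISATIONS[org]
    return TEMPLATE.format(email=email, org=org, name=name,
                           date=date.today().isoformat())

print(generate_access_request("ExampleCorp", "Jane Doe"))
```

The user only supplies their name and the organisation; everything else is pre-collected, which is exactly what removes the friction from exercising the right.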

My Data Done Right also contains a few other useful options. You can receive a reminder about your request by email or in your calendar, so that you don’t forget the request you’ve sent. At the moment, you can generate requests in English and Dutch. Soon there will also be an option to share your experiences with us through a short questionnaire.

Cooperation across Europe

The launch is a starting point for the further development of My Data Done Right. We plan to continue expanding the database with organisations, but also to make My Data Done Right available for all people in the European Union.

Together with other digital rights organisations and volunteers, Bits of Freedom will work on versions of My Data Done Right for other EU countries and grow our database to include many more organisations to which you can address your requests. Do you want to help? Please contact Bits of Freedom!

My Data Done Right
https://www.mydatadoneright.eu/

GDPR Today: Stats, news and tools to make data protection a reality (25.10.2018)
https://edri.org/the-gdpr-today-stats-news-and-tools-to-make-data-protection-a-reality/

Press Release: GDPR: A new philosophy of respect (24.05.2018)
https://edri.org/press-release-gdpr-philosophy-respect/

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)
https://edri.org/a-guide-individuals-rights-under-gdpr/

(Contribution by David Korteweg, EDRi member Bits of Freedom, the Netherlands)

07 Nov 2018

Facebook fails political ads tests several times

By Chloé Berthélémy

On 28 June 2018, Facebook announced that it had introduced a compulsory “Paid for by” feature for political ads, limiting anonymity by requiring advertisers to submit a valid ID and proof of residence. The measure was a reaction to a series of election interference incidents over the past year carried out through foreign political advertising on social media platforms. The tool was supposed to help reduce “bad ads” and fight election manipulation and online disinformation. Since then, several experiments have been conducted to see whether ad manipulation is still possible on Facebook. Facebook failed all of these tests.

On 25 and 30 October 2018, VICE News – an online media service publishing news documentaries and reports, owned by Vice Media, a North American digital media and broadcasting company – revealed that it had conducted experiments on Facebook’s advertisement system. VICE News reported that it had managed to pose as US Vice President Mike Pence, US Democratic National Committee Chairman Tom Perez, the Islamic State and 100 US senators when posting and sponsoring political ads on Facebook. The poor quality of Facebook’s screening is also illustrated by the absurdity of the pages approved to share the ads, such as “Cookies for Political Transparency” and “Ninja Turtles PAC”. As VICE News explained, tricking the system was neither difficult nor required any specific knowledge or skills.

On 31 October 2018, Business Insider revealed that it had also managed to set up a fake NGO page and run ads “paid for by Cambridge Analytica”.

VICE News and Business Insider’s investigations show that anyone can lie about who sponsors a political ad. As a joint civil society report published by EDRi, Access Now and the Civil Liberties Union for Europe shows, unless the deeper problem of the business model that feeds these concerns is addressed, neither online disinformation nor the underlying issue of online manipulation will be resolved.

Informing the “disinformation” debate by EDRi, Access Now and Civil Liberties (18.10.2018)
https://edri.org/files/online_disinformation.pdf

Facebook’s political ad tool let us buy ads “paid for” by Mike Pence and ISIS by William Turton, VICE News (25.10.2018)
https://news.vice.com/en_us/article/wj9mny/facebooks-political-ad-tool-let-us-buy-ads-paid-for-by-mike-pence-and-isis

We posed as 100 Senators to run ads on Facebook. Facebook approved all of them by William Turton, VICE News (30.10.2018)
https://news.vice.com/en_us/article/xw9n3q/we-posed-as-100-senators-to-run-ads-on-facebook-facebook-approved-all-of-them

We ran 2 fake ads pretending to be Cambridge Analytica — and Facebook failed to catch that they were frauds by Shona Ghosh, Business Insider (31.10.2018)
https://www.businessinsider.nl/facebook-approved-political-ads-paid-for-by-cambridge-analytica-2018-10/

(Contribution by Chloé Berthélémy, EDRi intern)

25 Oct 2018

The GDPR Today – Stats, news and tools to make data protection a reality

By EDRi

25 October 2018 marks the launch of GDPR Today – your online hub for staying up-to-date with the (real) life of the new EU data protection law, the General Data Protection Regulation (GDPR). The project will monitor the implementation of the law across Europe by publishing statistics and sharing relevant news around key subjects.

GDPR Today, led by several EDRi member organisations, aims to complement our association’s past support for the data protection reform.

Katarzyna Szymielewicz, vice-president of EDRi and co-founder and president of Panoptykon Foundation

The initiative will prioritise building knowledge around legal guidelines and decisions, data breaches, new codes of conduct, tools facilitating individuals’ exercise of rights, important business developments and governmental support for data protection authorities. The GDPR Today is an instrument aimed at data protection experts, activists, journalists, lawyers, and anyone interested in the protection of personal data.

Our goal with GDPR Today is to present facts to the public on the implementation of the law, so that those interested can follow how the GDPR is both shaping the EU digital market and helping people regain control over their personal data.

Estelle Massé, Senior Policy Analyst and Global Data Protection Lead at Access Now

The GDPR has so far often been portrayed as a burden: the focus has been on so-called non-functional elements, which remain untested and have often created misunderstanding around the functional ones. GDPR Today will put facts about the implementation of the law at the centre of the debate.

Read the first edition of the GDPR Today here: https://www.gdprtoday.org/

24 Oct 2018

Council continues limbo dance with the ePrivacy standards

By Yannic Blaschke

It has been 652 days since the European Commission launched its proposal for an ePrivacy Regulation. The European Parliament took a strong stance on the proposal when it adopted its position a year ago, but the Council of the European Union is still only taking baby steps towards finding its own.

With its latest proposal, the Austrian Presidency of the Council unfortunately continues the trend of presenting the Council with suggestions that lower the privacy protections proposed by the Commission and strengthened by the Parliament. The latest working document, published on 19 October 2018, makes it apparent that we are far from having reached the bottom of what the Council considers acceptable in treating our personal data as a commodity.

Probably the gravest change to the text is that it allows tracking technologies to be stored on an individual’s computer without consent by websites that partly or wholly finance themselves through advertising, provided they have informed the user of the existence and use of such processing and the user “has accepted this use” (Recital 21). Such “acceptance” of identifiers is far from the informed consent that the General Data Protection Regulation (GDPR) established as a standard in the EU. The Austrian Presidency text would put cookies that are necessary for regular use (such as language preferences or the contents of a shopping basket) on the same level as the highly invasive tracking technologies pushed by the Google/Facebook duopoly in the current commercial surveillance framework. This opens a Pandora’s box of ever more sharing, merging and reselling of citizens’ data in huge online commercial surveillance networks, and of micro-targeting people with commercial and political manipulation, without the knowledge of the person whose private information is being shared with a large number of unknown third parties.

One of the great added values of the ePrivacy Regulation (which was originally intended to enter into force at the same time as the GDPR) is that it is supposed to raise the bar for companies and other actors who want to track citizens’ behaviour on the internet by placing tracking technologies on users’ computers. Currently, such an accumulation of potentially highly sensitive data about an individual mostly happens without the individual’s real knowledge, often through coerced (not freely given) consent, and the data is shared and resold extensively within opaque advertising networks and data-broker services. In a strong and future-proof ePrivacy Regulation, the collection and processing of such behavioural data therefore needs to be tightly regulated and based on the informed consent of the individual – an approach that is now increasingly jeopardised as the Council appears ever more favourable to tracking technologies.

The detrimental change to Recital 21 is only one of the bad ideas through which the Austrian Presidency seeks to strike a consensus. There is, in addition, the undermining of the safeguards on “compatible further processing” (itself a bad idea introduced by the Council) in Article 6 2aa (c), and the watering down of the requirements for regulatory authorities in Article 18, which creates significant friction with the GDPR. With one disappointing “compromise” after another, the ePrivacy Regulation is increasingly in danger of falling short of its ambition to end the unwanted stalking of individuals on the internet.

EDRi will continue to observe the development of the legislation closely, and calls on everyone in favour of a solid EU privacy regime that protects citizens’ rights and competition to voice their demands to their Member States.

Five Reasons to be concerned about the Council ePrivacy draft (26.09.2018)
https://edri.org/five-reasons-to-be-concerned-about-the-council-eprivacy-draft/

EU Council considers undermining ePrivacy (25.07.2018)
https://edri.org/eu-council-considers-undermining-eprivacy/

Your ePrivacy is nobody else’s business (30.05.2018)
https://edri.org/your-eprivacy-is-nobody-elses-business/

e-Privacy revision: Document pool (10.01.2017)
https://edri.org/eprivacy-directive-document-pool/

(Contribution by Yannic Blaschke, EDRi intern)

24 Oct 2018

ePrivacy: Public benefit or private surveillance?

By Yannic Blaschke

92 weeks after the proposal was published, the EU is still waiting for an ePrivacy Regulation. The Regulation is supposed to replace the current ePrivacy Directive, aligning it with the General Data Protection Regulation (GDPR).

While the GDPR regulates the ways in which personal data is processed in general, the ePrivacy Regulation specifically regulates the protection of privacy and confidentiality of electronic communications. The data in question not only includes the content and the “metadata” (data on when, where and to whom a person communicated) of communications, but also other identifiers such as “cookies” that are stored on users’ computers. To make the legislation fit for its purpose in regard to technological developments, the European Commission (EC) proposal addresses some of the major changes in communications of the last decade, including the use of so-called “over the top” services, such as WhatsApp and Viber.

The Regulation is currently facing heavy resistance from certain sectors of the publishing and behavioural advertising industry. After an improved text was adopted by the European Parliament (EP), it is now being delayed at the Council of the European Union level, where EU Member States are negotiating the text.

One of the major obstacles in the negotiations is the question of the extent to which providers such as telecommunications companies can use metadata for purposes other than the original service. Some private companies – the same ones that questioned the need for user consent in the GDPR – have now re-wrapped their argument, claiming that an “overreliance” on consent would substantially hamper future technologies. Over-reliance on anything is, by definition, not good, as is under-reliance, but such sophistry is a mainstay of lobby language.

However, this lobby attack omits the fact that compatible further processing would not lead only to benign applications in the public interest. Since the proposal does not limit further processing to statistical or research purposes, it could just as well be used for commercial ends, such as commercial or political manipulation. Even with regard to the potentially more benevolent applications of AI, it should be kept in mind that automated data processing has in some cases been shown to be highly detrimental to parts of society, especially vulnerable groups. This should not be ignored when evaluating the safety and privacy of aggregate data. For instance, while using location data for “smart cities” can make sense in some narrowly defined circumstances, such as traffic control or natural disaster management, it takes on a much more chilling undertone when it leads, for example, to racial discrimination in company delivery services or law enforcement activities. It is easy to imagine that metadata, one of the most revealing and most easily processed forms of personal data, could be used for equally crude or misaligned applications, with highly negative outcomes for vulnerable groups. Moreover, where aggregate, pseudonymised data produces adverse outcomes for an individual, not even rectification or deletion of the person’s data will lead to an improvement, as long as the accumulated data of similar individuals is still available.

Another pitfall of this supposedly private, ostensibly pseudonymised way of processing is that even if individual users are not targeted, companies may need to retain citizens’ metadata in identifiable form in order to link existing data sets with new ones. This could essentially amount to a form of voluntary data retention, which might soon attract the interest of public security actors rapaciously seeking new data sources and new powers. If such access were granted, individuals would essentially be identifiable. Even retaining “only” aggregate data on certain societal groups or minorities might often be enough to spark discriminatory treatment.

Although the Austrian Presidency of the Council of the European Union did include in its most recent draft compromise some noteworthy safeguards for compatible further processing – most notably the obligation to consult the national Supervisory Authority or to conduct a data protection impact assessment – the current proposal does not adequately empower individuals. Given that the interpretation of what constitutes “compatible” further processing may vary significantly among Member States (which would lead to years of litigation), it should be up to citizens to decide (and up to industry to prove) which forms of metadata processing are safe, fair and beneficial to society.

Five Reasons to be concerned about the Council ePrivacy draft (26.09.2018)
https://edri.org/five-reasons-to-be-concerned-about-the-council-eprivacy-draft/

EU Council considers undermining ePrivacy (25.07.2018)
https://edri.org/eu-council-considers-undermining-eprivacy/

Your ePrivacy is nobody else’s business (30.05.2018)
https://edri.org/your-eprivacy-is-nobody-elses-business/

e-Privacy revision: Document pool (10.01.2017)
https://edri.org/eprivacy-directive-document-pool/

(Contribution by Yannic Blaschke, EDRi intern)

10 Oct 2018

The Facebook breach – a GDPR test-case

By Yannic Blaschke

On 28 September, Facebook notified the Irish Data Protection Commissioner (DPC) of a massive data breach affecting more than 50 million of its users. The hack of the “view as” feature, which allowed users to see their profile from the perspective of an external visitor or friend, exploited an interaction of several bugs on Facebook and allowed the intruders to acquire so-called “access tokens”. With these tokens, the attackers had access to personal data from the affected accounts, potentially including personal messages.

The incident is a highly salient test-case for the application of the General Data Protection Regulation (GDPR) in practice, specifically for:

1) Notification and provision of information: Under Article 33 of the GDPR, an entity facing a breach must, “where feasible”, notify the relevant data protection authority (DPA) within 72 hours. As the vulnerability was discovered on 26 September, Facebook complied with this provision, unlike other companies (Uber among them) in the past. However, the information Facebook has provided so far seems to cover only the very basics of what the GDPR requires. The Irish DPC publicly urged the company to submit more details so that the authorities could properly assess the nature of the breach and the risk to users. Article 34 of the GDPR further requires that individuals whose personal data might have been compromised in the breach be notified without undue delay of the incident and of the counter-measures taken so far. Facebook implemented this by displaying a message in the feed of the affected accounts. The information provided included an initial overview of the “view as” weakness, as well as statements that the function had been turned off and that accounts that had used it since July 2017 had had their access tokens removed, requiring a new login.

2) Sanctions: The GDPR allows for sanctions against the entity that suffered the breach, depending on the sensitivity of the compromised information and the degree to which appropriate safeguards were not implemented. Since approximately five million of the affected users are from the EU, Facebook could be liable for a fine of 1.63 billion US dollars if negligence were found. As the exact nature of the breach is still being investigated by the Irish DPC, it remains unclear to what extent the hacking was a result of negligence. In any case, the investigation might bring further clarification on how responsibility for the security of processing is allocated in practice, and how strictly infringements of this obligation are sanctioned. Cases like this thus offer other companies processing users’ personal data an opportunity to learn in more detail about their security obligations under the GDPR, and provide them with examples of how to respond to a data breach. For users, the investigation also serves an important purpose: it shows them whether the security of their data is actually taken seriously. If it is not, and they suffer adverse effects as a result, they can demand compensation – and since the Irish implementation of the GDPR allows for collective redress, they could even be represented by civil society in court. On the other hand, the incident also underlines that, even if Facebook did not act carelessly, caution about uploading personal data is always advised, as the absolute safety of personal information is never certain.
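The 72-hour window of Article 33, discussed under point 1 above, can be checked mechanically. A minimal sketch using the dates reported in this article (discovery on 26 September, notification on 28 September 2018; the times of day are assumptions):

```python
from datetime import datetime, timedelta

def notification_deadline(discovered_at: datetime) -> datetime:
    """Article 33 GDPR: the supervisory authority must be notified
    within 72 hours of becoming aware of the breach, where feasible."""
    return discovered_at + timedelta(hours=72)

discovered = datetime(2018, 9, 26, 12, 0)  # vulnerability discovered (time assumed)
notified = datetime(2018, 9, 28, 12, 0)    # Irish DPC notified (time assumed)

deadline = notification_deadline(discovered)
print(deadline)                # 2018-09-29 12:00:00
print(notified <= deadline)    # True: within the 72-hour window
```

With roughly two days between discovery and notification, Facebook stayed within the window that Uber, in its 2016 breach, famously did not.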

This data breach is yet another example of the importance of the secure and confidential storage of personal data on the internet. While the news shows that the GDPR successfully obliged Facebook to communicate about its breach in a more comprehensive and timely manner than other big tech companies have previously done, it is now of the utmost importance to follow up on the incident with an in-depth investigation: users’ rights under the GDPR should be fully and effectively enforced by the Irish DPC.

A Digestible Guide to Individual’s Rights under GDPR (29.5.2018)
https://edri.org/a-guide-individuals-rights-under-gdpr/

GDPRexplained Campaign: the new regulation is here to protect our rights (29.5.2018)
https://edri.org/gdprexplained-gdpr-new-regulation-protect-our-rights/

General Data Protection Regulation: Document pool (25.6.2015)
https://edri.org/gdpr-document-pool/

Your ePrivacy is nobody else’s business (30.5.2018)
https://edri.org/your-eprivacy-is-nobody-elses-business/

Cambridge Analytica access to Facebook messages a privacy violation (18.4.2018)
https://edri.org/cambridge-analytica-access-to-facebook-messages-a-privacy-violation/

(Contribution by Yannic Blaschke, EDRi intern)

26 Sep 2018

Five reasons to be concerned about the Council ePrivacy draft

By IT-Pol

On 19 October 2017, the European Parliament’s LIBE Committee adopted its report on the ePrivacy Regulation. The amendments improve the original proposal by strengthening confidentiality requirements for electronic communication services, and include a ban on tracking walls, legally binding signals for giving or refusing consent to online tracking, and privacy by design requirements for web browsers and apps. Before trilogue negotiations can start, the Council of the European Union (the Member States’ governments) must adopt its “general approach”. The Council Presidency, currently held by Austria, is tasked with securing a compromise among the Member States. This article analyses the most recent draft text from the Austrian Council Presidency (document 12336/18).

Further processing of electronic communications metadata

The current ePrivacy Directive only allows processing of electronic communications metadata for specific purposes given in the Directive, such as billing. The draft Council ePrivacy text in Article 6(2a) introduces further processing for compatible purposes similar to Article 6(4) of the General Data Protection Regulation (GDPR). This further processing must be based on pseudonymous data, profiling individual users is not allowed, and the Data Protection Authority must be consulted.

Despite these safeguards, this new element represents a huge departure from the current ePrivacy Directive, since the electronic communications service provider will determine what constitutes a compatible purpose. The proposal comes very close to introducing a “legitimate interest” loophole as a legal basis for processing sensitive electronic communications metadata. Formally, the further processing must be subject to the original legal basis, but what this means in the ePrivacy context is not entirely clear, since the main legal basis is a specific provision in the Regulation, such as processing for billing, calculating interconnection payments, or maintaining or restoring the security of electronic communications networks.

An example of further processing could be tracking mobile phone users for “smart city” applications such as traffic planning or monitoring travel patterns of tourists via their mobile phone. Even though the purpose of the processing must be obtaining aggregate information, and not targeting individual users, metadata will still be retained for the individual users in identifiable form in order to link existing data records with new data records (using a persistent pseudonymous identifier). Therefore, it becomes a form of voluntary data retention. The mandatory safeguard of pseudonymisation does not prevent the electronic communications service provider from subsequently identifying individual users if law enforcement authorities obtain a court order for access to retained data on individual users.
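The re-identification risk described above follows directly from how a persistent pseudonymous identifier is typically derived: a keyed hash of the subscriber identifier. A toy sketch (the key and phone numbers are invented) showing both that the pseudonym links records across data sets, and that whoever holds the key can reverse the mapping simply by recomputing it over the subscriber list:

```python
import hashlib
import hmac
from typing import List, Optional

SECRET_KEY = b"provider-secret"  # held by the service provider (illustrative)

def pseudonymise(subscriber_id: str) -> str:
    """Persistent pseudonym: the same input always yields the same output,
    so records from different data sets can be linked."""
    return hmac.new(SECRET_KEY, subscriber_id.encode(),
                    hashlib.sha256).hexdigest()[:16]

def reidentify(pseudonym: str, subscribers: List[str]) -> Optional[str]:
    """Whoever holds the key can recover the subscriber behind a pseudonym
    by recomputing the pseudonym for every known subscriber."""
    for s in subscribers:
        if pseudonymise(s) == pseudonym:
            return s
    return None

# Linking: the same subscriber gets the same pseudonym in every data set.
assert pseudonymise("+31-6-12345678") == pseudonymise("+31-6-12345678")

# Re-identification, e.g. after a court order grants access to the key.
target = pseudonymise("+31-6-12345678")
print(reidentify(target, ["+31-6-11111111", "+31-6-12345678"]))
```

The pseudonymisation safeguard therefore protects against outsiders, not against the provider itself or anyone who can compel the provider to cooperate.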

Communications data only protected in transit

Whereas the text adopted by the European Parliament specifically amends the Commission proposal to ensure that electronic communications data is protected under the ePrivacy Regulation after it has been received, the Council text clarifies that the protection only applies in transit. After the communication has been received by the end-user, the GDPR applies, which gives the service provider much greater flexibility in processing the electronic communication data for other purposes. For a number of modern electronic communications services, storage of electronic communication data on a central server (instead of on the end-user device) is an integral part of the service. An example is the transition from SMS (messages are stored on the phone) to modern messenger services such as WhatsApp or Facebook Messenger (stored on a central server). This makes it important that the protection under the ePrivacy Regulation applies to electronic communications data after it has been received. The Council text fails to address this urgent need.

Tracking walls

The European Parliament introduced a ban on tracking walls, that is, the practice of denying users access to a website unless they consent to the processing of personal data via cookies (typically tracking for targeted advertising) that is not necessary for providing the requested service.

The Council text goes in the opposite direction by specifically allowing tracking walls in Recital 20 for websites whose content is provided without monetary payment, if the visitor is presented with an alternative option without this processing (tracking), such as a subscription to an online news publication. The net effect is that personal data becomes a commodity that can be traded for access to online news media or other online services. On the issue of tracking walls and coerced consent, the Council ePrivacy text may actually provide a lower level of protection than Article 7(4) of the GDPR, which specifically seeks to prevent personal data from becoming the counter-performance for a contract. This is contrary to the stated aim of the ePrivacy Regulation.

Privacy settings and privacy by design

The Commission proposal requires web browsers to offer the option of preventing third parties from storing information in the browser (terminal equipment) or processing information already stored there. An example of this would be an option to block third-party cookies. The Council text proposes to delete Article 10 on privacy settings. The effect is that fewer users will become aware of privacy settings that protect them from leaking information about their online behaviour to third parties, and that software may be placed on the market that does not even offer users the possibility of blocking data leakage to third parties.
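What “preventing third parties from storing information” amounts to can be illustrated at the cookie level: a browser honouring such a setting refuses cookies whose domain differs from the site the user actually visits. A toy sketch (domains invented; real browsers use more elaborate rules, e.g. the public suffix list):

```python
def is_third_party(cookie_domain: str, page_domain: str) -> bool:
    """A cookie is third-party when its domain is neither the visited
    site itself nor a subdomain of it."""
    return not (cookie_domain == page_domain
                or cookie_domain.endswith("." + page_domain))

# A browser honouring the Article 10 setting would refuse to store this one:
print(is_third_party("tracker.example", "news.example"))   # True
# ...while keeping first-party cookies such as a login session:
print(is_third_party("news.example", "news.example"))      # False
```

Deleting Article 10 removes the obligation to surface exactly this kind of choice to the user.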

Data retention

Article 15(1) of the current ePrivacy Directive allows Member States to require data retention in national law. Under the case law of the Court of Justice of the European Union (CJEU) in Digital Rights Ireland (joined cases C-293/12 and C-594/12) and Tele2 (joined cases C-203/15 and C-698/15), this data retention must be targeted rather than general and undifferentiated (blanket data retention). In the Commission proposal for the ePrivacy Regulation, Article 11 on restrictions is very similar to Article 15(1) of the current Directive.

In the Council text, Article 2(2)(aa) excludes activities concerning national security and defence from the scope of the ePrivacy Regulation. This includes processing performed by electronic communications service providers when assisting competent authorities in relation to national security or defence, for example retaining metadata (or even communications content) that would otherwise be erased or not generated in the first place. The effect of this is that data retention for national security purposes would be entirely outside the scope of the ePrivacy Regulation and, potentially, the case law of the CJEU on data retention. This circumvents a key part of the Tele2 ruling where the CJEU notes (para 73) that the protection under the ePrivacy Directive would be deprived of its purpose if certain restrictions on the rights to confidentiality of communication and data protection are excluded from the scope of the Directive.

If data retention (or any other processing) for national security purposes is outside the scope of the ePrivacy Regulation, it is unclear whether such data retention is instead subject to the GDPR, and must satisfy the conditions of GDPR Article 23 (which is very similar to Article 11 of the proposed ePrivacy Regulation), or whether it is completely outside the scope of EU law. The Council text would therefore create substantial legal uncertainty for data retention in Member States’ national law, undoubtedly to the detriment of the fundamental rights of many European citizens.

Proposal for a Regulation concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC – Examination of the Presidency text (20.09.2018)
www.parlament.gv.at/PAKT/EU/XXVI/EU/03/55/EU_35516/imfname_10840532.pdf

e-Privacy: What happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)
https://edri.org/eu-member-states-fight-to-retain-data-retention-in-place-despite-cjeu-rulings/

EU Council considers undermining ePrivacy (25.07.2018)
https://edri.org/eu-council-considers-undermining-eprivacy/

Civil society letter to WP TELE on the ePrivacy Regulation (24.09.2018)
https://edri.org/files/eprivacy/201809-LettertoCouncil_FINAL.pdf

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

12 Sep 2018

How the online tracking industry “informs” policy makers

By Yannic Blaschke

Following the entry into force of the General Data Protection Regulation (GDPR), the online advertising industry’s lobbying efforts moved to undermining the ePrivacy Regulation proposal. The Regulation, building on the GDPR, is designed to provide more specific provisions related to privacy and confidentiality of communications in the context of e-communications.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

For example, the ePrivacy Regulation will regulate the way in which online tracking companies operate and how the privacy of individuals can be further protected. In this context, lobbying groups for the booming online stalking industry are doing everything they can to label the protection of citizens’ privacy rights as harmful to the digital economy. As recent evidence shows, these efforts do not even stop at providing European Union policy makers with information that appears designed to mislead.

In a Euractiv.com op-ed, Dr. Johnny Ryan (Chief Policy & Industry Relations Officer at Brave Software) explained that “research” circulated by lobby group IAB Europe was dubious, at best. The “research” misrepresented the revenues European publishers collect from behavioural advertising by including the advertising revenues of Google and Facebook – two powerful members of IAB (Google is a direct member, Facebook participates through its subsidiary Atlas) who, in relation to this activity, are clearly not “publishers” in the sense of traditional news outlets.

In this context, it is all the more misleading that the research report spread by the IAB in September 2017 lumps tech giants and media outlets together into the category of ‘publishers’. In an earlier position paper, the IAB stated that the proposed ePrivacy Regulation would “derail European digital media outlets by significantly undermining their ability to generate enough revenue to create and provide free online content and services”. However, as Dr. Ryan reports, only a fraction of the claimed 10.6 billion euro in revenue that European publishers allegedly made from behavioural advertising in 2016 actually goes to journalists and creative content providers.

Actively confusing the revenue of these actual publishers with the vast sums harvested by Google and Facebook through stalking online browsing behaviour (and, we have since learned, tracking people’s location offline as well) appears more than a little cynical. It is also a critical omission of information that reflects badly on the IAB’s respect for the oath to provide complete and non-misleading information, which it made as part of its registration in the EU Transparency Register. While the main advocate for companies whose aim is to monitor European citizens’ every step on the internet has shown a flexible attitude towards factual reporting in the past, this incident reaches a new level of flexibility with the truth.

EU parliamentarians and EU Member States need to question the supposed ‘economic value’ of ubiquitous monitoring of their voters. Moreover, the evidence should serve as a warning to the Austrian Council Presidency, which has pledged to “ensure strong privacy protection in electronic communications while also taking into account development opportunities for innovative services”. As has been demonstrated, the alleged ‘development opportunities’ of behavioural advertising in the EU mainly benefit the advertising duopoly. Will the Austrian Presidency live up to its motto of a “Europe that protects” by supporting a strong ePrivacy regime?

Read more:

ePrivacy: Over-regulation or opportunity? (07.09.2018)
https://www.euractiv.com/section/digital/opinion/eprivacy-over-regulation-or-opportunity/

EU Council considers undermining ePrivacy (25.07.2018)
https://edri.org/eu-council-considers-undermining-eprivacy/

Your ePrivacy is nobody else’s business (30.05.2018)
https://edri.org/your-eprivacy-is-nobody-elses-business/

Five things the online tracking industry gets wrong (13.09.2017)
https://edri.org/five-things-the-online-tracking-industry-gets-wrong/

Massive lobby against personal communications security has started (27.07.2016)
https://edri.org/massive-lobby-personal-communications-security-started/

(Contribution by Yannic Blaschke, EDRi intern)

25 Jul 2018

EU Council considers undermining ePrivacy

By IT-Pol

On 19 October 2017, the European Parliament’s LIBE Committee adopted its report on the ePrivacy Regulation. The amendments improve the original proposal by strengthening confidentiality requirements for electronic communication services, adding a ban on tracking walls, introducing legally binding signals for giving or refusing consent to online tracking, and setting privacy by design requirements for web browsers and apps.

Before trialogue negotiations can start, the Council of the European Union (the Member States’ governments) must adopt its general approach. This process is still ongoing with no immediate end in sight. An analysis of the proposed amendments in Council documents so far shows that the Council is planning to significantly weaken the ePrivacy text compared to the Commission proposal and, especially, the LIBE report.

Metadata for electronic communications should be regarded as sensitive personal data, similar to the categories listed in Article 9 of the General Data Protection Regulation (GDPR). Under the ePrivacy Directive (current legal framework), necessary metadata may be processed for purposes of subscriber billing and interconnection payments, and, with consent of the user, for value added services. Apart from data retention requirements in national law, no other processing is allowed. In the ePrivacy Regulation, the Commission proposal and the LIBE text both uphold the principle of only allowing processing of electronic communications metadata for specific purposes laid down in law or with consent of the end-user. As a new specific purpose, processing for monitoring quality of service requirements and maintaining the availability of electronic communications networks can be done without consent.

The Council proposals significantly expand the permitted processing of metadata without consent by the electronic communications service (ECS) provider. The billing/interconnection purpose is extended to include processing when it is necessary “for the performance of a contract to which the end-user is party”. This will allow the ECS provider to process metadata not directly related to billing through provisions in the contract with the end-user. Service offerings by ECS providers are generally moving towards simpler products with increased reliance on flat rate tariffs, which should reduce the processing and storage of metadata necessary for billing purposes. These privacy benefits will be lost with the Council text.

In December 2017, the Council proposed further processing of metadata without consent for scientific research or statistical purposes based on Union or Member State law. Despite the mandatory safeguards, which include encryption and pseudonymisation, this is a very problematic amendment since a potentially large amount of metadata, which would otherwise be deleted or anonymised, will be retained and stored in identifiable form. Data breaches and law enforcement access are two very specific data protection risks created by this amendment.

The latest text from the Austrian Presidency (Council document 10975/18) goes even further than this by proposing a new general provision for further processing of metadata for compatible purposes inspired by Article 6(4) of the GDPR. This comes very close to introducing “legitimate interest” as a legal basis for processing metadata by the ECS provider, something that has previously been ruled out because metadata for electronic communications is comparable to sensitive personal data under the case law of the Court of Justice of the European Union (CJEU). GDPR Article 9 does not permit the processing of sensitive personal data with legitimate interest as the legal basis. In March 2018, the former Bulgarian Presidency specifically noted that it is highly doubtful whether a non-specific provision for permitted processing would, given the sensitive nature of the data involved, be in line with the case-law of the CJEU.

The LIBE Committee adopted amendments to ensure that electronic communications content was protected under the ePrivacy Regulation during transmission and if the content is subsequently stored by the ECS provider. This is important because storage of electronic communications content is an integral part of many modern electronic communications services, such as webmail and messenger services. However, the Council amendments limit the protection under the ePrivacy Regulation to the transmission of the communication, a period which may be a fraction of a second. After the receipt of the message, the processing falls under the GDPR which could allow processing of personal data in electronic communications content (such as scanning email messages) based on legitimate interest rather than consent of the end-user. As suggested by the Council recital, the end-user can avoid this by deleting the message after receipt, but this would entirely defeat the purpose of many modern electronic communications services.

In Article 8 of the draft ePrivacy Regulation, the LIBE Committee adopted a general ban on tracking walls. This refers to the practice of making access to a website dependent on end-user consent to processing of personal data through tracking cookies (or device fingerprinting) that is not necessary for the provision of the website service requested by the end-user. This practice is currently widespread since many websites display cookie consent banners where it is only possible to click ‘accept’ or ‘OK’.

The Council text goes in the opposite direction with proposed wording in a recital which authorises tracking walls, in particular if a payment option is available that does not involve access to the terminal equipment (e.g. tracking cookies). This amounts to a monetisation of fundamental rights, as EU citizens will be forced to decide whether to pay for access to websites with money or by being profiled, tracked and abandoning their fundamental right to protection of personal data. This inherently contradicts the GDPR since consent to processing of personal data can become the counter-performance for access to a website, contrary to the aim of Article 7(4) of the GDPR.

Finally, the latest text from the Austrian Presidency proposes to completely delete Article 10 on privacy settings. Article 10 requires web browsers and other software permitting electronic communications to offer privacy settings which prevent third parties from accessing and storing information in the terminal equipment, and to inform the end-user of these privacy settings when installing the software. An example of this could be an option to block third party cookies in web browsers. Such privacy settings are absolutely critical for preventing leakage of personal data to unwanted third parties and for protecting end-user privacy when consent to tracking is coerced through tracking walls. The recent Cambridge Analytica scandal should remind everyone, including EU Member States’ governments, of the often highly undesirable consequences of data disclosures to unknown third parties.
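The “block third party cookies” setting mentioned above hinges on one technical question: does a resource request come from the same site as the page the user is visiting? The sketch below illustrates that decision under a simplifying assumption – it compares the last two labels of each hostname, whereas real browsers consult the Public Suffix List to determine the registrable domain. The function names and example hosts are illustrative, not taken from any browser’s code.

```typescript
// Simplified third-party check, as a browser's "block third-party
// cookies" setting might apply it. Assumption: the registrable domain
// is the last two labels of the host; real browsers use the Public
// Suffix List, which handles suffixes like "co.uk" correctly.

function registrableDomain(host: string): string {
  const labels = host.split(".");
  return labels.slice(-2).join(".");
}

// A request is third-party when its registrable domain differs from
// that of the page the user actually navigated to.
function isThirdParty(pageHost: string, requestHost: string): boolean {
  return registrableDomain(pageHost) !== registrableDomain(requestHost);
}

// A tracker embedded in a news page is third-party, so its cookies
// would be rejected under such a privacy setting.
console.log(isThirdParty("news.example.org", "tracker.adnetwork.com")); // true
// The site's own CDN shares the registrable domain: first-party.
console.log(isThirdParty("news.example.org", "cdn.example.org")); // false
```

Under Article 10 as adopted by the LIBE Committee, software would have to offer (and default to) exactly this kind of refusal of third-party access.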

If Article 10 is deleted, it will be possible to offer software products that are set to track and invade individuals’ confidential communications by design and by default, with no possibility for the individual to change this by selecting a privacy-friendly option that blocks data access by third parties. This goes in the complete opposite direction of the LIBE report, which contains amendments to strengthen the principle of privacy by design by requiring that access by third parties is prevented by default, and that upon installation the end-user is asked to either confirm this or select another, possibly less privacy-friendly, option.

The rationale for deleting Article 10 given by the Austrian Presidency is the burden on software vendors and consent fatigue for end-users. The latter is somewhat ironic since technical solutions, such as genuine privacy by design requirements and innovative ways to give or refuse consent, like a mandatory Do Not Track (DNT) standard, are needed to reduce the number of consent requests in the online environment. The Council amendments for articles 8 and 10 would aggravate the current situation, where end-users on countless websites are forced to give essentially meaningless consent to tracking because the cookie banner only provides the option of clicking ‘accept’.
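A mandatory DNT standard would reduce consent requests because a service could honour a machine-readable signal instead of showing a banner. The `DNT: 1` request header is the real signal defined by the W3C Tracking Preference Expression draft; the handler shape and function name below are illustrative assumptions, not part of any standard.

```typescript
// Sketch: a server honouring the Do Not Track signal instead of
// prompting via a cookie banner. HTTP header names are
// case-insensitive and arrive lowercased in many server frameworks,
// hence the "dnt" key; the function name is an illustrative assumption.

type RequestHeaders = Record<string, string | undefined>;

// Under a legally binding DNT standard, "DNT: 1" would count as a
// refusal of consent: no tracking cookie may be set, no banner needed.
function trackingAllowed(headers: RequestHeaders): boolean {
  return headers["dnt"] !== "1";
}

console.log(trackingAllowed({ dnt: "1" })); // false – the user refused tracking
console.log(trackingAllowed({})); // true – no refusal signal was sent
```

Note that the LIBE report’s privacy-by-default approach would go further still: absent any signal, the default would be no tracking, rather than the permissive fallback sketched here.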

If the ePrivacy amendments in 10975/18 and earlier Council documents are adopted as the general approach, Council will enter trialogue negotiations with a position that completely undermines the ePrivacy Regulation by watering down all provisions which provide stronger protection than the GDPR. This would put a lot of pressure on the European Parliament negotiators to defend the privacy rights of European citizens. For telecommunications services, which presently enjoy the strong protection of the ePrivacy Directive, the lower level of protection will be particularly severe, even before considering the dark horse of mandatory data retention that EU Member States are trying to uphold, in part through amendments to the ePrivacy Regulation.

EDRi, along with EDRi members Access Now, Privacy International and IT-Pol Denmark, has communicated its concerns about the proposed Council amendments through letters to WP TELE, as well as at a civil society meeting with Council representatives on 31 May 2018, organised by the Dutch Permanent Representation and the Bulgarian Council Presidency.

Read more:

e-Privacy: What happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)
https://edri.org/eu-member-states-fight-to-retain-data-retention-in-place-despite-cjeu-rulings/

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)
https://edri.org/eprivacy-civil-society-letter-calls-to-ensure-privacy-and-reject-data-retention/

Civil society calls for protection of communications confidentiality (13.06.2018)
https://edri.org/civil-society-calls-for-protection-of-communications-confidentiality/

Civil society letter to WP TELE on the ePrivacy amendments in Council document 10975/18 (13.07.2018)
https://edri.org/civil-society-calls-for-protection-of-privacy-in-eprivacy/

(Contribution by Jesper Lund, EDRi member IT-Pol)

27 Jun 2018

NCC publishes a report on tech companies’ use of “dark patterns”

By Maria Roson

Today, the Norwegian Consumer Council (NCC), a consumer group active in the field of digital rights, published a report on how default settings and “dark patterns” are used by tech companies such as Facebook, Google and Microsoft to nudge users towards privacy-intrusive options.

The term “dark patterns” refers to practices used to deliberately mislead users through exploitative nudging. The NCC describes them as “features of interface design crafted to trick users into doing things that they might not want to do, but which benefit the business in question, or in short, nudges that may be against the user’s own interest”.

The General Data Protection Regulation (GDPR) requires services to be developed according to the principles of data protection by design and data protection by default, and obliges companies to make lawful use of their users’ data. With the entry into application of the GDPR last May, the three companies had to update the conditions of use of their services, which they did by using a wide variety of “dark patterns”. The report focuses on five of them, which overlap with each other and together form the big picture of how companies mislead users into “choosing” invasive instead of data protection-friendly options. This is done by putting in place the following mechanisms:

1. Default settings

Facebook and Google hide and obscure the privacy settings, making the most intrusive options the easiest and most visible ones for the user to accept.

2. Taking users by the hand to mislead them

Usually, the services push users to accept unnecessary data collection through a combination of positioning and visual cues. Facebook and Google go a step further by requiring a much larger number of steps to limit data collection, in order to discourage users from protecting themselves.

3. Invasive options go first

All three companies presented the settings that maximise data collection as the positive option, creating doubts and even ethical dilemmas for the user. The companies do not explain the full consequences of these choices, but frame their messages around the theoretical upsides of allowing wider data collection, such as an improved user experience.

4. Rewards and punishments

A typical nudging strategy is to use incentives to reward the “right” choice and punish choices that the service provider deems undesirable. The reward is often described as “extra functionality” or a “better service” (without making clear what this means in practice), while the punishment might be the loss of functionality or the deletion of the account if the user declines, which has been the strategy of both Facebook and Google.

5. Time pressure

When it came to completing the settings review, all three services pressured the user to complete it at a time determined by the service provider. This was done without a clear option for the user to postpone the review, and without making clear whether the user could still use the service in the meantime.

The report concludes that these service providers merely give users the “illusion of control” while nudging them towards the options most desirable for the companies.

Read more:

DECEIVED BY DESIGN: How tech companies use dark patterns to discourage us from exercising our rights to privacy (27.06.2018)
https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf

GDPR: noyb.eu filed four complaints over “forced consent” against Google, Instagram, WhatsApp and Facebook (25.05.2018)
https://noyb.eu/wp-content/uploads/2018/05/pa_forcedconsent_en.pdf

GDPR explained
https://gdprexplained.eu/

(Contribution by Maria Roson, EDRi intern)
