05 Oct 2017

Dear MEPs: We need you to protect our privacy online!

By EDRi

They’re hip, they’re slick and they follow you everywhere. They know you like new shoes, playing tennis and tweeting at odd hours of the morning. Do you know what that says about your health, your relationships and your spending power? No? Well, the online companies do. They follow you everywhere you go online, they have a perfect memory, they know the sites you visited last year even if you’ve forgotten… Look who’s stalking.

European legislation protecting your personal data was updated in 2016, but the battle to keep it safe is not over yet. The European Union is revising its e-Privacy rules. We welcomed the European Commission (EC) proposal as a good starting point, but with room for improvement. The online tracking industry is lobbying fiercely against it. Online tracking and profiling gave us filter bubbles and echo chambers. Yet the lobbyists lobby for it under the pretext of “saving the internet”, “protecting quality journalism” – even “saving democracy”.

The European Parliament is currently debating its position on the EC proposal. Some Members of the European Parliament (MEPs) support “tracking business, as usual” while others support a strong future-proof norm to protect the privacy, innovation and security of future generations of EU citizens and businesses.

Priorities for defending privacy and security:

1) Protect confidentiality of our communications – both in transit and at rest!
Confidentiality of communications needs to be protected both in transit and when communications are stored. Lobbyists have been campaigning for a technicality that would allow them to read and exploit your emails stored in the cloud. (Art. 5)

2) Protect our privacy: Do not add loopholes to security measures!
A “legitimate interest” exception was not included in any version of the previous e-Privacy Directives. Introducing one now would be a major weakening of the legislation compared with the existing rules. Our member Bits of Freedom wrote about the problems with “legitimate interest” here. (several Articles and Recitals)

3) Do not let anyone use our data without asking for our consent!
It is crucial to keep consent as the legal ground for processing communications data. Neither “legitimate interest” nor “further processing” should be allowed to weaken the security and privacy of European citizens and businesses. (Art. 6)

4) Privacy should not be an option – what we need is privacy by default!
Provisions about default privacy settings need to be strengthened and improved, certainly not watered down or deleted. e-Privacy must ensure “privacy by design and by default” and not, as in the EC proposal, “privacy by option”. You can find our specific proposals here. The European Parliament previously adopted a Directive that criminalises unauthorised access to computer systems. It would be completely incoherent if it were to adopt legislation that foresees default settings that do not protect against unauthorised access to devices. (Art. 10)

5) No new exceptions to undermine our privacy!
Exceptions for Member States cannot become a carte blanche that renders the e-Privacy Regulation useless. Therefore, the safeguards established by the Court of Justice of the European Union in its case law on the exceptions in the relevant sections of the e-Privacy Regulation should be diligently respected – the scope of the exceptions should not be expanded. (Art. 11)

6) Do not undermine encryption!
Imposing a ban on undermining or attacking encryption should be a priority.

7) Protect our devices (hardware+software) by design and by default!
Hardware and software security need to be protected by design and by default.

MEPs, protect our #ePrivacy – Support amendments that follow the principles listed above!

e-Privacy revision: Document pool
https://edri.org/eprivacy-directive-document-pool/

e-Privacy: Consent (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_consent.pdf

e-Privacy: Legitimate interest (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_legitimate-interest.pdf

e-Privacy: Privacy by design and by default (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_privacy-by-default.pdf

e-Privacy: Offline tracking (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_offline-tracking.pdf

Your privacy, security and freedom online are in danger (14.09.2016)
https://edri.org/privacy-security-freedom/

Five things the online tracking industry gets wrong (13.09.2017)
https://edri.org/five-things-the-online-tracking-industry-gets-wrong/

ePrivacy Regulation: Call a representative and make your voice heard!
https://eprivacy.laquadrature.net/-piphone/

Who’s afraid of… e-Privacy? (04.10.2017)
https://medium.com/@privacyint/whos-afraid-of-e-privacy-7969a1cfe776

04 Oct 2017

ENDitorial: Tinder and me: My life, my business

By Maryant Fernández Pérez

Tinder is one of the many online dating companies of the Match Group. Launched in 2012, Tinder became profitable in 2015, largely thanks to its users’ personal data. On 3 March 2017, journalist Judith Duportail asked Tinder to send her all the personal data it had collected about her, including her “desirability score”, which is composed of the “swipe-left-swipe-right” ratio and many other pieces of data and mathematical formulae that Tinder does not disclose. Thanks to her determination and the support of lawyer Ravi Naik, privacy expert Paul-Olivier Dehaye and the work of Norwegian consumer advocates, Duportail reported on 27 September 2017 that she had received 800 pages about her online dating-related behaviour.


Tinder did not, however, disclose how desirable the company considered Duportail to be, even though it had disclosed that score to another journalist. The 800 pages contained information such as her Facebook “likes”, her Instagram pictures (even though she had deleted her account), her education, how many times she had connected to Tinder, when and where she had entered into online conversations, and much more. “I was amazed by how much information I was voluntarily disclosing”, Duportail stated.

800 pages of personal data – surprising?

As a Tinder user, you should know that you “agree” to Tinder’s terms of use, privacy policy and safety tips, as well as other terms disclosed if you purchase “additional features, products or services”. These include the following:

  • “You understand and agree that we may monitor or review any Content you post as part of a Service.”
  • “If you chat with other Tinder users, you provide us the content of your chats.”
  • “We do not promise, and you should not expect, that your personal information, chats, or other communications will always remain secure.”
  • “By creating an account, you grant to Tinder a worldwide, transferable, sub-licensable, royalty-free, right and license to host, store, use, copy, display, reproduce, adapt, edit, publish, modify and distribute information you authorize us to access from Facebook, as well as any information you post, upload, display or otherwise make available (collectively, ‘post’) on the Service or transmit to other users (collectively, ‘Content’).”
  • “You agree that we, our affiliates, and our third-party partners may place advertising on the Services.”
  • “If you’re using our app, we use mobile device IDs (the unique identifier assigned to a device by the manufacturer), or Advertising IDs (for iOS 6 and later), instead of cookies, to recognize you. We do this to store your preferences and track your use of our app. Unlike cookies, device IDs cannot be deleted, but Advertising IDs can be reset in “Settings” on your iPhone.”
  • “We do not recognize or respond to any [Do Not Track] signals, as the Internet industry works toward defining exactly what DNT means, what it means to comply with DNT, and a common approach to responding to DNT.”
  • “You can choose not to provide us with certain information, but that may result in you being unable to use certain features of our Service.”
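The “Do Not Track” clause quoted above refers to a signal that users can enable in their browser and that websites may choose to honour or, as in this case, to ignore. As a purely illustrative sketch – not Tinder’s code, and assuming a browser that still exposes the non-standard navigator.doNotTrack property – a service that wanted to respect the signal could check it before loading any tracking scripts:

```typescript
// Illustrative sketch only: reading the Do Not Track preference that the
// quoted policy says is not honoured. Browsers that support DNT expose it as
// "1" (do not track), "0", or null when no preference has been expressed.
function userHasEnabledDoNotTrack(): boolean {
  // Cast needed because the property is non-standard and absent from some DOM typings.
  const dnt = (navigator as any).doNotTrack ?? (window as any).doNotTrack;
  return dnt === "1" || dnt === "yes";
}

function maybeLoadTracker(injectScript: (url: string) => void): void {
  if (userHasEnabledDoNotTrack()) {
    return; // Respect the signal: no tracking script is loaded.
  }
  injectScript("https://tracker.example/collect.js"); // hypothetical tracker URL
}
```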

Tinder explains in its Privacy Policy – but not in the summarised version of the terms – that you have a right to access and correct your personal data. What is clear to the company is that you “voluntarily” provided your information (and that of others). Duportail received part of the information Tinder and its business partners hold, no doubt partly because she is a journalist. Her non-journalist friends have not experienced the same benevolence. Your personal data affects not only your online dates, “but also what job offers you have access to on LinkedIn, how much you will pay for insuring your car, which ad you will see in the tube and if you can subscribe to a loan”, as Paul-Olivier Dehaye highlights.

Worse still, even if you close your account or delete information, Tinder and its business partners do not necessarily delete it. And worst of all, you’ve “agreed” to it: “If you close your account, we will retain certain data for analytical purposes and recordkeeping integrity, as well as to prevent fraud, enforce our Terms of Use, take actions we deem necessary to protect the integrity of our Service or our users, or take other actions otherwise permitted by law. In addition, if certain information has already been provided to third parties as described in this Privacy Policy, retention of that information will be subject to those third parties’ policies.”

You should be in control

Civil society organisations fight these kinds of practices to defend your rights and freedoms. For instance, the Norwegian Consumer Council successfully pushed Tinder to change its terms of service. On 9 May 2017, EDRi and its member Access Now raised awareness about period trackers, dating apps like Tinder or Grindr, sex extortion via webcams and the “internet of (sex) things” at the re:publica 17 conference. Ultimately, examples like Duportail’s show the importance of having strong EU data protection and privacy rules. Under the General Data Protection Regulation, you have a right to access your personal data, and companies must provide privacy by design and by default in their services. Now, we are working on the e-Privacy Regulation to ensure, among many other things, that consent means real consent rather than a tick in a box for terms you never read, and that companies cannot track you unless you give express and specific consent.

Now that you know about this or have been reminded of this, spread the word! It does not matter whether you are on Tinder or not. This is about your online future.

I asked Tinder for my data. It sent me 800 pages of my deepest, darkest secrets (26.09.2017)
https://www.theguardian.com/technology/2017/sep/26/tinder-personal-data-dating-app-messages-hacked-sold

Getting your data out of Tinder is really hard – but it shouldn’t be (27.09.2017)
https://www.theguardian.com/technology/2017/sep/27/tinder-data-privacy-tech-eu-general-data-protection-regulation

Safer (digital) sex: pleasure is just a click away (09.05.2017)
https://re-publica.com/en/17/session/safer-digital-sex-pleasure-just-click-away

Tinder bends for consumer pressure (30.03.2017)
https://www.forbrukerradet.no/siste-nytt/tinder-bends-for-consumer-pressure

(Contribution by Maryant Fernández Pérez, EDRi)

04 Oct 2017

The privacy movement and dissent: Art

By Guest author

This is the third blogpost of a series, originally published by EDRi member Bits of Freedom, that explains how the activists of a Berlin-based privacy movement operate, organise, and express dissent. The series is inspired by a thesis by Loes Derks van de Ven, which describes the privacy movement as she encountered it from 2013 to 2015.*


Although there are relatively few privacy movement members involved in the actual process of creating art, it does affect the movement as a whole. Art reflects the movement’s beliefs and is used as a weapon of resistance against injustice.

The two art projects of the privacy movement that will be introduced in this article are Panda to Panda and Anything to Say?. Both share a number of features that belong to activist art in general. One of these features is the way activist art comes into being: the art that activists create almost always stems from personal experiences and seeks to draw attention to and gain recognition for those experiences. In addition, it problematises authority, domination and oppression, and seeks to alter the current situation. Moreover, activists want their work to evoke emotion and provoke intellectually, and they aim to form a community among those who share a similar aversion to oppression.

Panda to Panda (2015) is part of a larger project called Seven on Seven, initiated by Rhizome, the influential platform for new media art affiliated with the New Museum in New York City. Each year, Rhizome matches seven artists with seven technologists. In 2015, one of the pairs Rhizome invited to participate was Ai Weiwei and Jacob Appelbaum. The result of their collaboration, Panda to Panda, consists of twenty stuffed pandas whose stuffing has been replaced with shredded documents that Glenn Greenwald and Laura Poitras received from Edward Snowden. In addition, a micro SD card containing the documents has been placed inside each panda. Distributed to as many places as possible, the pandas function as a “distributed backup” that is difficult to destroy, since destroying it would mean destroying all twenty objects. The project was documented by Ai, who shared the images with his followers on social media. Laura Poitras was invited to film the process and eventually published the film in the online edition of The New York Times.

Panda to Panda is an example of ethico-political subversion, in which authority is undermined in a number of ways. First, the project in its totality is a complaint against government surveillance and state power. As Ai, Appelbaum, and Poitras were working on the project, they continuously filmed each other. With the constant filming they emphasise and visualise the surveillance they are under: while they film each other, they are also watched by the surveillance cameras placed in front of Ai’s studio by the Chinese authorities. There is a constant awareness of always being under watch.

Second, the pandas also have a symbolic meaning. From Appelbaum’s frame of reference, Panda to Panda is a variation on peer-to-peer communication, a means of communication in which there is no hierarchy and that allows all peers to interact in an equal way. This system is seen as a philosophy of egalitarian human interaction on the internet. This reference also materialises the goals of the movement. From Ai’s frame of reference, the pandas satirically reference popular culture: in China, the secret police, the “government spies” that also monitor Ai, are often referred to as pandas.

Anything to Say? A Monument of Courage (2015) is a life-size bronze sculpture by American author Charles Glass and Italian artist Davide Dormino. The sculpture portrays three people: Julian Assange, Edward Snowden, and Bradley Manning (who is now Chelsea Manning). Each of the three stands on a chair; a fourth chair is left empty. This fourth chair is meant for other individuals to stand on, to enable them to stand with the whistleblowers and freely express themselves. Anything to Say? has its own Twitter account where followers can follow the realisation, unveiling and journey of the sculpture. The sculpture has never been placed in a typical museum context: it was unveiled at Alexanderplatz in Berlin and has been travelling since.

An analysis of Anything to Say? demonstrates a number of ways in which art functions to strengthen the privacy movement. Taking a stand and expressing your thoughts does not come naturally to everyone; it takes a certain amount of courage – as the sculpture’s subtitle A Monument of Courage indicates. By inviting individuals to stand on the fourth, empty chair, the sculpture encourages them to do the same as the whistleblowers: to step out of their comfort zone and become visible. Young or old, rich or poor, German or not, part of the movement or not: the sculpture gives the audience a reason to connect. Furthermore, here as in the case of Panda to Panda, the artwork embodies some of the beliefs of the privacy movement, informing individuals both within and outside of the movement.

Anything to Say? not only highlights the importance of freedom of speech and freedom of information; it also stems from the personal experiences of whistleblowers and shows great respect for them. It encourages the audience to show the same courage as Assange, Snowden and Manning have shown, and it is also a sign of gratitude towards them. Furthermore, while the sculpture itself represents the movement’s ideas and values, by asking members of the audience to stand on the chair and express themselves it actually practices free speech, and thereby enacts one of the privacy movement’s aims.

Activist art is a valuable way for the privacy movement to express what it stands for. Although only a relatively small group of activists within the movement actually creates art, it affects the movement as a whole: it encourages its members, allows them to experience both their own and the group’s strength, and its personal character reinforces the unity within the movement. In the next article of this series, protest as an expression of the privacy movement’s dissent will be explored.

The series was originally published by EDRi member Bits of Freedom at https://www.bof.nl/tag/meeting-the-privacy-movement/.

Dissent in the privacy movement: whistleblowing, art and protest (12.07.2017)
https://edri.org/dissent-in-the-privacy-movement-whistleblowing-art-and-protest/

The privacy movement and dissent: Whistleblowing (23.08.2017)
https://edri.org/the-privacy-movement-and-dissent-whistleblowing/

(Contribution by Loes Derks van de Ven; Adaptation by Maren Schmid, EDRi intern)

* This research was finalised in 2015 and does not take into account the changes within the movement that have occurred since then.

04 Oct 2017

Tear down the tracking wall

By Bits of Freedom

It has become a daily routine: “consenting to” being tracked, on the basis of meaningless explanations (or no explanation at all) before you’re allowed access to a website or online service. It’s about time to set limits to this tracking rat race.


An ever-growing portion of our personal and professional communication, our news consumption and our contact with government is mediated through the internet. Access to online information and services is crucial to participating in today’s society. Yet, on a daily basis we are forced to allow ourselves to be tracked – across multiple websites and apps, and across several devices – before we’re given access to information or digital services.

The infamous cookie walls you encounter when visiting websites are a prime example of this. If you want to get past that wall, you first have to consent to having your online behaviour minutely tracked. To be clear, we are not talking about the cookies that are necessary to, for example, store your settings or gather statistics about the use of a website in a privacy-friendly manner. We are talking about all those trackers that usually originate from multiple parties other than the website you intended to visit, and that continue to track your behaviour across the internet.
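To make the distinction concrete, the sketch below contrasts a strictly functional first-party cookie with third-party tracking scripts that are only injected after genuine, freely given consent – and never as a condition for reading the page. It is an illustration only; the function names and the tracker domain are invented for the example:

```typescript
// Illustrative sketch, not a real consent-management tool.

// A functional first-party cookie (e.g. remembering a language setting)
// stays with the site you visit and does not follow you around the web.
function rememberLanguage(lang: string): void {
  document.cookie = `lang=${encodeURIComponent(lang)}; path=/; max-age=31536000`;
}

// Third-party trackers, by contrast, would only be injected after an explicit
// opt-in, and the content remains accessible either way – no tracking wall.
function loadThirdPartyTracker(consentGiven: boolean): void {
  if (!consentGiven) {
    return; // No consent: the page still works, nothing is loaded.
  }
  const script = document.createElement("script");
  script.src = "https://ads.tracker.example/track.js"; // hypothetical domain
  document.head.appendChild(script);
}
```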

Issues with tracking

Tracking raises many concerns. First of all, while we become ever more transparent to online tracking companies, many of the current practices, and the parties employing them, remain highly opaque. We do not know how much of our online activity is registered, analysed and used, by how many different parties and for what purposes, or what inferences about our activities are drawn.

Secondly, the information collected through trackers makes us susceptible to manipulation – indeed, that is the usual purpose. This can have serious consequences for the power (im)balances between citizens and consumers on the one hand and governments, corporations and other organisations that have access to this data on the other. Just think of the instrumental role tracking plays in micro-targeted political advertising, price discrimination or exploiting the cognitive biases and specific weaknesses of individual users.

Third, the data gathered through tracking is increasingly used for making decisions about us. For example, the answer to whether you have access to credit and under what terms may depend on such data. This often happens under the cloak of long terms full of legalese you consented to which provide you no meaningful transparency. Even if you are aware that data about you is being used for making automated decisions, it is hard to challenge the inaccuracy of such decisions or the data they rely on.

An often-heard response is that you are free to withhold your consent to being tracked. That is correct in theory, but much harder in the real world. In our daily lives it is often a choice between limited or no access at all, or subjecting yourself to opaque tracking. This is particularly problematic when the information or services you would like to access are provided by public institutions, health service providers or organisations that play an important role in society and that you therefore cannot simply avoid.

Think for instance of public institutions such as the Tax Administration, but also hospitals, health insurance companies, banks or internet access providers. By making access to their services conditional on your consent to being tracked, your consent becomes involuntary and essentially meaningless. This practice has to stop.

As a user you should be able to gather information and use services without being forced to consent to being tracked. And why shouldn’t we take it one step further and put an end to tracking walls for all the online information and services that we use?

What will the EU do?

At this very moment, European Union institutions are working on an overhaul of the specific privacy rules for electronic communications: the e-Privacy Regulation. Who is permitted to read your messages, are tracking walls allowed, and may your phone be used to map your physical location without your consent? These are some of the important questions these new rules address. They will have a substantial impact on all internet users across the EU.

This overhaul of the rules offers an excellent opportunity to tear down tracking walls for all of Europe. EDRi’s Brussels office and EDRi members are not the only ones advocating for this. Europe’s data protection authorities also recommend putting an end to this practice. In October 2017, the European Parliament will vote on the new rules proposed by the European Commission and the hundreds of amendments that have been submitted by different Members of the European Parliament (MEPs). Will the rights of internet users be safeguarded, and will we get a digital environment free from opaque tracking practices?

This is a shortened version of an article originally published by EDRi member Bits of Freedom: https://bof.nl/2017/09/20/tear-down-the-tracking-wall/.

(Contribution by David Korteweg, EDRi member Bits of Freedom, the Netherlands; Adaptation by Maren Schmid, EDRi intern)

04 Oct 2017

TiSA impact assessment report ignores crucial human rights concerns

By Ana Ollo

In 2013, the European Commission decided to subject the draft Trade in Services Agreement (TiSA) to a Trade Sustainability Impact Assessment (SIA) in support of the negotiations. The Final Report, which was published in July 2017, fails to address several key fundamental rights concerns.

The report was conducted by the consultancy Ecorys and the Centre for Economic Policy Research (CEPR). The aim was to evaluate how TiSA’s provisions under negotiation could affect economic, social and human rights, as well as environmental issues, in the EU and in other TiSA parties and selected third countries.


The report went through various review processes among stakeholders, to which EDRi responded on three occasions. The draft that preceded the final report was published in May 2017. In June 2017, EDRi submitted comments regarding both the draft and its Annexes.

We welcome certain parts of the final report. It clearly says that there is a lack of evidence of meaningful barriers to e-commerce. In fact, it states that barriers to e-commerce identified by industry groups “are not necessarily the true barriers to e-commerce”. In addition, the report makes a distinction “between the true underlying barriers and the barriers that are reported” by industry, industry associations or individual stakeholders. It argues that “in the absence of robust evidence on policy impact and effectiveness […] it is tempting to rely on the input and suggestions of interest groups and stakeholders”, which leads to “the usual risk of being beholden to special interests or to be lost in a mosaic of different opinions, concerns and suggestions”.

Despite these important recognitions, the report still has at least three major problems:

First, the analysis overlooked several key human rights concerns. Freedom of expression and opinion was disregarded, despite its relevance in the context of TiSA, especially for potential provisions on intermediary liability and net neutrality proposed by some TiSA countries. To address these points, we suggested including an impact assessment of the lack of human rights commitments by TiSA parties.

Secondly, the report refers to data protection and privacy as “issues”, rather than fundamental rights that must be respected. Indeed, the failure to protect them constitutes a barrier to trade and not the opposite. In our comments, we pointed out that both the European Commission and the European Parliament have stated on several occasions that such rights cannot be subject to negotiations in trade agreements, and that this needs to be taken into account. Furthermore, we highlighted that the Final Report should not assess the data protection situation only from an EU perspective, as the different TiSA parties have a variety of commitments in this regard.

Thirdly, the report contradicts itself with regard to data flows. While it acknowledges the lack of evidence of meaningful barriers to e-commerce, it states in its human rights assessment that “the issue of data flows […] is particularly relevant”, without indicating what it may be relevant to. In the same vein, the report does not present evidence of the ostensible problems related to data flows, yet it also says that “limitations to the free flow of data” are “a key concern for e-commerce”. Finally, it identifies the movement of people as the biggest trade barrier for computer services and telecommunications, but then states that “the core issue” is the free flow of data. The report warns about the risks of lacking robust evidence, yet in this matter it is clear that precisely such a problem affected the assessment.

Despite all the concerns highlighted on several occasions, when the final Report was published, we learned that almost all of our suggestions and remarks had been disregarded. This is regrettable, as an independent academic study by the University of Amsterdam “Trade and Privacy: Complicated Bedfellows? How to achieve Data Protection-Proof Free Trade Agreements” (Irion, K., S. Yakovleva, and M. Bartl, 2016), that is even cited in the Final Report, shows that the EU has homework to do to bring trade agreements in line with EU law.

EDRi’s response to the Trade SIA consultation (02.06.2017)
https://edri.org/files/consultations/tsia_tisa_draftfinalreport_edricomments_20170602.pdf

EDRi’s input to the Draft Interim Technical Report “Trade SIA in support of negotiations on a plurilateral Trade in Services Agreement (TiSA)” (27.01.2017)
https://edri.org/files/TiSA/ecorysdraftinterimreport_edriinput_20170127.pdf

EDRi’s response to the Ecorys Survey on TiSA commissioned by the European Commission (15.03.2016)
https://edri.org/files/TiSA/TiSA_ecoryssurvey_EDRiresponse.pdf

EDRi’s position paper on TiSA (01.2016)
https://edri.org/files/TiSA_Position_Jan2016e.pdf

Documents regarding TiSA’s Trade Sustainability Impact Assessment since 2013
http://ec.europa.eu/trade/policy/policy-making/analysis/policy-evaluation/sustainability-impact-assessments/#study-geo-19

Trade Sustainability Impact Assessment – Final Report (07.2017)
www.trade-sia.com/tisa/wp-content/uploads/sites/7/2014/02/TiSA-Final-Report.pdf

Trade Sustainability Impact Assessment – Annexes to the Final Report (07.2017)
www.trade-sia.com/tisa/wp-content/uploads/sites/7/2014/02/TiSA-Final-Report-Annexes.pdf

(Contribution by Ana Ollo, EDRi intern)

20 Sep 2017

Human Rights Court sets limits on right to monitor employees

By Anne-Morgane Devriendt

On 5 September 2017, the Grand Chamber of the European Court of Human Rights (ECtHR) ruled on the Bărbulescu v. Romania case. It found that there had been a breach of the right to respect for private and family life and correspondence (Article 8 of the European Convention on Human Rights), as claimed by Mr Bărbulescu. Mr Bărbulescu was fired after his employer monitored his communications and found that he had used company property to exchange messages with family members. Although the ruling does not forbid employee monitoring, it clarifies how monitoring can be carried out in a way that respects fundamental rights.


The Grand Chamber questioned the earlier national court decisions. It noted that national courts did not properly assess whether Mr Bărbulescu had been warned that he might be monitored, and to what extent he would be monitored. The Court also clarified the limits regarding legal monitoring of an employee by their employer and the ways national courts should assess them.

First, one of the key aspects the Court pointed out was the lack of information given to Mr Bărbulescu about the monitoring to which he might be subject. Second, the Court ruled that, in addition to the obligation to provide information, the monitoring of employees must always pursue a legitimate aim, be proportionate to that aim, and not intrude on their privacy more than is necessary to achieve it. None of these safeguards had been respected in this case, as the Court pointed out in paragraph 140 of its ruling: “the domestic courts failed to determine, in particular, whether the applicant had received prior notice from his employer of the possibility that his communications on Yahoo Messenger might be monitored; nor did they have regard either to the fact that he had not been informed of the nature or the extent of the monitoring, or to the degree of intrusion into his private life and correspondence. In addition, they failed to determine, firstly, the specific reasons justifying the introduction of the monitoring measures; secondly, whether the employer could have used measures entailing less intrusion into the applicant’s private life and correspondence; and thirdly, whether the communications might have been accessed without his knowledge”.

It needs to be stressed that the ruling does not find the monitoring of employees’ communications illegal in all situations, but it does limit the power to monitor employees. The judgement restricts employers’ monitoring of employees’ communications by requiring a legitimate justification, proportionality, and limits on the scope and degree of intrusion. All of these safeguards should have been applied in this case and must be applied in any similar case in the future. The Court clarified that employees continue to enjoy their right to private and family life in the workplace.

Press release for the Grand Chamber judgement (05.09.2017)
http://hudoc.echr.coe.int/eng?i=003-5825428-7419362

Romanian whose messages were read by employer “had privacy breached” (05.09.2017)
https://www.theguardian.com/law/2017/sep/05/romanian-chat-messages-read-by-employer-had-privacy-breached-court-rules

Privacy International response to Grand Chamber of the European Court for Human Rights Bărbulescu v. Romania judgement (05.09.2017)
https://medium.com/@privacyint/privacy-international-response-to-grand-chamber-of-the-european-court-for-human-rights-barbulescu-v-cc722b73086b

(Contribution by Anne-Morgane Devriendt, EDRi intern)

18 Sep 2017

Cross-border access to data: EDRi delivers international NGO position to Council of Europe

By EDRi

Today, 18 September 2017, a global coalition of civil society organisations, led by European Digital Rights (EDRi), submitted to the Council of Europe its comments on how to protect human rights when developing new rules on cross-border access to electronic evidence (“e-evidence”). The Council of Europe is currently preparing an additional protocol to the Cybercrime Convention. EDRi’s Executive Director Joe McNamee handed the comments over to Mr. Alexander Seger, the Executive Secretary of the Cybercrime Convention Committee (T-CY) of the Council of Europe.

Joe McNamee, Executive Director of EDRi presents Alexander Seger with his contribution on the forthcoming Cybercrime Protocol. (Photo: Candice Imbert / Council of Europe)

Over the next two and a half years, the work on the new protocol needs to incorporate the civil society principles presented today,

said Joe McNamee, Executive Director of European Digital Rights.

Global civil society is engaging in this process to ensure that any harmonisation in this crucial policy area is up to the highest human rights standards, in line with the ethos of the Council of Europe,

he added.

We are a group of 14 civil society organisations from around the world. We submitted to the Council of Europe our comments and suggestions on the Terms of Reference for drafting a Second Protocol to the Cybercrime Convention. Our aim is to make sure that human rights are fully respected in the preparation of the new protocol. In this global submission, we emphasise the importance of an inclusive, open and transparent drafting process. To facilitate the work of the Council of Europe and the State-Parties, we have elaborated key principles that will guide the work of the Drafting Group and allow us to engage constructively over the coming two and a half years.

It is vital that the new protocol, if adopted, include and respect three basic principles:

  1. Enforcement of jurisdiction by a State or State agency on the territory of another State cannot happen without the knowledge and agreement of the targeted State.
  2. State-parties must comply with human rights principles and requirements, including under any powers granted or envisaged in or under the Cybercrime Convention and the proposed additional protocol.
  3. Unjustified forced data localisation should be banned. Data transfers between jurisdictions should not occur in the absence of clear data protection standards.

We remain open to work with other civil society organisations in integrating these principles.

Background information:

Electronic evidence (“e-evidence”) refers to digital or electronic evidence, such as the contents of social media, emails, messaging services or data held in the “cloud”. Access to such data is often required in criminal investigations. Since geographical borders are often blurred in the digital environment, investigations require cross-border cooperation between public authorities, and between public authorities and the private sector.

The new optional protocol aims to address three areas of activity:

  1. the direct gathering of electronic evidence online by law enforcement agencies in one State, from ICT infrastructure and devices in another State;
  2. closer cooperation between designated bodies in different states in relation to cross-border investigations and transnational collecting of evidence;
  3. the direct requesting and obtaining of possibly highly sensitive personal information by law enforcement agencies in one State from private sector companies in another State, without the knowledge or consent of the latter country, bypassing its laws and potentially violating its sovereignty.

Read more:

New legal tool on electronic evidence: Council of Europe welcomes civil society opinion (18.09.2017)
https://www.coe.int/en/web/human-rights-rule-of-law/-/new-legal-tool-on-electronic-evidence-council-of-europe-welcomes-civil-society-opinion

Global Civil Society Submission to the Council of Europe: Comments and suggestions on the Terms of Reference for drafting a Second Optional Protocol to the Cybercrime Convention (08.09.2017)
https://edri.org/files/surveillance/cybercrime_2ndprotocol_globalsubmission_e-evidence_20170908.pdf

Access to e-evidence: Inevitable sacrifice of our right to privacy? (14.06.2017)
https://edri.org/access-to-e-evidence-inevitable-sacrifice-of-our-right-to-privacy/

EDRi position paper on the Cybercrime Convention – cross-border access to electronic evidence (17.01.2017)
https://edri.org/files/surveillance/cybercrime_accesstoevidence_positionpaper_20170117.pdf

EDRi letter to the Council of Europe on the report of the T-CY Cloud Evidence Group (2016)5 (10.11.2016)
https://edri.org/files/surveillance/letter_coe_t-cy_accesstoe-evidence_cloud_20161110.pdf

Professor Douwe Korff’s comments on the T-CY report (2016)5 (09.11.2016)
https://edri.org/files/surveillance/korff_note_coereport_leaaccesstocloud%20data_final.pdf

13 Sep 2017

Five things the online tracking industry gets wrong

By Diego Naranjo

The Interactive Advertising Bureau (IAB) Europe, one of the loudest enemies of the e-Privacy Regulation, is the association of online tracking and advertising companies. On 7 September, IAB Europe published a report titled “Europe Online: An experience driven by advertising”.

In the report, some of the key issues are clearly displayed, while others are hidden behind large, misleading headlines and graphics. The IAB Europe report says:

1) “In the online world most users’ experience is predominantly free.”

The report conveys the message that online users are using services without paying for them in cash. This is true in many cases. However, it cleverly creates a false dichotomy, suggesting that the only alternative to massive, untransparent profiling and tracking is unspecified costs for users.

In reality, users are unknowingly “paying” with their data, without any clarity about the financial value or security cost of handing over their data nor, indeed, the actual cost of providing the “free” services. In the online world, companies offering “free” services live off insights into how to manipulate their users. Often the “free” websites have no idea about (nor control over) where their visitors’ data goes, what other data it is merged with, and what uses that data is put to.

To provide the best services for their actual customers (the companies paying to place advertisements or cookies), advertisers sometimes get access to the content of your emails, track your physical movements, analyse your browsing habits, or listen to the interactions of your children with their toys.

Even though the way online tracking happens is not immediately obvious, the results of the Eurobarometer on e-Privacy show clearly what matters to people: 92% of EU citizens said that it is very important that the personal information (such as their pictures, contact lists, etc.) on their computer, smartphone, tablet or any other device is only accessed with their permission. The same percentage highlighted the importance of protecting their online communications (e-mails and online instant messaging).

2) “Nine in ten online users (92%) would stop accessing their most-used free news, content or service site or app if it switched to paid access only.”

Here again, a false dichotomy was presented to users, to generate the response the IAB wanted. The approach misleads readers by implying that no innovation is possible and that no solutions other than the status quo exist. However, it is not true that different business models cannot be created – we do not have to rely on a model that has created a quasi-duopoly for Google and Facebook. For example, there are successful micropayment models for quality news sources. Also, innovation around contextual advertising is increasingly successful at achieving its goals without engaging in invasive profiling and tracking of individuals. Such innovation has the capacity to create a level playing field, as an alternative to the current duopoly’s stranglehold on the online advertising market.

The statement closes the door to alternative ways of payment. Furthermore, it ignores the fact that a majority of EU citizens think it is “unacceptable to have their online activities monitored in exchange for unrestricted access to a certain website (64%) or to pay in order not to be monitored when using a website (74%)”, as shown by the Eurobarometer.

3) “Most users are either positive or neutral about online advertising.”

Another misrepresentation. Online advertising is online advertising. Advertising based on tracking and profiling is advertising based on tracking and profiling. Asking about one and suggesting that the answer is about the other is blatantly misleading. This is demonstrated when the report admits that 58% of users are not happy with their browsing data being shared as the basis for advertising. Later on in its “research”, the IAB admits that 80% would not like to see their data shared with third parties for advertising purposes.

The use of ad-blockers increased by up to 30% in 2016, and 11% of internet users worldwide now use one. And yet the online advertising industry still refuses to acknowledge that innovation is even possible.

4) “Four in ten users (42%) are happy with their browsing data being shared as the basis for advertising, stating they don’t mind seeing personalised advertising based on their browsing data in exchange for free news, content or services.”

This suggests that 58% of online users do not feel comfortable with their browsing being analysed in this way.

The Eurobarometer report on the e-Privacy Regulation says that six in ten respondents (60%) have already changed the privacy settings on their internet browser, for example, to delete browsing history or cookies. It also shows that 40% of respondents avoid certain websites because they are worried their online activities are monitored, and that 71% of them say it is unacceptable for companies to share information about them without their permission, even if it helps companies provide new services they may like.

5) “Continually approving the use of cookies as a precondition for accessing a site was the least popular and most divisive of the two options.”

Yet another false dichotomy: because something has been done badly, supposedly the only option is not to do it at all. The way the e-Privacy Directive was implemented led to the “cookie” pop-up notices that users often see. These cookie notices are sometimes intrusive, almost always demonstrably factually incorrect, and therefore inefficient. However, there is no reason to believe that there is no other – more efficient and informative – way to protect citizens’ privacy.

The study conducted for the IAB report gave respondents two options: that every app asks every time for consent for the use of their data, or that the apps only show how their data is being used, without asking for their consent. Obviously, most of the respondents chose the lesser of two evils. In reality, users want services to work differently: According to Eurobarometer, eight in ten (82%) said that it is important that tools for monitoring their activities online (such as cookies) can only be used with their permission, and 56% stated that this is very important to them.

The businesses that listen to consumers and hear their concerns about current tracking-based models will have an advantage. They will understand the importance of earning the trust of their clients – an essential element of running a successful business – and develop less privacy-intrusive business models. They will, that is, as long as untransparent, trust-eroding practices are restricted by law – and this is exactly what the IAB “research” is designed to prevent.

Europe Online: An experience driven by advertising
https://www.iabeurope.eu/wp-content/uploads/2017/09/EuropeOnline_FINAL.pdf

e-Privacy Directive: Frequently Asked Questions (05.10.2016)
https://edri.org/epd-faq/

e-Privacy revision: Document pool (10.01.2017)
https://edri.org/eprivacy-directive-document-pool/

Your privacy, security and freedom online are in danger (14.09.2016)
https://edri.org/privacy-security-freedom/

06 Sep 2017

Controversial testing of facial recognition software in Germany

By Anne-Morgane Devriendt

At the end of August 2017, the German police were testing facial recognition software at the Südkreuz train station in Berlin. The system was tested on 300 volunteers. The goal was to evaluate the accuracy of the software in recognising the volunteers and distinguishing them from the crowd – a capability that the police hope to ultimately use to track and arrest crime and terrorism suspects.


However, this test has attracted criticism. It raises two concerns: the terms of the experiment itself, and the relevance of such a measure in the fight against terrorism.

In the aftermath of recent terrorist attacks, mass surveillance measures have increasingly been introduced in Europe as a means to “fight against terrorism”. These measures might give citizens the impression that the government is taking action, but there is no evidence that they are effective in achieving this goal.

By using facial recognition software, Thomas de Maizière, the German Minister of the Interior, aims to strengthen the public’s sense of security and to help the fight against terrorism. He considers that it does not undermine civil liberties, but lawyers and civil society organisations disagree, first and foremost regarding the terms of the experiment. The facial recognition software was tested on volunteers who carried Bluetooth sensors transmitting information about their location. German EDRi member Digitalcourage reported that these sensors provide information that is not needed for the results of the experiment, and that this had not been communicated to the volunteers. Furthermore, Digitalcourage states that this data is easily accessible to anyone.

Beyond the technical issues and the lack of informed consent, lawyers have denounced the scheme as unconstitutional and uncalled for, because it costs more in terms of civil rights than it can contribute to the fight against terrorism. The usefulness of mass surveillance in improving security is questionable, to say the least. The fact that those involved in recent terrorist attacks were known to the intelligence services and had previously been under surveillance did not prevent the attacks. It would require immense resources to constantly follow all potential suspects. It is difficult to see how introducing tools such as facial recognition in public places to widen the scope of surveillance, and thus increasing the amount of data to be processed by law enforcement, could help prevent future terrorist attacks.

Facial recognition at the Südkreuz station: Federal police did not inform correctly – We request the end of the experiment
https://digitalcourage.de/blog/2017/gesichtsscan-beenden

Berlin starts controversial test of facial recognition cameras at train station (02.08.2017)
https://www.thelocal.de/20170802/berlin-launches-controversial-test-of-facial-recognition-cameras-at-train-station

German police test facial recognition cameras at Berlin station (01.08.2017)
https://www.reuters.com/article/us-germany-security/german-police-test-facial-recognition-cameras-at-berlin-station-idUSKBN1AH4VR

Opinion: Facial recognition tech makes suspects of us all (31.08.2017)
http://gearsofbiz.com/opinion-facial-recognition-tech-makes-suspects-of-us-all/37827

Germany’s facial recognition pilot program divides public (24.08.2017)
http://www.dw.com/en/germanys-facial-recognition-pilot-program-divides-public/a-40228816

Facial recognition software to catch terrorists being tested at Berlin station (02.08.2017)
http://www.telegraph.co.uk/news/2017/08/02/facial-recognition-software-catch-terrorists-tested-berlin-station/

Facial recognition cameras at Berlin station are tricking volunteers, activists claim (23.08.2017)
https://www.thelocal.de/20170823/berlins-facial-recognition-cameras-criticized-for-collecting-more-data-than-necessary

(Contribution by Anne-Morgane Devriendt, EDRi intern)

06 Sep 2017

Netherlands: Sharing of travel data violated students’ privacy

By Bits of Freedom

It was all over the news on 22 August 2017: Translink, the company responsible for the Dutch public transport card “OV-chipkaart”, had been passing students’ travel data to the Education Executive Agency (DUO), the body responsible for student finance in the Netherlands. DUO uses this data to figure out whether students who claim to live on their own – and therefore receive a supplementary grant – actually still live with their parents. A court ruled that this violated the students’ privacy. The same day, Dutch EDRi member Bits of Freedom called upon students to issue right of access requests to DUO and Translink. The students were encouraged to ask the following questions:

  1. Which data does DUO have on me and if I didn’t supply this data myself, how did DUO obtain it?
  2. Which data does Translink have on me and with whom has this data been shared?

Where and when we travel, whom we call, what we buy: sometimes it seems records are kept of every single thing we do. We are becoming ever more transparent to, and easier to influence by, companies and governments. Based on the data that is gathered about us, conclusions are drawn with tangible, sometimes far-reaching consequences. It is therefore important that we gain insight into who knows what about us – and, of course, what is being done with that information.


Imagine: you live in a dorm room when one of your parents becomes seriously ill. You are at your parents’ home for weeks or even months on end. You don’t actually live there, but you do sleep over. Is it really possible for a DUO employee to make that distinction based on your public transport data? We don’t think so. You can interpret data in multiple ways and often it does not tell the whole story. Conclusions that someone else reaches by looking at your data are not always correct. But still, you are the one who has to deal with the consequences.

It is indeed important that fraud is addressed. However, it is also important that the tools used to do so are proportionate to the offence. In this case, the Dutch court ruled that DUO cannot request this kind of privacy-sensitive information just like that. And even Translink really does know better: in its terms and conditions, Translink states that it will only hand over data as part of a criminal investigation and therefore only to the police and judiciary. By deviating from its own commitment, the company undermines trust in its service.

The Dutch constitution states that everyone is entitled to respect for their personal environment. The Dutch Data Protection Act (Wbp) is the most important law regarding the collection and sharing of personal data. This law also gives citizens the right to gain insight into their own data and the right to correct it. By exercising these rights, you can verify whether the processing of your personal data is correct, complete, relevant and lawful. Bits of Freedom’s Privacy Review Machine can help you with this.

DUO and the OV-chipkaart: Ask for clarification about your data! (only in Dutch, 22.08.2017)
https://www.bof.nl/2017/08/22/duo-en-de-ov-chipkaart-vraag-om-opheldering-over-jouw-gegevens/

Privacy Review Machine (only in Dutch)
https://pim.bof.nl/

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands; Translation: Philip Westbroek)
