A measure which would be illegal if implemented by a government should also be illegal if implemented by industry as a “voluntary” measure, whether as a result of government pressure, for public relations, or for anti-competitive reasons. However, while key international legal instruments, such as the Charter of Fundamental Rights of the European Union and the European Convention on Human Rights, as well as national constitutions, are binding on states and governments, they are not directly applicable to other entities, such as private companies. As a result, there is a major trend towards governments persuading or coercing companies to impose restrictions on fundamental freedoms under the guise of “self-regulation”, thereby circumventing legal protections.

17 Oct 2017

Privacy Camp 2018: Speech, settings and [in]security by design

By Kirsten Fiedler

Join us for the 6th annual Privacy Camp! Privacy Camp will take place on 23 January 2018 in Brussels, Belgium, just before the start of the CPDP conference. Privacy Camp brings together civil society, policy-makers and academia to discuss existing and looming problems for human rights in the digital environment. In the face of a “shrinking civic space” for collective action, the event aims to provide a platform for actors from across these domains to discuss and develop shared principles to address key challenges for digital rights and freedoms of individuals.

Our theme this year is “speech, settings and [in]security by design”. The two main tracks of the event therefore will focus on the one hand on the security of devices and infrastructure, and on the other hand on areas and policies where legitimate speech is endangered. Participate!

The first track will include sessions on state hacking and malware, law enforcement access to user data (so-called “e-evidence”), and device security, with hands-on tutorials on how to better protect your communications.

The second track will include sessions on algorithmic decision-making and discrimination via big data, privacy-invasive measures to censor legitimate speech online, and the hacking of democracies via the spread of misinformation and propaganda.

The event is co-organised by European Digital Rights (EDRi), Privacy Salon, USL-B Institute for European Studies and VUB-LSTS. Participation is free. Registrations will open in early December.


We invite you to propose a panel for one of these two tracks:

Track 01 [in]security of devices
Topics: #statehacking #encryption #surveillance #statemalware #Eevidence #security #technopolitics

Track 02 [in]security of speech
Topics: #uploadfilters #censorship #algorithms #discrimination #accountability #hackingelections #misinformation #propaganda

When submitting your proposal:

  • Indicate a clear objective for your session: What would be a good outcome for you?
  • Indicate other speakers that could participate in your panel (and let us know which speakers have already confirmed, at least in principle, that they will participate).
  • Make it as participative as possible: think about how to include the audience and diverse actors as much as possible.
  • Send us a description of no more than 500 words.

After the deadline, we will review your submission and let you know by 6 December whether your panel can be included in the programme. We may suggest merging proposals that are very similar.

Please send your proposal via email to Maren <edri.intern3(at)edri(dot)org>!

If you have questions, please contact Kirsten <kirsten.fiedler(at)edri(dot)org> or Imge <imge.ozcan(at)vub(dot)be>.

16 Oct 2017

EU’s plans on encryption: What is needed?

By Maryant Fernández Pérez

On 18 October 2017, the European Commission is expected to publish a Communication on counter-terrorism, which will include some lines on encryption.

What is encryption? Why is it important?

When we send an encrypted message (or store an encrypted document), only the intended recipient can read it, using a unique key. So even if someone manages to intercept the message on its way to the recipient, they will not be able to read its contents without the key – they will only see something that looks like a random set of characters. Encryption ensures the confidentiality of our personal data and company secrets. This is not only essential for our democratic rights and freedoms, but it also promotes trust in our digital infrastructure and communications, which is vital for innovation and economic growth. For example, encryption is essential for securing online banking transactions and protecting the confidentiality of journalists’ sources.
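The principle can be sketched with a deliberately simplified symmetric cipher in Python. This toy XOR example is purely illustrative (real systems use vetted algorithms such as AES, never anything like this): it shows why an intercepted ciphertext is useless without the key, while the key holder recovers the message exactly.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key; applying the same
    # operation a second time with the same key restores the original.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)  # the shared secret, known only to sender and recipient
message = b"Meet at the station at noon"
ciphertext = xor_cipher(message, key)

# An interceptor without the key sees only opaque bytes...
assert ciphertext != message
# ...while the intended recipient recovers the message with the key.
assert xor_cipher(ciphertext, key) == message
```

The ciphertext bears no readable relation to the message; only re-applying the same key reverses the transformation. Real encryption schemes add much more (randomisation, authentication, key exchange), but the asymmetry between "has the key" and "does not" is the same.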

Encryption workarounds needed?

The European Commission has come under pressure from some EU Member States to take action to address the perceived problem of data not being available to law enforcement authorities due to encryption. The issue is frequently hyped as a major problem, and certain politicians have suggested simplistic and counter-productive policies to weaken encryption as a “solution” to it.

There are several techniques that law enforcement authorities use to access encrypted data. One approach is to obtain the key to decrypt the data, for instance through a physical search when the key is saved on a USB drive. The key can also be obtained from the user directly, for example via social engineering or a legal obligation. Another approach is to bypass the key and access the decrypted data by exploiting a flaw or weakness in the system, or by installing software or spyware. However, the existence of workarounds does not mean that law enforcement should resort to them, nor that doing so would be necessary or proportionate, or even compatible with human rights law.

Support our work – make a recurrent donation!

From a technical point of view, encryption cannot be weakened “just a little”, without potentially introducing additional vulnerabilities, even if unintentionally. When there is a vulnerability, anyone can take advantage of it, not just the police or intelligence services. Sooner or later, a secret vulnerability will be exploited by a malicious user, perhaps the same one it was meant to be safeguarding us from. Law enforcement aims are legitimate. However, as pointed out by the European Union Agency for Network and Information Security (ENISA), limiting the use of encryption will create vulnerabilities, lower trust in the economy and damage civil society and industry alike.

What should the European Union do?

A more balanced approach is needed, which avoids much of the rhetoric that is often heard in relation to encryption. Such an approach would recognise a variety of options for addressing this issue without compromising everybody’s security or violating human rights.

Saying “no” to backdoors is a step in the right direction, but not the end of the debate, as there are still many ways to weaken encryption. The answer to security problems like those created by terrorism cannot be the creation of security risks. On the contrary, the EU should focus on stimulating the development and use of high-grade standards for encryption, and not in any way undermine the development, production or use of high-grade encryption.

We are concerned about certain aspects that may be included in the forthcoming Communication, such as an increase in Europol’s capabilities and what this may entail, and references to the removal of allegedly “terrorist” content without accountability, in line with the Commission’s recent Communication on tackling illegal content online. We remain vigilant regarding developments in the field of counter-terrorism.

Read more:

Encryption – debunking the myths (03.05.2017)

EDRi delivers paper on encryption workarounds and human rights (20.09.2017)

EDRi position paper on encryption (25.01.2016)

How the internet works (23.01.2012, available in six languages)


16 Oct 2017

Civil society calls for the deletion of the #censorshipmachine


Today, 16 October, European Digital Rights (EDRi), together with 56 other civil society organisations, sent an open letter to EU decision-makers. The letter calls for the deletion of Article 13 of the Copyright Directive proposal, pointing out that the monitoring and filtering of internet content it proposes would breach citizens’ fundamental rights.

The proposals in the Copyright Directive would relegate the European Union from a digital rights defender in global internet policy discussions to the leader in dismantling fundamental rights, to the detriment of internet users around the world,

said Joe McNamee, Executive Director of EDRi.

The censorship filter proposal would apply to all online platforms hosting any type of user-uploaded content such as YouTube, WordPress, Twitter, Facebook, Dropbox, Pinterest or Wikipedia. It would coerce platforms into installing filters that prevent users from uploading copyrighted materials. Such a filter would require the monitoring of all uploads and would be unable to differentiate between copyright infringements and legitimate uses of content authorised by law. It undermines legal certainty for European businesses, as it creates legal chaos and offers censorship filters as a solution.

The letter points out that the censorship filter proposal of Article 13:

  1. would violate the right to freedom of expression set out in the Charter of Fundamental Rights;
  2. would provoke such legal uncertainty that online services would have no other option than to monitor, filter and block EU citizens’ communications; and
  3. would impose obligations on internet companies that would be impossible to respect without excessive restrictions on citizens’ fundamental rights.

Read the letter below.

In September 2016, the European Commission published its proposal for a new Copyright Directive that aims at modernising EU copyright rules. The proposal has received mixed responses so far in the European Parliament and is awaiting a vote in the Parliament’s Legal Affairs Committee.


Article 13 – Monitoring and Filtering of Internet Content is Unacceptable
Open letter

Dear President Juncker,
Dear President Tajani,
Dear President Tusk,
Dear Prime Minister Ratas,
Dear Prime Minister Borissov,
Dear Ministers,
Dear MEP Voss,
Dear MEP Boni,

The undersigned stakeholders represent fundamental rights organisations.

Fundamental rights, justice and the rule of law are intrinsically linked and constitute core values on which the EU is founded. Any attempt to disregard these values undermines the mutual trust between member states required for the EU to function. Any such attempt would also undermine the commitments made by the European Union and national governments to their citizens.

Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens’ fundamental rights.

Article 13 introduces new obligations on internet service providers that share and store user-generated content, such as video or photo-sharing platforms or even creative writing websites, including obligations to filter uploads to their services. Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications if they are to have any chance of staying in business.

Article 13 contradicts existing rules and the case law of the Court of Justice. The Electronic Commerce Directive (2000/31/EC) regulates the liability of internet companies that host content on behalf of their users. Under the existing rules, there is an obligation to remove any content that breaches copyright rules once this has been notified to the provider.

Article 13 would force these companies to actively monitor their users’ content, which contradicts the ‘no general obligation to monitor’ rule in the Electronic Commerce Directive. The requirement to install a system for filtering electronic communications has twice been rejected by the Court of Justice, in the cases Scarlet Extended (C-70/10) and Netlog/Sabam (C-360/10). Therefore, a legislative provision that requires internet companies to install a filtering system would almost certainly be rejected by the Court of Justice, because it would contravene the requirement that a fair balance be struck between the right to intellectual property on the one hand, and the freedom to conduct business and the right to freedom of expression, such as to receive or impart information, on the other.

In particular, the requirement to filter content in this way would violate the freedom of expression set out in Article 11 of the Charter of Fundamental Rights. If internet companies are required to apply filtering mechanisms in order to avoid possible liability, they will. This will lead to excessive filtering and deletion of content and limit the freedom to impart information on the one hand, and the freedom to receive information on the other.

If EU legislation conflicts with the Charter of Fundamental Rights, national constitutional courts are likely to be tempted to disapply it and we can expect such a rule to be annulled by the Court of Justice. This is what happened with the Data Retention Directive (2006/24/EC), when EU legislators ignored compatibility problems with the Charter of Fundamental Rights. In 2014, the Court of Justice declared the Data Retention Directive invalid because it violated the Charter.

Taking into consideration these arguments, we ask the relevant policy-makers to delete Article 13.


Civil Liberties Union for Europe (Liberties)
European Digital Rights (EDRi)

Access Info
Article 19
Associação D3 – Defesa dos Direitos Digitais
Associação Nacional para o Software Livre (ANSOL)
Association for Progressive Communications (APC)
Association for Technology and Internet (ApTI)
Association of the Defence of Human Rights in Romania (APADOR)
Associazione Antigone
Bangladesh NGOs Network for Radio and Communication (BNNRC)
Bits of Freedom (BoF)
BlueLink Foundation
Bulgarian Helsinki Committee
Center for Democracy & Technology (CDT)
Centre for Peace Studies
Centrum Cyfrowe
Coalizione Italiana Libertà e Diritti Civili (CILD)
Code for Croatia
Culture Action Europe
Electronic Frontier Foundation (EFF)
Estonian Human Rights Centre
Freedom of the Press Foundation
Frënn vun der Ënn
Helsinki Foundation for Human Rights
Hermes Center for Transparency and Digital Human Rights
Human Rights Monitoring Institute
Human Rights Watch
Human Rights Without Frontiers
Hungarian Civil Liberties Union
Index on Censorship
International Partnership for Human Rights (IPHR)
International Service for Human Rights (ISHR)
JUMEN – Human Rights Work in Germany
Justice & Peace
La Quadrature du Net
Media Development Centre
Miklos Haraszti (Former OSCE Media Representative)
Modern Poland Foundation
Netherlands Helsinki Committee
One World Platform
Open Observatory of Network Interference (OONI)
Open Rights Group (ORG)
Plataforma en Defensa de la Libertad de Información (PDLI)
Reporters without Borders (RSF)
Rights International Spain
South East Europe Media Organisation (SEEMO)
South East European Network for Professionalization of Media (SEENPM)
The Right to Know Coalition of Nova Scotia (RTKNS)

CC: Permanent and Deputy Permanent Representatives of the Members States to the EU
CC: Chairs of the JURI and LIBE Committees in the European Parliament
CC: Shadow Rapporteurs and MEPs in the JURI and LIBE Committees in the European Parliament
CC: Secretariats of the JURI and LIBE Committees in the European Parliament
CC: Secretariat of the Council Working Party on Intellectual Property (Copyright)
CC: Secretariat of the Council Working Party on Competition
CC: Secretariat of the Council Research Working Party

Read more:

Article 13 Open letter – Monitoring and Filtering of Internet Content is Unacceptable (16.10.2017)

Over 50 Human Rights & Media Freedom NGOs ask EU to Delete Censorship Filter & to Stop Copyright Madness (16.10.2017)

Deconstructing the Article 13 of the Copyright proposal of the European Commission, revision 2

The Copyright Reform: a guide for the perplexed

Copyright reform: Document pool

Six states raise concerns about legality of Copyright Directive (05.09.2017)

Proposed Copyright Directive – Commissioner confirms it is illegal (28.06.2017)


13 Oct 2017

Europe’s governments win the Big Brother Awards 2017 for opening the Pandora’s box of surveillance


On Friday 13 October, the annual Belgian Big Brother Awards – a negative prize for the worst privacy abuser of the year – took place in Brussels. The jury awarded the title of ultimate privacy villain to European Digital Rights’ (EDRi) nomination: the European trend of state hacking. The public voted automatic number-plate recognition (ANPR) cameras as their “favourite”.

State hacking has rapidly become a very powerful tool for intelligence services in recent years. Europe’s governments have been expanding states’ possibilities to spy on their own citizens and have pushed the fashion of “insecurity by design”. In 2017, the Belgian government followed this trend and adopted legislation that gives law enforcement authorities permission to access computers and other devices remotely. It adopted a text modifying the law on “security and intelligence services”, granting the authorities broad new surveillance powers. To put it simply, it legalised the most intrusive form of government hacking.

“Government hacking affects people’s privacy rights and freedom of expression in new and deeply invasive ways – it also means an undermining of the security of the internet. Governments engaged in such activities have systematically failed to implement minimum safeguards for human rights”, said Kirsten Fiedler, Managing Director of EDRi.


“The WannaCry attack has highlighted that there are serious repercussions when known vulnerabilities are not immediately reported and fixed. Current practices of the intelligence services are damaging not only for the security of European citizens, but also for businesses, public administrations and critical infrastructures – like hospitals, schools and public transport”, she added.

What is state or government hacking?

Hacking means the manipulation of software, data, computer systems, networks, or other electronic devices without the permission of the person or organisation responsible – for instance, through malicious software developed by a government, often relying on software flaws that are not publicly known. Because such flaws are kept secret, they remain open and available for criminals to exploit. Governments hack devices with the aim of monitoring computer activities and gaining access to sensitive data.

In 2014, it was revealed that the British intelligence service, the Government Communications Headquarters (GCHQ), had hacking capacities to activate a device’s microphone or webcam, to identify the location of a device, to collect login details and passwords for websites and record internet browsing histories on a device. The German intelligence service developed similar software, which was discovered in 2011 by EDRi-member Chaos Computer Club (CCC). Now, in March 2017, the Belgian government has given its services the power to remotely access its citizens’ devices and install malware (see Art. 38, Law modifying the law from 30 November 1998 governing the intelligence and security services).

Why is government hacking a problem?

Giving intelligence services such powers makes it difficult for individuals to protect their personal data and for companies to protect their trade secrets from these kinds of attacks. Moreover, it allows foreign intelligence services to spy on state secrets more easily, and it opens a Pandora’s box for third parties to access and control critical infrastructures – which could, for example, plunge hospitals into chaos. It also gives governments an incentive not to report software vulnerabilities they are aware of, facilitating crime in the name of fighting crime.

EDRi’s paper “Encryption Workarounds – A digital rights perspective” (pages 9-11) includes proposals for safeguards that need to be met to provide adequate protection of fundamental rights in cases of government hacking.

The Big Brother Awards are based on a concept created by EDRi member Privacy International. The goal is to draw attention to violations of privacy. The Belgian Big Brother Awards are organised by EDRi member Liga Voor Mensenrechten in collaboration with PROGRESS Lawyers Network (PLN), La Ligue des droits de l’Homme (LDH) and European Digital Rights (EDRi).

Belgian Big Brother Awards 2017

Encryption Workarounds – A digital rights perspective

Big Brother Awards Belgium: Facebook is the privacy villain of the year (06.10.2016)


05 Oct 2017

Dear MEPs: We need you to protect our privacy online!


They’re hip, they’re slick and they follow you everywhere. They know you like new shoes, playing tennis and tweeting at odd hours of the morning. Do you know what that says about your health, your relationships and your spending power? No? Well, the online companies do. They follow you everywhere you go online, they have a perfect memory, they know the sites you visited last year even if you’ve forgotten… Look who’s stalking.

European legislation protecting your personal data was updated in 2016, but the battle to keep it safe is not over yet. The European Union is revising its e-Privacy rules. We welcomed the European Commission (EC) proposal as a good starting point, but with room for improvement. The online tracking industry is lobbying fiercely against it. Online tracking and profiling gave us filter bubbles and echo chambers. Yet the lobbyists lobby for it under the pretext of “saving the internet”, “protecting quality journalism” – even “saving democracy”.

The European Parliament is currently debating its position on the EC proposal. Some Members of the European Parliament (MEPs) support “tracking business, as usual” while others support a strong future-proof norm to protect the privacy, innovation and security of future generations of EU citizens and businesses.

Priorities for defending privacy and security:

1) Protect confidentiality of our communications – both in transit and at rest!
Confidentiality of communications needs to be protected both in transit and when it is stored. Lobbyists have been campaigning for a technicality that would allow them to read and exploit your emails stored in the cloud. (Art. 5)

2) Protect our privacy: Do not add loopholes to security measures!
A “legitimate interest” exception was not included in any version of the previous e-Privacy Directives. Adding one now would be a major weakening of the legislation compared with the existing rules. Our member Bits of Freedom wrote about the problems with “legitimate interest” here. (several Articles and Recitals)

3) Do not let anyone use our data without asking for our consent!
It is crucial to keep consent as the legal ground to process communications data. Neither “legitimate interest” nor “further processing” should be allowed to weaken the security and privacy of European citizens and businesses. (Art. 6)

4) Privacy should not be an option – what we need is privacy by default!
Provisions about default privacy settings need to be strengthened and improved, certainly not watered down or deleted. e-Privacy must ensure “privacy by design and by default” and not, as in the EC proposal, “privacy by option”. You can find our specific proposals here. The European Parliament previously adopted a Directive that criminalises unauthorised access to computer systems. It would be completely incoherent if it were to adopt legislation that foresees default settings that do not protect against unauthorised access to devices. (Art. 10)

5) No new exceptions to undermine our privacy!
Exceptions for Member States cannot become a carte blanche rendering e-Privacy useless. Therefore, the safeguards established by the Court of Justice of the European Union on cases regarding the exceptions in the relevant sections of the e-Privacy Regulation should be diligently respected – the scope of the exception should not be expanded. (Art. 11)

6) Do not undermine encryption!
Imposing a ban on undermining or attacking encryption should be a priority.

7) Protect our devices (hardware+software) by design and by default!
Hardware and software security need to be protected by design and by default.

MEPs, protect our #ePrivacy – Support amendments that follow the principles listed above!

e-Privacy revision: Document pool

e-Privacy: Consent (pdf)

e-Privacy: Legitimate interest (pdf)

e-Privacy: Privacy by design and by default (pdf)

e-Privacy: Offline tracking (pdf)

Your privacy, security and freedom online are in danger (14.09.2016)

Five things the online tracking industry gets wrong (13.09.2017)

ePrivacy Regulation: Call a representative and make your voice heard!

Who’s afraid of… e-Privacy? (04.10.2017)


04 Oct 2017

ENDitorial: Tinder and me: My life, my business

By Maryant Fernández Pérez

Tinder is one of the many online dating companies of the Match Group. Launched in 2012, Tinder became profitable as of 2015, largely thanks to people’s personal data. On 3 March 2017, journalist Judith Duportail asked Tinder to send her all the personal data it had collected about her, including her “desirability score”, which is composed of the “swipe-left-swipe-right” ratio and many other pieces of data and mathematical formulae that Tinder does not disclose. Thanks to her determination and the support of lawyer Ravi Naik, privacy expert Paul-Olivier Dehaye and the work of Norwegian consumer advocates, Judith reported on 27 September 2017 that she had received 800 pages about her online dating-related behaviour.


Tinder did not, however, disclose how desirable the company considered Duportail to be, even though it had disclosed such information to another journalist. The 800 pages contained information such as her Facebook “likes”, her Instagram pictures (even though she had deleted her account), her education, how many times she had connected to Tinder, when and where she entered into online conversations, and much more. “I was amazed by how much information I was voluntarily disclosing”, Duportail stated.

800 pages of personal data – surprising?

As a Tinder user, you should know that you “agree” to Tinder’s terms of use, privacy policy and safety tips, as well as other terms disclosed if you purchase “additional features, products or services”. These include the following:

  • “You understand and agree that we may monitor or review any Content you post as part of a Service.”
  • “If you chat with other Tinder users, you provide us the content of your chats.”
  • “We do not promise, and you should not expect, that your personal information, chats, or other communications will always remain secure.”
  • “By creating an account, you grant to Tinder a worldwide, transferable, sub-licensable, royalty-free, right and license to host, store, use, copy, display, reproduce, adapt, edit, publish, modify and distribute information you authorize us to access from Facebook, as well as any information you post, upload, display or otherwise make available (collectively, ‘post’) on the Service or transmit to other users (collectively, ‘Content’).”
  • “You agree that we, our affiliates, and our third-party partners may place advertising on the Services.”
  • “If you’re using our app, we use mobile device IDs (the unique identifier assigned to a device by the manufacturer), or Advertising IDs (for iOS 6 and later), instead of cookies, to recognize you. We do this to store your preferences and track your use of our app. Unlike cookies, device IDs cannot be deleted, but Advertising IDs can be reset in “Settings” on your iPhone.”
  • “We do not recognize or respond to any [Do Not Track] signals, as the Internet industry works toward defining exactly what DNT means, what it means to comply with DNT, and a common approach to responding to DNT.”
  • “You can choose not to provide us with certain information, but that may result in you being unable to use certain features of our Service.”

Tinder explains in its Privacy Policy – but not in the summarised version of the terms – that you have a right to access and correct your personal data. What is clear to the company is that you “voluntarily” provided your information (and that of others). Duportail received part of the information Tinder and its business partners hold, no doubt partly because she is a journalist. Her non-journalist friends have not experienced the same benevolence. Your personal data has an effect not only on your online dates, “but also what job offers you have access to on LinkedIn, how much you will pay for insuring your car, which ad you will see in the tube and if you can subscribe to a loan”, Paul-Olivier Dehaye highlights.

Worse still, even if you close your account or delete information, Tinder and its business partners do not necessarily delete it. Worst of all, you’ve “agreed” to it: “If you close your account, we will retain certain data for analytical purposes and recordkeeping integrity, as well as to prevent fraud, enforce our Terms of Use, take actions we deem necessary to protect the integrity of our Service or our users, or take other actions otherwise permitted by law. In addition, if certain information has already been provided to third parties as described in this Privacy Policy, retention of that information will be subject to those third parties’ policies.”

You should be in control

Civil society organisations fight these kinds of practices to defend your rights and freedoms. For instance, the Norwegian Consumer Council successfully pushed Tinder to change its terms of service. On 9 May 2017, EDRi and its member Access Now raised awareness about period trackers, dating apps like Tinder or Grindr, sex extortion via webcams and the “internet of (sex) things” at the re:publica 17 conference. Ultimately, examples like Duportail’s show the importance of having strong EU data protection and privacy rules. Under the General Data Protection Regulation, you have a right to access your personal data, and companies should provide privacy by default and by design in their services. Now, we are working on the e-Privacy Regulation to ensure you have real consent instead of a ticked box for something you never read, and to prevent companies from tracking you unless you provide express and specific consent, among many other things.

Now that you know about this or have been reminded of this, spread the word! It does not matter whether you are on Tinder or not. This is about your online future.

I asked Tinder for my data. It sent me 800 pages of my deepest, darkest secrets (26.09.2017)

Getting your data out of Tinder is really hard – but it shouldn’t be (27.09.2017)

Safer (digital) sex: pleasure is just a click away (09.05.2017)

Tinder bends for consumer pressure (30.03.2017)

(Contribution by Maryant Fernández Pérez, EDRi)



04 Oct 2017

The privacy movement and dissent: Art

By Guest author

This is the third blogpost of a series, originally published by EDRi member Bits of Freedom, that explains how the activists of a Berlin-based privacy movement operate, organise, and express dissent. The series is inspired by a thesis by Loes Derks van de Ven, which describes the privacy movement as she encountered it from 2013 to 2015.*

----------------------------------------------------------------- Support our work - make a recurrent donation! -----------------------------------------------------------------

Although relatively few members of the privacy movement are involved in the actual process of creating art, art affects the movement as a whole. It reflects the movement’s beliefs and is used as a weapon of resistance against injustice.

The two art projects of the privacy movement introduced in this article are Panda to Panda and Anything to Say?. Both share a number of features characteristic of activist art in general. One of these features is the way activist art comes into being: the art activists create almost always stems from personal experiences and seeks to draw attention to and gain recognition for those experiences. In addition, it problematises authority, domination, and oppression, and seeks to alter the current situation. Moreover, activists want their work to evoke emotion and provoke intellectually, and they aim to form a community among those who share a similar aversion to oppression.

Panda to Panda (2015) is part of a larger project called Seven on Seven, initiated by Rhizome, the influential platform for new media art affiliated with the New Museum in New York City. Each year, Rhizome matches seven artists with seven technologists. In 2015, one of the pairs Rhizome invited to participate was Ai Weiwei and Jacob Appelbaum. The result of their collaboration, Panda to Panda, consists of twenty stuffed pandas whose stuffing has been replaced with shredded documents that Glenn Greenwald and Laura Poitras received from Edward Snowden. In addition, a micro SD card with the documents on it has been placed inside each panda. By distributing the pandas to as many places as possible, the pandas function as a “distributed backup” that is difficult to destroy, since that would mean destroying all twenty objects. The project was documented by Ai, who shared the images with his followers on social media. Laura Poitras was invited to film the process and eventually published the film in the online edition of The New York Times.

Panda to Panda is an example of ethico-political subversion, in which authority is undermined in a number of ways. First, the project in its totality is a complaint against government surveillance and state power. As Ai, Appelbaum, and Poitras were working on the project, they continuously filmed each other. With the constant filming they emphasise and visualise the surveillance they are under: while they film each other, they are also watched by the surveillance cameras placed in front of Ai’s studio by the Chinese authorities. There is a constant awareness of being watched.

Second, the pandas also have a symbolic meaning. From Appelbaum’s frame of reference, Panda to Panda is a variation on peer-to-peer communication, a means of communication in which there is no hierarchy and that allows all peers to interact in an equal way. This system is seen as a philosophy of egalitarian human interaction on the internet. This reference also materialises the goals of the movement. From Ai’s frame of reference, the pandas satirically reference popular culture: in China, the secret police, the “government spies” that also monitor Ai, are often referred to as pandas.

Anything to Say? A Monument of Courage (2015) is a life-size bronze sculpture by American author Charles Glass and Italian artist Davide Dormino. The sculpture portrays three people: Julian Assange, Edward Snowden, and Bradley Manning (who is now Chelsea Manning). The three each stand on a chair; a fourth chair is left empty. This fourth chair is meant for other individuals to stand on, to enable them to stand with the whistleblowers and freely express themselves. Anything to Say? has its own Twitter account where followers can follow the realisation, unveiling, and journey of the sculpture. The sculpture has never been placed in a typical museum context: it was unveiled at Alexanderplatz in Berlin in 2015 and has been travelling since.

An analysis of Anything to Say? demonstrates a number of ways in which art functions to strengthen the privacy movement. Taking a stand and expressing your thoughts does not come naturally to everyone; it takes a certain amount of courage – as the sculpture’s subtitle A Monument of Courage indicates. By inviting individuals to stand on the fourth, empty chair, the sculpture encourages them to do the same as whistleblowers: to step out of their comfort zone and become visible. Young or old, rich or poor, German or not, part of the movement or not: the sculpture gives the audience a reason to connect. Furthermore, here as in the case of Panda to Panda, the sculpture carries out some of the beliefs of the privacy movement, informing individuals within as well as outside of the movement.

Anything to Say? not only highlights the importance of freedom of speech and freedom of information; it also comes from the personal experiences of whistleblowers and shows great respect for them. It encourages the audience to show the same courage as Assange, Snowden and Manning have shown, but the sculpture is also in itself a sign of gratitude towards them. Furthermore, the sculpture represents movement ideas and values, but by asking members of the audience to stand on the chair and express themselves, it actually practices free speech and thereby enacts one of the privacy movement’s aims.

Activist art is a valuable way for the privacy movement to express what it stands for. Although there is only a relatively small group of activists within the movement that actually creates art, it affects the entire movement; it encourages members within the movement, allows them to experience both their own and the group’s strength, and the personal character of the art reinforces the unity within the movement. In the next article of this series, protest as an expression of dissent of the privacy movement will be explored.

The series was originally published by EDRi member Bits of Freedom.

Dissent in the privacy movement: whistleblowing, art and protest (12.07.2017)

The privacy movement and dissent: Whistleblowing (23.08.2017)

(Contribution by Loes Derks van de Ven; Adaptation by Maren Schmid, EDRi intern)

* This research was finalised in 2015 and does not take into account the changes within the movement that have occurred since then.




04 Oct 2017

No justification for internet censorship during Catalan referendum

By Electronic Frontier Foundation

The ruthless efficiency with which the Spanish government censored the internet ahead of the referendum on Catalan independence foreshadowed the severity of its crackdown at polling places on 1 October. EDRi member Electronic Frontier Foundation previously wrote about one aspect of that censorship: the raid of the .cat top-level domain registry. But there was much more to it than that, and many of the more than 140 censored domains and internet services continue to be blocked today.

It began on 13 September with the seizure of the official referendum website’s domain by the Spanish military police, the Guardia Civil, pursuant to a warrant issued by the Supreme Court of Catalonia. Over the ensuing days, the order was extended to a number of official and unofficial mirrors of the website, which were seized if they were hosted at a .cat domain, and blocked by internet service providers (ISPs) if they were not. The fact that Spanish ISPs already blocked websites such as The Pirate Bay under court order enabled the blocking of additional websites to be rolled out swiftly.

One of these subsequent censorship orders, issued on 23 September, was especially notable in that it empowered the Guardia Civil to block not only a list of named websites, but also any future sites with content related to the referendum publicised on any social network by a member of the Catalan Government. This order accelerated the blocking of additional websites without any further court order. Those blocked apparently included the websites of non-partisan citizen collectives and other non-profit organisations, as well as campaign websites of legal political parties.

On 29 September, a separate court order was obtained requiring Google to remove a voting app from the Google Play app store. Similar to the 23 September order, it also required Google to remove any other apps developed by the same developer. Those who violated such orders by setting up mirrors, reverse proxies, or alternative domains for blocked content have been summoned to court and face criminal charges. One of these activists also had his GitHub and Google accounts seized.

Needless to say, even if the blocking of electoral information pursuant to these court orders was legitimate and proportionate (we don’t believe it was), it was inevitable that the orders would result in overblocking and the restriction of lawful content. An example of this was the blocking of the domain serving as the main web gateway for the InterPlanetary File System (IPFS), an experimental internet protocol for the distributed storage of information. Although some information on the 1 October referendum was hosted on this distributed filesystem, it was a tiny proportion of the information that was blocked. Closer to home, on the day of the referendum itself, the internet was shut down at polling places in an effort to prevent votes from being transmitted to returning officers.
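Part of why blocking a single gateway domain is so ineffective against IPFS is content addressing: a file is identified by a hash of its own bytes rather than by the server hosting it, so any node holding a copy can serve it under the same identifier. The following is a minimal illustrative sketch of that principle only; real IPFS content identifiers (CIDs) use a multihash encoding, not a bare SHA-256 digest.

```python
import hashlib

def content_address(data: bytes) -> str:
    """Derive a location-independent identifier from the content itself.
    (Illustrative only: real IPFS CIDs wrap the digest in a multihash.)"""
    return hashlib.sha256(data).hexdigest()

document = b"referendum information"

# Any node holding the same bytes derives the same address, so a censor
# must block every host serving the content, not one domain.
addr_on_node_a = content_address(document)
addr_on_node_b = content_address(document)
assert addr_on_node_a == addr_on_node_b

# Tampered content no longer matches its address, so mirrored copies can
# be verified without trusting the host that served them.
assert content_address(b"altered copy") != addr_on_node_a
```

This is also why mirrors of blocked sites were so easy to stand up: republishing identical content anywhere yields an equally valid copy.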


Throughout this unrest, a group of activists sharing the Twitter account @censura1oct has been verifying the blocks from multiple ISPs and sharing information about the technical measures used. All of the censorship measures put in place in the lead-up to the referendum appeared to remain in place at the time of writing, though we don’t know for how much longer. The Spanish government no doubt hopes that its repression of political speech in Catalonia will be forgotten if the censored sites come back online quickly. We need to ensure that that is not the case.
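Verification efforts like this typically work by comparing what different DNS resolvers return for the same domain: if an ISP’s resolver answers differently from an independent reference resolver, or redirects to a known block-page address, the domain is likely filtered at the DNS level. A simplified, hypothetical sketch of that comparison logic follows; the resolver labels and block-page IP are invented, and real measurements would of course come from live DNS queries rather than hard-coded data.

```python
def classify_dns_answers(answers: dict,
                         reference: str = "independent",
                         block_page_ips: frozenset = frozenset({"192.0.2.1"})) -> str:
    """Compare per-resolver DNS answers for one domain.

    `answers` maps a resolver label to the set of IPs it returned
    (an empty set meaning NXDOMAIN / no answer). Returns a rough
    verdict: 'blocked', 'suspect', or 'consistent'.
    """
    ref = answers.get(reference, set())
    for resolver, ips in answers.items():
        if resolver == reference:
            continue
        if ips & block_page_ips:
            return "blocked"    # redirected to a known block page
        if ips != ref:
            return "suspect"    # answers diverge from the reference
    return "consistent"

# Hypothetical measurements for one censored domain:
verdict = classify_dns_answers({
    "independent": {"203.0.113.7"},
    "isp_resolver": {"192.0.2.1"},  # block-page address
})
print(verdict)  # -> blocked
```

In practice, volunteers also record HTTP-level behaviour (TCP resets, TLS interception, redirects), since DNS comparison alone cannot distinguish every blocking technique.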

With an extremely narrow range of exceptions, government censorship of the internet is prohibited by Article 19 of the Universal Declaration of Human Rights and by Article 10 of the European Convention on Human Rights, both of which guarantee everyone’s right to receive and impart information and ideas regardless of frontiers. The Spanish government’s censorship of online speech during the Catalan referendum period is so wildly disproportionate and overbroad that its violation of these instruments seems almost beyond dispute.

This article was originally published on the website of EDRi member Electronic Frontier Foundation.

.cat Domain a Casualty in Catalonian Independence Crackdown (21.09.2017)

Spanish court orders Google to delete app used for Catalan independence vote (29.09.2017)

The Catalan High Court of Justice orders to block the websites about the 1-O that are advertised by the Catalan Government members through social media (only in Spanish, 23.09.2017)

The manager arrested in Madrid over the 1-O leaves the police cells to stand trial in Barcelona (only in Spanish, 20.09.2017)

The Ministry of Internal Affairs manages to block the application of the universal census throughout Catalonia (only in Spanish, 01.10.2017)

Freedom of expression: Limitations

(Contribution by Jeremy Malcolm, EDRi member Electronic Frontier Foundation)



04 Oct 2017

Tear down the tracking wall

By Bits of Freedom

It has become a daily routine: “consenting to” being tracked, on the basis of meaningless explanations (or no explanation at all) before you’re allowed access to a website or online service. It’s about time to set limits to this tracking rat race.


An ever-growing portion of our personal and professional communication, our news consumption, and our contact with government is mediated through the internet. Access to online information and services is crucial to participating in today’s society. Yet, on a daily basis we are forced to allow ourselves to be tracked – across multiple websites and apps, and across several devices – before we’re given access to information or digital services.

The infamous cookie walls you encounter when visiting websites are a prime example of this. If you want to get beyond that wall, you first have to consent to having your online behaviour minutely tracked. To be clear, we are not talking about the cookies that are necessary to, for example, store your settings or gather statistics on the use of your website in a privacy-friendly manner. We are talking about all those trackers that usually originate from multiple parties completely different from the website you intended to visit, and that continue to track your behaviour across the internet.
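The mechanics behind such cross-site tracking are straightforward: a tracker embedded on many sites sets one identifying cookie in your browser, and every page that embeds it then reports your visit under that same identifier. The following is a deliberately simplified toy model of the principle; no real tracker works exactly like this, and all names here are invented for illustration.

```python
import uuid
from collections import defaultdict

class Tracker:
    """Toy third-party tracker: one cookie, visible to the tracker on
    every site that embeds its script or tracking pixel."""
    def __init__(self):
        self.profiles = defaultdict(list)  # cookie id -> visited pages

    def on_page_load(self, browser_cookies: dict, site: str) -> None:
        # First visit anywhere: assign a persistent identifier.
        if "tracker_id" not in browser_cookies:
            browser_cookies["tracker_id"] = uuid.uuid4().hex
        # Every later visit, on any embedding site, extends the profile.
        self.profiles[browser_cookies["tracker_id"]].append(site)

tracker = Tracker()
my_cookies = {}  # one browser's cookie jar

# The same tracker is embedded on unrelated sites:
for site in ["news.example", "shop.example", "health.example"]:
    tracker.on_page_load(my_cookies, site)

# One identifier now links browsing across all three sites.
profile = tracker.profiles[my_cookies["tracker_id"]]
print(profile)  # -> ['news.example', 'shop.example', 'health.example']
```

The toy model also makes clear why consent banners on individual sites are a poor remedy: the profile lives with the third party, not with any one website you visit.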

Issues with tracking

Tracking raises many concerns. First of all, while we become ever more transparent to online tracking companies, many of the current practices, and the parties employing them, are highly opaque. We do not know how much of our online activity is registered, analysed and used, by how many different parties, or for what purposes, nor what inferences about our activities are generated.

Secondly, the information collected through trackers makes us susceptible to manipulation – indeed, that is the usual purpose. This can have serious consequences for the power (im)balances between citizens and consumers on the one hand and governments, corporations and other organisations that have access to this data on the other. Just think of the instrumental role tracking plays in micro-targeted political advertising, price discrimination or exploiting the cognitive biases and specific weaknesses of individual users.

Third, the data gathered through tracking is increasingly used for making decisions about us. For example, whether you have access to credit, and under what terms, may depend on such data. This often happens under the cloak of long, legalese-filled terms you consented to, which provide you no meaningful transparency. Even if you are aware that data about you is being used for making automated decisions, it is hard to challenge the accuracy of such decisions or the data they rely on.

An often heard response is that you are free to withhold your consent to being tracked. That is correct in theory, but much harder in the real world. In our daily lives it is often a choice between limited or no access at all, or subjecting yourself to opaque tracking. This is particularly problematic when the information or services you would like to access are provided by public institutions, health service providers or organisations that play an important role in society and that you therefore cannot simply avoid.

Think for instance of public institutions such as the Tax Administration, but also hospitals, health insurance companies, banks or internet access providers. By making access to their services conditional on your consent to being tracked, your consent becomes involuntary and essentially meaningless. This practice has to stop.

As a user you should be able to gather information and use services without being forced to consent to being tracked. And why shouldn’t we take it one step further and put an end to tracking walls for all the online information and services that we use?

What will the EU do?

At this very moment, European Union institutions are working on an overhaul of the specific privacy rules for electronic communications, the e-Privacy Regulation. Who is permitted to read your messages? Are tracking walls allowed? May your phone be used to map your physical location without your consent? These are some of the important questions the new rules address. They will have a substantial impact on all internet users across the EU.

This overhaul of the rules offers an excellent opportunity to tear down tracking walls for all of Europe. The EDRi Brussels office and EDRi members are not the only ones advocating for this: the data protection authorities in Europe also recommend putting an end to this practice. In October 2017, the European Parliament will vote on the new rules proposed by the European Commission and the hundreds of amendments that have been submitted by different Members of the European Parliament (MEPs). Will the rights of internet users be safeguarded, and will we get a digital environment free from opaque tracking practices?

This is a shortened version of an article originally published by EDRi member Bits of Freedom.

(Contribution by David Korteweg, EDRi member Bits of Freedom, the Netherlands; Adaptation by Maren Schmid, EDRi intern)



04 Oct 2017

TiSA impact assessment report ignores crucial human rights concerns

By Ana Ollo

In 2013, the European Commission decided to subject the draft Trade in Services Agreement (TiSA) to a Trade Sustainability Impact Assessment (SIA) in support of the negotiations. The Final Report, which was published in July 2017, fails to address several key fundamental rights concerns.

The report was conducted by the consultancy Ecorys and the Centre for Economic Policy Research (CEPR). The aim was to evaluate how TiSA’s provisions under negotiation could affect economic, social and human rights, as well as environmental issues, in the EU and in other TiSA parties and selected third countries.


The report went through various review processes among stakeholders, to which EDRi responded on three occasions. The draft that preceded the Final Report was published in May 2017. In June 2017, EDRi submitted comments regarding both the draft and its Annexes.

We welcome certain parts of the final report. It clearly says that there is a lack of evidence of meaningful barriers to e-commerce. In fact, it states that barriers to e-commerce identified by industry groups “are not necessarily the true barriers to e-commerce”. In addition, the report makes a distinction “between the true underlying barriers and the barriers that are reported” by industry, industry associations or individual stakeholders. It argues that “in the absence of robust evidence on policy impact and effectiveness […] it is tempting to rely on the input and suggestions of interest groups and stakeholders”, which leads to “the usual risk of being beholden to special interests or to be lost in a mosaic of different opinions, concerns and suggestions”.

Despite these important recognitions, the report still has at least three major problems:

First, the analysis overlooked several key human rights concerns. Freedom of expression and opinion was disregarded, despite its relevance in the context of TiSA, especially for potential provisions on intermediary liability and net neutrality proposed by some TiSA countries. To address these points, we suggested including an impact assessment of the lack of human rights commitments by TiSA parties.

Secondly, the report refers to data protection and privacy as “issues”, rather than fundamental rights that must be respected. Indeed, the failure to protect them constitutes a barrier to trade and not the opposite. In our comments, we pointed out that both the European Commission and the European Parliament have stated on several occasions that such rights cannot be subject to negotiations in trade agreements, and that this needs to be taken into account. Furthermore, we highlighted that the Final Report should not assess the data protection situation only from an EU perspective, as the different TiSA parties have a variety of commitments in this regard.

Thirdly, the report contradicts itself with regard to data flows. While it acknowledges the lack of evidence of meaningful barriers to e-commerce, it states in its human rights assessment that “the issue of data flows […] is particularly relevant”, without indicating what it may be relevant to. In the same vein, the report presents no evidence of the ostensible problems related to data flows, while also saying that “limitations to the free flow of data” are “a key concern for e-commerce”. Finally, it identifies the movement of people as the biggest trade barrier for computer services and telecommunications, but then states that “the core issue” is the free flow of data. The report warns about the risks of lacking robust evidence, yet on this matter that very problem clearly affected the assessment.

Despite all the concerns highlighted on several occasions, when the final Report was published, we learned that almost all of our suggestions and remarks had been disregarded. This is regrettable, as an independent academic study by the University of Amsterdam “Trade and Privacy: Complicated Bedfellows? How to achieve Data Protection-Proof Free Trade Agreements” (Irion, K., S. Yakovleva, and M. Bartl, 2016), that is even cited in the Final Report, shows that the EU has homework to do to bring trade agreements in line with EU law.

EDRi’s response to the Trade SIA consultation (02.06.2017)

EDRi’s input to the Draft Interim Technical Report “Trade SIA in support of negotiations on a plurilateral Trade in Services Agreement (TiSA)” (27.01.2017)

EDRi’s response to the Ecorys Survey on TiSA commissioned by the European Commission (15.03.2016)

EDRi’s position paper on TiSA (01.2016)

Documents regarding TiSA’s Trade Sustainability Impact Assessment since 2013

Trade Sustainability Impact Assessment – Final Report (07.2017)

Trade Sustainability Impact Assessment – Annexes to the Final Report (07.2017)

(Contribution by Ana Ollo, EDRi intern)