24 May 2018

Press Release: GDPR: A new philosophy of respect


The General Data Protection Regulation (GDPR) goes into effect tomorrow, on 25 May 2018, strengthening and harmonising individuals’ rights with regard to personal data. A much-celebrated success for privacy advocates, the GDPR is more than just a law.

GDPR is a new philosophy that promotes a culture of trust and security and that enables an environment of Respect-by-Default

said Joe McNamee, Executive Director of European Digital Rights.

The Directive adopted in 1995 was characterised by a tendency towards bureaucratic compliance with little enforcement. The GDPR represents a recalibration of focus, establishing a new balance between companies, people and data. The framework not only protects personal data, but also changes perceptions of it. On one hand, the GDPR protects individuals from companies and governments abusing their personal data and promotes privacy as a standard. On the other, it gives businesses the chance to develop processes with privacy-by-default in mind, ensuring in this way both individuals’ trust and legal compliance. The GDPR minimises the risk of some companies’ bad behaviour undermining trust in all actors.

The GDPR is capable of setting the highest regional standards for the protection of personal data; once well implemented, we need updated global rules

said Diego Naranjo, Senior Policy Advisor of European Digital Rights.

While not perfect, because no legislation is perfect, the GDPR is probably the best possible outcome in the current political context. We will now have to rely on each EU Member State’s Data Protection Authority (DPA) to do their jobs correctly and on governments to ensure enough resources have been allocated to allow this to happen.

To promote educational efforts around the GDPR, we have developed an online resource that helps everyone better understand their new rights and responsibilities: the “GDPR Explained” campaign, which will be launched shortly.

Read more:

The four year battle for the protection of your data (24.05.2018)

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)

24 May 2018

The four year battle for the protection of your data

By Bits of Freedom

In 2012, what would become a four-year process started: the creation of new European data protection rules. The General Data Protection Regulation would replace the existing European Data Protection Directive adopted in 1995 and enhance and harmonise data protection levels across Europe. The result is an influential piece of legislation that touches on the lives of 500 million people and creates the highest regional standard for data protection.

A lobbyist feeding frenzy

With so much at stake, civil society was preparing for strong push-back from companies. But we could never have dreamed just how dead set corporate lobbyists were on undermining citizens’ rights – or the lengths to which they would go to achieve their goals. Former European Commissioner Viviane Reding said it was the most aggressive lobbying campaign she had ever encountered. The European Parliament was flooded with the largest lobby offensive in its political history.

Civil society fights back

The European Digital Rights network worked together and continued to fight back. Among other things, we had to explain that data leaks are dangerous and need to be reported, and that it’s not acceptable to track and profile people without their consent. We were up against the combined resources of the largest multinational corporations and data-hungry governments, but we also had two things in our favour: the rapporteur Jan Philipp Albrecht and his team were adamant about safeguarding civil rights, and in 2013 the Snowden revelations made politicians more keen on doing the same. Against all odds, we prevailed!

GDPR isn’t perfect, but it is a way forward

The General Data Protection Regulation that was adopted in 2016, and that will be enforced from 25 May, is far from perfect. As we pointed out in 2015, we did, however, manage to save “the essential elements of data protection in Europe”, and we now have a tool with which to hold companies and governments that use your data to account. We are committed to doing just that. We will continue to fight for your privacy, speak out when and where it is necessary, and help you do the same.

EU Data Protection Package – Lacking ambition but saving the basics (17.12.2015)

EDRi GDPR document pool

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands)


16 May 2018

ENDitorial: Can design save us from content moderation?

By Bits of Freedom

Our communication platforms are polluted with racism, incitement to hate, terrorist propaganda and Twitter-bot armies. Some of that is due to how our platforms are designed. Content moderation and counter speech as “solutions” to this problem both fall short. Could smart design help mitigate some of our communication platforms’ more harmful effects? How would our platforms work if they were designed for engagement rather than attention?

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

The debate on how to deal with harmful content generally focuses on two arguments, or solutions if you will. The first is content moderation, the second is counter speech.

Content moderation

The relationship between governments and the big networking platforms is quite complex. Your opinion on how these platforms should operate, and on whether they should be regulated, depends among other things on whether you consider a platform like Facebook or YouTube to be the open internet, or a closed, privately owned space. The truth, of course, lies somewhere in the middle. These are private spaces that are used as public spaces, and they are becoming larger and more powerful by the minute.

Faced with this situation, and not wanting to be seen to be doing nothing to counter harmful content, governments are forcing platforms to take action – thereby skillfully avoiding taking responsibility themselves. The outsourcing of public responsibility to private parties – while being very vague about what that responsibility entails – is bad for approximately a gazillion reasons.

1. Platforms will over-censor: It encourages platforms to be (legally and politically) safe rather than sorry when it comes to the content users upload. That means they will expand the scope of their terms of service so that they can delete any content or any account for any reason. Without doubt this will occasionally help platforms take down content that shouldn’t be online, but it has also already led to a lot of content being removed that had every right to be online.

2. Multinationals will decide what’s right and wrong: Having your freedom of speech regulated by US multinationals means that nothing you say will be allowed to be heard unless it falls within the boundaries of US morality, or suits the business interests of US companies. Another possible outcome: if Facebook chooses to comply with countries’ individual demands – and why wouldn’t it? – the result could also be that only those pieces of information will be allowed that are acceptable to every one of those countries.

3. Privatised law enforcement will replace actual law enforcement: Putting companies in charge of taking down content is a form of privatised law enforcement that involves zero enforcement of the actual law. In doing so, you bypass the legal system: people who should actually be charged with an offence and appear before a court never do, while people who wrongfully have content removed have very few ways to object.

4. We’ll become more vulnerable: This normalises a situation where companies can regulate us in ways governments legally cannot, as companies aren’t bound (or hindered) by international law or national constitutions. Put gently, this solution is not proving to be an unabashed success.

Counter speech

The other proposal, counter speech, boils down to the belief that the solution to hate speech is free speech. We cannot have a functioning democracy without free speech, but this argument completely fails to acknowledge the underlying societal power imbalances, the systemic sexism and racism that inform our media, our software and our ideas.

As Bruce Schneier neatly put it: “[Technology] magnifies power in both directions. When the powerless found the Internet, suddenly they had power. But […] eventually the powerful behemoths woke up to the potential — and they have more power to magnify.” We have to acknowledge that as long as there are structural imbalances in society, not all voices are equal. And until they are, counter speech is never going to be a solution for hate speech. So this solution, too, will continue to fall short.

Design matters

This brings us to design. Just like the internet isn’t responsible for hate speech, we can’t and shouldn’t look to design to solve it. But design can help mitigate some of our communication platforms’ more harmful effects.

Here is a very recent example where we see this clearly. In 2015, Coraline Ada Ehmke, a coder, writer and activist, was approached by GitHub to join their Community & Safety team. This team was tasked with “making GitHub more safe for marginalized people and creating features for project owners to better manage their communities”.

Ehmke had experienced harassment on GitHub herself. A couple of years ago someone created a dozen repositories, and gave them all racist names. This person then added Coraline as a contributor to those repositories, so that when you viewed her user page, it would be strewn with racial slurs – the names of the repositories.

A few months after Ehmke started, she finished a feature called “repository invitations”: project invites. This basically means that you can’t add another user to a project you’re working on without their consent. The harassment she had suffered could no longer happen to anyone. Instead of filtering out the bullshit, or going through an annoying and probably ineffective process of having it removed, what Ehmke did was give the user control over her own space, creating a situation in which the bullshit never gets the chance to materialise.
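The consent mechanism described above can be sketched in a few lines. This is an illustrative model only, not GitHub’s actual implementation; all names are invented:

```python
# Toy sketch of a consent-based invitation flow, loosely inspired by the
# "repository invitations" feature described above. Illustrative only.

class Repository:
    def __init__(self, name, owner):
        self.name = name
        self.owner = owner
        self.contributors = {owner}  # shown publicly, e.g. on user pages
        self.pending = set()         # invited users, not yet visible anywhere

    def invite(self, user):
        # Instead of adding the user directly, record a pending invitation.
        self.pending.add(user)

    def accept(self, user):
        # Only the invited user's explicit consent makes them a contributor.
        if user in self.pending:
            self.pending.remove(user)
            self.contributors.add(user)

repo = Repository("offensive-name", owner="harasser")
repo.invite("target-user")
# Without acceptance, the target never appears as a contributor, so the
# repository name is never displayed on their profile.
assert "target-user" not in repo.contributors
```

The design choice is the point: the harassment vector disappears not because bad content is filtered afterwards, but because the state "added without consent" simply cannot exist.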

Taking “censorship”, renaming it “content moderation”, and subsequently putting a few billion-dollar companies in charge isn’t a great idea if we envision a future where we still enjoy a degree of freedom. Holding on to the naive idea of the internet offering equal opportunity to all voices isn’t working either. What we need to do is keep the internet open, safeguard our freedom of speech and protect users. Ehmke has shown that smart design can help.

What you can do

If you’re an activist: don’t let the success of your work depend on these platforms. Don’t allow Facebook, Google and Twitter to become a gatekeeper between you and the members of your community, and don’t consent to your freedom of speech becoming a byline in a 10 000-word terms of service “agreement”. Be as critical of the technology you use to further your cause as you are of the people, lobbyists, institutions, companies and governments you’re fighting. The technology you use matters.

Finally, if you’re a designer: be aware of how your cultural and political bias shapes your work. Build products that foster engagement rather than harvest attention. And if you have ideas about how design can help save the internet, please get in touch.

This is a shortened version of an article originally published at https://www.bof.nl/2017/12/06/can-design-save-us-from-content-moderation/.

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands)



16 May 2018

Upload filters endanger freedom of expression

By Maria Roson

In September 2016, the European Commission proposed a controversial draft for the new Copyright Directive that includes de facto mandatory automated upload filters for every internet user in the EU. This mechanism, designed to prevent alleged copyright infringements, leaves the tackling of potentially illegal content uploaded by users to algorithms. However, an idea that was supposed to be an efficient way to safeguard authors’ rights has in reality turned out to be a “censorship machine” that does not even address the so-called “value gap” between platforms and rightsholders. Instead, it reveals a “values gap” between European policy-makers and the European values they are meant to uphold.


Handing algorithms the task of deciding what can and cannot be expressed online, without any human involvement, poses serious risks to our societies. Particularly sensitive issues deeply linked to our fundamental right to freedom of expression should be decided by a court, not by a machine. There are already many real-life examples of how automated upload filters are failing, censoring a broad range of content from innocent videos to human rights activism.

Kittens purring infringes copyright: YouTube’s Content ID system, which filters its users’ uploads, decided that a cat purring was a copyright infringement. The purring was identified as a musical composition already owned by a company, making the purring a “pirate” product. This perfectly illustrates the arbitrariness of the content spotted by the filter.

Content used for educational purposes: Harvard Professor Lawrence Lessig had one of his lectures removed by the platform’s copyright filter because he used parts of several well-known songs. Even though the music was part of the didactic material used in his lecture, and could therefore legally be used for educational purposes, YouTube proceeded to mute the entire lecture. This is a very representative example of how filters can restrict access to culture and education without taking into account exceptions for the use of protected content.

Human rights activism censored: There are several examples of automated upload filters censoring human rights activists. Filters used to classify content as “offensive”, “extremist” or simply “inadequate for minors” have ended up censoring videos that tried to denounce injustices. For instance, thousands of videos reporting atrocities in the Syrian war were removed, resulting in the loss of extremely valuable material for prosecuting war crimes. Another example of this censorship is the removal of videos by LGBT activists.

The examples mentioned above show that automated upload filters can lead to the illegitimate removal of material from the internet. In addition, they can encourage internet users to self-censor – limiting their uploads “voluntarily”, for fear of being censored. These practices deeply affect human rights such as freedom of expression and access to information, culture and education. If even copyright experts find it complex to understand the freedoms to use cultural works in the EU, algorithms are even less likely to understand the context and purpose of the use of protected material, let alone to be the ones deciding whether content is “offensive” or unsuitable for its audience. European policy-makers must take this reality into account and seriously reconsider the use of filters and their impact on democratic societies.
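Purely as an illustration of the argument above (not any real filter’s code), a naive matcher has no way to distinguish an educational quotation from a pirated copy:

```python
# Toy illustration only: a naive upload filter that flags any upload
# containing a known protected "fingerprint". All names are hypothetical.

PROTECTED = {"song_xyz"}  # stand-in for a rightsholder fingerprint database

def naive_filter(upload_fragments):
    """Return True (i.e. block the upload) if any fragment matches a protected work."""
    return any(fragment in PROTECTED for fragment in upload_fragments)

# A lecture quoting a few seconds of a song carries the same fingerprint
# as a full pirated copy, so the filter treats both identically:
lecture = ["lecturer_voice", "song_xyz", "lecturer_voice"]
pirate_copy = ["song_xyz"]
assert naive_filter(lecture) and naive_filter(pirate_copy)
```

A court weighing an education exception would look at the surrounding context; the function above has no input through which context could even be supplied.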

Censorship Machine: busting the myths (13.12.2017)

When filters fail: These cases show we can’t trust algorithms to clean up the internet (28.09.2017)

YouTube’s Content ID (C)ensorship Problem Illustrated (02.03.2010)

(Contribution by María Rosón, EDRi intern)



16 May 2018

BBA Germany 2018: Spying on employees, refugees, citizens…

By Digitalcourage

The annual German Big Brother Awards gala was held by EDRi member Digitalcourage in Bielefeld on 20 April 2018. Following last year’s 30th anniversary, Digitalcourage received more local recognition, and a cooperation with the municipal theatre began last autumn. This year’s Big Brother Awards became the first to be presented in the city’s main theatre.


The award in the “Workplace” category went to the company Soma Analytics, whose health app “Kelaa” stands out among intrusive health apps because it only works if the user’s employer uses the counterpart software, the so-called “Kelaa Dashboard”. The app records all kinds of data via the phone’s various sensors. It focuses on stress and sleep, encouraging employees and employers to “work on” these issues. The analysed data include phone use, voice characteristics, typing behaviour and even sleep movements, for which users are encouraged to take their phones to bed with them. The employer receives summary reports about their employees’ mental state. While this comes with the usual claims that the personal data is aggregated and anonymised, the reports do allow inferences about smaller groups, for instance to identify the “most stressful” departments.

An award in the “PR & Marketing” category went to the idea of a “Smart City”, with no individual winner. The award speech explained that what is touted as innovative and efficient, participatory and ecological often becomes an entry point for total surveillance. CCTV can be “enhanced” with facial recognition and used, for example, to publicly shame transgressors by showing jaywalkers on monitors, complete with their personal data, and automatically notifying their employers (Shenzhen, China). People can be tracked, with automatic alerts when they move away from their residence or workplace (Xinjiang, China). Street lights can be fitted with WiFi trackers and microphones to detect aggressive behaviour, and equipped with an option to spray scents to influence people’s mood (Eindhoven, Netherlands). The projects are initiated by cities, but the companies involved will often treat the collected data as their private property. Citizens are not asked for consent, but it is their data that is being collected and sold.

The “Technology” award went to Microsoft Germany for implanting telemetry – the collection and transmission of diagnostic data – in Windows 10 and making it practically impossible to deactivate. Even if skilled users go beyond hard-to-find dialogues and make changes deep within the system’s registry, where more than 50 settings related to telemetry are known, the system still sends requests for updates, recommendations and so on, causing at least the user’s IP address to be logged. The award speech concluded by calling Microsoft’s products an “intolerable problem”.
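To give a concrete idea of what such registry changes look like, here is a sketch of one widely documented setting. It is an illustration of the kind of change involved, not a complete or guaranteed way to deactivate telemetry – as the award speech notes, the system keeps sending requests regardless:

```
Windows Registry Editor Version 5.00

; Example: the documented Group Policy value "AllowTelemetry"
; (0 = Security, 1 = Basic, 2 = Enhanced, 3 = Full). On editions other
; than Enterprise/Education, level 0 is silently treated as 1 (Basic),
; and even level 0 does not stop all outgoing requests.

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DataCollection]
"AllowTelemetry"=dword:00000000
```

This single value is only one of the 50-plus telemetry-related settings mentioned above, which is precisely why deactivation by hand is impractical for ordinary users.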

In the “Administration” category, Cevisio Software und Systeme GmbH received the award for its software for managing refugee shelters. Developed in collaboration with the Saxony branch of the Red Cross, the system stores personal data and registers movements and actions such as receiving a meal, receiving pocket money, borrowing books, or doing the laundry. This is merged with data from the Federal Office for Migration and Refugees to include information on family relationships and medical conditions as well as official documents and the state of asylum applications. The award highlighted this as part of a trend of control-freakery that can be observed in other institutions’ interactions with refugees.

Amazon’s “Alexa” assistant was given the “Consumer Protection” award for being a “nosy, impertinent, all-too-clever and gossipy bugging operation in a can”, and the most award-worthy of several similar products. As the award speech pointed out, Alexa is always listening and always online, sending what is being said after its “wake word” to the Amazon cloud to be processed and stored. The assistant has an app with a function to list and play all conversation fragments – even months later. Together with the “Smart City”, Alexa is part of a process of public and private spaces turning against the people living in them. Patents held by Amazon, Google and others point to possible future developments that include identifying individuals and recognising moods from voices. The issue, according to the award speech, is not abuse – it is the potential that the system has even today.

The parliamentary groups of the (conservative) CDU and the Green party in the parliament of Hesse were given the “Politics” award. These parties form the state’s coalition government, which is planning a new law regulating its domestic intelligence agency and a reform of its police law. The plans include allowing the recruitment of convicted criminals as undercover agents, and the exemption of agents from criminal investigation. Services will be able to undermine professional secrecy safeguards by recruiting holders of such secrets or placing agents in their contacts. The use of state trojans is to be allowed even as a “preventative” measure – an outright breach of an election promise by the Greens – and so is electronic tagging. The Green parliamentary group is pursuing these plans against a vote by its own party base, and the coalition as a whole is ignoring the overwhelmingly critical positions voiced by experts in a parliamentary hearing.

As in previous years, the coverage of the awards on the website is fully bilingual and can be found at https://bigbrotherawards.de/en/2018. The gala was live-streamed with English interpretation. A recording of the German stream has been published at https://vimeo.com/digitalcourage; a version with the English audio will follow.

German Big Brother Awards 2018

(Contribution by Sebastian Lisken, EDRi member Digitalcourage, Germany)



16 May 2018

New Dutch law for intelligence services challenged in court

By Bits of Freedom

On 21 March 2018, the Dutch voted in an advisory referendum on the new Intelligence and Security Services Act. A majority of Dutch citizens voted against the law in its current form – a clear signal that the law is in urgent need of reconsideration. EDRi member Bits of Freedom has been fighting against important parts of this law since the first draft in 2015, so the outcome of the referendum comes as positive news.


In response to the outcome of the referendum, the Dutch government announced amendments to the new law that will be submitted to Parliament after the summer. Unfortunately, the announced amendments are primarily cosmetic in nature and do not address the widespread critique that the law enables the untargeted collection and sharing of large amounts of citizens’ personal data.

Meanwhile, the new law fully entered into force on 1 May. This move effectively prevents the Dutch legislature (the Lower and Upper Houses) from reviewing and possibly changing the announced amendments, and from ensuring that the law respects the fundamental rights of citizens before it fully enters into force.

Implementing the law without listening to the critique of the voters who rejected it in the referendum, and without giving the legislature the opportunity to review and improve the amendments announced by the government, is not acceptable. That is why Bits of Freedom, together with a broad coalition of NGOs and companies, will go to court (in interim proceedings) as soon as the new law enters into force. The court challenge will focus on the most controversial provisions of the law, which enable the untargeted collection and sharing of large amounts of personal data. The oral hearing will take place on 7 June.

The coalition consists of Bits of Freedom, Privacy First, the Dutch Committee of Jurists for Human Rights (NJCM), the Dutch Association of Defence Counsel, Free Press Unlimited, Waag Society, Greenpeace International, BIT, Voys, Speakup and Platform Protection of Civil Rights. The coalition is coordinated by the Public Interest Litigation Project (PILP) and legal representation is provided by the law firm Boekx.


On 11 July 2017, the Dutch Senate passed the new Intelligence and Security Services Act. With the Senate vote, a years-long political battle came to an end: the secret services were given dragnet surveillance powers. Citizens subsequently called for a referendum, which was held on 21 March 2018. Since 2015, the law has faced overwhelming opposition from experts, industry, political parties, civil society, and citizens. The law proved particularly controversial on five points: the dragnet-surveillance power, real-time access to databases, third-party hacking, oversight, and the sharing of unevaluated data with foreign services.

Bits of Freedom challenges new law for secret services in court (18.04.2018)

Dutch Senate votes in favour of dragnet surveillance powers (26.07.2017)

Dutch House of Representatives passes dragnet surveillance bill (22.02.2017)

Dutch Minister reveals plans for dragnet surveillance (15.07.2015)

(Contribution by EDRi member Bits of Freedom, the Netherlands)



16 May 2018

Bavarians protest against vastly extended police powers

By Digitalcourage

A large anti-surveillance rally took place in Munich on 10 May 2018. 30 000 protesters showed their dismay at the Bavarian plans to reform the law on the tasks of the state’s police. Even the organisers were surprised by the scale of the demonstration – they had expected fewer than 10 000 people.


This is the second change to the Police Tasks Act (PAG) in two years. In 2017, the legal threshold for strict police measures was lowered, requiring no more than an “imminent threat” for “preventive” measures such as electronic tags or an extended police custody that could in theory last indefinitely (as it can be extended by judicial decree, but without an actual court case, for three months at a time). The 2018 reform adds measures such as “preventive” DNA analysis extending to external features such as hair and skin colour or geographic origin, communications surveillance without evidence of a crime, the use of undercover agents, state trojans for communications surveillance and remote evidence gathering on digital devices, and “intelligent” video surveillance. In effect, the Bavarian police will increasingly act like a secret service, in spite of a constitutional requirement in Germany for a strict separation between such tasks – introduced as an important lesson from its Nazi past, and in particular from the excesses of the Gestapo (“Secret State Police”) of that time. According to the alliance against the Police Tasks Act, never since 1945 have police powers been so vastly extended.

Unfortunately, the Bavarian parliament does not seem to be swayed by the protests – the reform of the Police Tasks Act was set to be voted through late on 15 May. Complaints to the Constitutional Court are being prepared. While Bavaria generally has certain political idiosyncrasies, this change is not an exception in the broader German political system. In April, EDRi member Digitalcourage gave a Big Brother Award to the CDU and Green parties in Hesse for expanding the powers of both the state’s domestic intelligence agency and the police. In general, Digitalcourage is seeing a trend of drastic surveillance powers being introduced through police laws at the Federal State level, making it harder for human rights organisations to campaign against such measures on a wider scale.

Nein zum neuen Bayerischen Polizeiaufgabengesetz — Kein Angriff auf unsere Freiheits- und Bürger*innenrechte! (only in German)

Bayerisches Polizeigesetz: Billige Tricks der CSU entlarvt (only in German, 23.04.2018)

Das härteste Polizeigesetz seit 1945 soll heute in Bayern beschlossen werden (only in German, 15.05.2018)

München: 40.000 protestieren gegen neues Polizeigesetz (11.05.2018)


München: Demo #noPAG – NEIN! Zum Polizeiaufgabengesetz (only in German, 14.05.2018)

The Big Brother Award 2018 in the “Politics” Category goes to the parliamentary groups of the Christian Democrats (CDU) and the Greens in the parliament of the Federal State of Hesse

(Contribution by Sebastian Lisken, EDRi member Digitalcourage, Germany)



16 May 2018

A guide to EDRi at RightsCon 2018

This year, three members of our Brussels office are attending RightsCon in Toronto: Executive Director Joe McNamee, Senior Policy Adviser Maryant Fernández Pérez and Policy Intern Gemma Shields. The conference days are full of panels, meetings, informal get-togethers and fun activities. Here is our guide to the sessions moderated or attended by EDRi staff.

Wednesday 16 May

Do we need free speech legislation like we have privacy laws?
16:00 – 17:00 EDT, 203B with Maryant Fernández Pérez, Nate Cardozo, Paulina Gutiérrez, Moses Karanja, David Kaye, Stephen Turner and Lisa Vermeer.

Around the world, there is some degree of national legislation that circumscribes privacy rights in an environment where the risks are obvious. In the area of freedom of expression, the activities of intermediaries can also restrict our rights as citizens or consumers, yet national legislation that positively protects our free speech against arbitrary restrictions by online companies is broadly absent. Intermediary liability discussions are not new: we have been having similar agreements and disagreements, without moving the discussion forward, for decades. This session aims to discuss whether free speech legislation, as a backstop to unending state demands for private regulation of our online activities, would be a necessary and proportionate tool. If positive legislation is not the solution, what durable framework and other measures can protect freedom of expression online in the 21st century?

Thursday 17 May 

Human Rights Requirements of Cross-Border Data Demands
12:00 – 13:15 EDT, 201A with Maryant Fernández Pérez, Greg Nojiem, Lucy Pardon, María Paz Canales, David Lieber, Rauno Merissari, Bernard Shen

Law enforcement investigations increasingly rely on data stored outside the jurisdiction of the country that is investigating a crime. The current system of law enforcement requests made under mutual legal assistance treaties (MLATs) has not been able to keep up with the growing quantity of these requests. Requests are often not processed in a timely fashion, which can result in criminals evading prosecution and in law enforcement being unable to interdict some crimes. The need for this data has spurred some governments to demand that their users’ data be stored locally, and others to issue surveillance demands that purport to have extraterritorial effect. In response, at least three processes are underway to develop mechanisms that address the problem: (i) the E-Evidence initiative of the European Union; (ii) bilateral agreements – such as those envisioned in the U.S. CLOUD Act – to permit direct demands on communications service providers by particular countries in which they have no physical presence; and (iii) a negotiated protocol to the Budapest Cybercrime Convention that would enable signatories to the protocol to make direct demands on providers in other signatory nations.

This round-table is designed to explore the human rights criteria that should be built into these new mechanisms. For example, must a signatory to the planned protocol to the Budapest convention have a legal system that requires judicial authorisation of data demands? Must a signatory to a bilateral agreement agree to give notice (or delayed notice) to the subject of its data demand? Is it realistic to expect that countries will change the processes by which they obtain access to Internet users’ data so the country can participate in one of these new mechanisms and get a speedy response?

10 days before the EU GDPR becomes applicable: are you ready?
12:00 – 13:15 EDT, 204C with Joe McNamee, Estelle Masse, Ann Cavoukian, Isabelle Falque-Pierrotin and Jeremy Rollison

On May 25, 2018, the EU General Data Protection Regulation will become applicable. This law sets new rules and obligations for companies collecting, using, selling, sharing and storing personal information from people living in the European Union. Almost seven years after the launch of the negotiations on this law, several questions remain: are companies ready to comply with it? Are users aware of their rights and how to exercise them? How did EU Member States implement the law? Are data protection authorities ready and equipped to enforce the rules? This session will explore these issues in a dynamic format in which the audience will be presented with the reform and hear from a company, a DPA and civil society about the opportunities and challenges brought by this law.

Can trade agreements such as CPTPP, NAFTA and RCEP be used as a tool for advancing digital rights?
16:00 – 17:00 EDT, 204B with Maryant Fernández Pérez, Gisela Pérez de Acha, Sean Flynn, Deborah James, Cynthia Khoo, Jeremy Malcolm, Milton Mueller, Gus Rossi and Parminder Jeet Singh

While the worst of the IP chapter may be suspended in the Trans-Pacific Partnership (TPP), ongoing negotiations in trade agreements continue to put our digital rights at stake: the North American Free Trade Agreement (NAFTA), the Regional Comprehensive Economic Partnership (RCEP), and what remains of the TPP itself, now known as the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP). Across the board, digital rights advocates are concerned about the serious procedural shortcomings in trade agreements, such as lack of transparency and lack of meaningful consultation with civil society. Additionally, such agreements impact Internet policy in substantive ways, but do not accord with the established multistakeholder nature of Internet governance. However, differences of opinion have emerged as well. Some digital rights advocates see opportunity as well as threat, believing that trade agreements can be effective tools for advancing digital rights. For example, agreements might be used to persuade member countries to improve privacy protections and expand free expression rights or reduce online censorship. On the other hand, other experts among civil society have cautioned against this, noting that the particular model of trade agreements observed to date has increased inequality and given greater rights to big corporations at the expense of workers, consumers, and the environment. Thus, in the digital context, these agreements may entrench greater rights for monopolistic Internet giants at the expense of users’ rights, such as privacy and personal data rights, fair use rights, and the ability to innovate without permission.

This session will be a round-table debate among experts and advocates in digital rights and trade, representing different points of view concerning the general issue outlined above. Panellists will engage with each other and the audience to explore this central question: Can regional and international trade agreements be used to advance digital rights, and if so, how? The panel will focus, in particular, on the electronic commerce / digital trade and intellectual property chapters of key trade agreements currently undergoing negotiations, such as NAFTA, RCEP, and CPTPP. Specific provisions to be discussed include data localisation and free flow of data, and possible harmonisation of users’ rights through fair use in copyright. Interactivity is built into the session, as audience questions will be taken over the course of the debate and integrated into the panellists’ discussion throughout.

Avoiding a Race to the Bottom: Bringing Human Rights to Cross-Border Data Demands
17:15 – 18:15 EDT, 206D with Maryant Fernández Pérez, Fanny Hidvegi, Karen Audcent, Eduardo Ferreyra, Tamir Israel, Katitza Rodriguez and Jeremy Rollison

Law enforcement authorities across the world are continuously seeking to gain access to data in different jurisdictions, wherever the data is held. Currently, the primary international mechanism for facilitating governmental cross-border data access is the Mutual Legal Assistance Treaty (MLAT) process, a series of treaties between two or more States that create a formal basis for cooperation between designated authorities of the signatories. However, MLATs have been criticised for being slow and inefficient. The MLAT regime includes steps to protect privacy and due process, but agencies have increasingly sought to bypass it, either by cross-border hacking or by leaning on large service providers in foreign jurisdictions to hand over data voluntarily. In particular, given the prominence of big U.S. Internet companies, access to the data they collect, process and store has become an increasing demand worldwide. To fix the problems with MLATs, several government and regional proposals have emerged. This session will explore how MLAT reform can address many current problems, and will map different approaches to this issue in different jurisdictions, seeking to bring human rights to cross-border data demands in line with the Necessary and Proportionate principles.

These range from the proposed CLOUD Act, which would empower the U.S. executive branch to enter into bilateral surveillance agreements with foreign nations, to the negotiations of a similar agreement between the United States and the United Kingdom, to proposals in the European Union and the Council of Europe that may facilitate direct cooperation on access to data between internet companies and governments. We’ll discuss these proposals to try to identify common policy and legal challenges, divergences and potential for cooperating on a joint future strategy: how can we ensure that these proposals avoid a race to the bottom in the protection and defence of our rights and freedoms?

Friday 18 May 

Can global digital rights survive Europe’s Copyright Directive?
12:00 – 13:15 EDT, 206D with Joe McNamee, Renata Avila, Burcu Kilic, Dinah PoKempner and Abby Vollmer

The European Union has a strong reputation for protection of privacy, free speech and the rule of law. Its rules on the liability of internet companies are among the best in the world and have been widely copied. Now, the proposed EU Copyright Directive seeks to create legal chaos by re-defining the EU’s liability rules, re-defining hosting services as publishers and demanding measures such as mandatory surveillance of all uploads and filtering of any content identified by rightsholders. Participants will learn about the current stage of the decision-making process in the EU, with reactions from speakers with diverse geographical and legal perspectives, as well as contributing their own perspectives.

Workers’ Data Rights – Making sure the human remains in human resources
14:30 – 15:45, 202B with Joe McNamee, John C. Havens, Christina Colclough, Abhishek Gupta, Burcu Kilic and Parminder Jeet Singh

Join this interactive panel to discuss how to fill a regulatory gap with regard to workers’ data rights. Across the world, corporations are increasingly gathering, storing and using internal as well as external data in their human resource management, for example in recruitment, promotion, disciplinary or layoff processes. Some experts even ask whether data is taking the human out of human resources. Whilst consumers in many countries are covered by data privacy and protection laws, a huge regulatory gap exists, namely on workers’ data rights and protection. This session will address this gap by discussing why this is an issue, what companies can and should do, and what solutions are available. For unions, filling this gap will be the next frontier for collective bargaining, industrial policies and global framework agreements in the digital economy.

Lightning Talks: Inventions, Innovations and Free Trade
WTO E-commerce agenda: What’s the deal? What’s on the plate? Are we ready?
16:00 – 16:15, 205A with Burcu Kilic and Maryant Fernández Pérez (European Digital Rights (EDRi) and Public Citizen)

Since United States (US) President Trump took office, the US-centric free trade agreement model has been falling apart and global trade discussions have started to centre on the World Trade Organization (WTO). Big trade players are now targeting the WTO as a reliable vehicle to foster a global digital trade agenda that will affect our human rights online. E-commerce and, more broadly, digital trade have become a hot topic on the current agenda of the WTO. This lightning talk has three aims. First, we would like to show how important trade discussions are for the community. Second, we highlight the implications of some of the proposals for digital privacy, data protection, algorithms, encryption, competition and even cybersecurity. Third, we’ll show how to get involved in these discussions before it is too late. We have only two years to learn, understand, raise our voice and advocate for our rights. We need to be ready.


16 May 2018

Looking back at EDRi’s victories in 2017


In 2017, European Digital Rights (EDRi) continued to fight against attempts to undermine human and civil rights online. Our priorities among many ongoing and upcoming challenges were privacy, copyright, surveillance and privatised law enforcement.

Our key successes of 2017 were the vote in a European Parliament Committee rejecting upload filters in the Copyright Directive proposal, the EU Parliament’s plenary vote supporting ePrivacy, and the Council of Europe welcoming EDRi’s recommendations on the issue of cross-border access to data by law enforcement.

You can read more about our fights and victories in 2017 in our annual report! Also check out our timeline with an overview of the most important events.

Privatised law enforcement
Data protection and privacy
Copyright Reform
Network neutrality

In 2017, we observed yet another drive from Europe’s governments to have online platforms impose arbitrary restrictions on our freedom of expression, which undermines democratic values and principles enshrined in human rights law.

Last year, we gave important input during the shaping of the Council of Europe’s draft Recommendation on the role of internet service providers and platforms. This draft recommendation represents a step in the right direction on the issue. Moreover, we engaged in a dialogue with the EU institutions in the fields of “illegal content”, terrorism, child protection, and media regulation.

After the successful adoption of the General Data Protection Regulation (GDPR), the fight to keep private information online safe continues. In 2017, the European Commission published a proposal for an ePrivacy Regulation – a crucial piece of legislation providing the rules on the tracking of individuals online and confidentiality of communication in general. We saw one of our biggest victories of the year when our contributions to improve the proposal were reflected in the final text adopted by Members of the European Parliament.

In the area of surveillance, our top priority in 2017 was the question of how law enforcement agencies can access data across borders. We led a global coalition of civil rights groups to ensure harmonisation of the Council of Europe’s new rules on cross-border access to e-evidence with the highest human rights standards. The Council of Europe welcomed our suggestions on the new protocol. In addition, we engaged with the European Commission in the preparation of a new proposal giving law enforcement easier cross-border access to data, including data held by tech and telecom companies.

In 2016, the European Commission published a proposal for a Copyright Reform. EDRi fought against the most dangerous parts of the proposal, including Article 13, which introduces an upload filter, aka “censorship machine”. Echoing our campaign, the EU Parliament Committee on Internal Market and Consumer Protection (IMCO) sent a strong message in its Opinion on the Copyright Directive against the proposed Article 13 and the suggestion to expand the “link tax” (Article 11). The biggest achievement came in November, when the LIBE Committee voted against the mandatory implementation of the censorship machine.

Last year, the Net Neutrality Working Group of the Body of European Regulators for Electronic Communications (BEREC) delivered a disappointing document. As a result, EDRi member epicenter.works submitted a consultation response, in cooperation with EDRi members IT-Pol and Access Now, as well as with observer Xnet. Furthermore, we continued our involvement with the NetCompetition alliance by participating in its Steering Committee, attending the alliance’s apero and the meeting with Member of the European Parliament (MEP) Jose Blanco. Lastly, we signed an open letter in support of Net Neutrality in the US, together with 200 organisations and businesses.

EDRi’s participation in the discussion around the inclusion of “data flows” in European trade agreements resulted in many of our suggestions being included in the EU Parliament’s report “Towards a digital trade strategy”. We sent a joint letter advocating for the rejection of the EU-Canada Comprehensive Economic and Trade Agreement (CETA) and submitted two public consultation responses on the Trade in Services Agreement (TiSA). We became a founding member of the United Nations (UN) Dynamic Coalition on Digital Trade, co-organised the Civil Society Trade Lab, and presented during the World Trade Organization Public Forum and the EU Parliament’s EPP Hearing on Digital Trade.

The year 2017 was marked by our continued call for a much needed reform of the trilogues system. We pointed out that increased transparency of negotiations between the EU Parliament, the EU Council and the EU Commission would only increase the bodies’ legitimacy and integrity. As part of our response to the Commission’s consultation, we asked for a legal instrument, a Directive aimed at protecting whistleblowers. In 2018, the proposal for this Directive arrived. Finally, we addressed Council transparency in a response to the European Ombudsman.

In 2017, we spoke in 15 countries and gave several expert presentations in European and other key institutions, as well as in many universities. EDRi contributed to international news publications, was represented at a number of high-level events, and co-organised events such as the Privacy Camp and the School of Rock(ing) Copyright. We continued our membership in the Trans Atlantic Consumer Dialogue (TACD), became members of the Non-Commercial Users Constituency (NCUC) and renewed our mandate in the Civil Society Information Society Advisory Council’s (CSISAC’s) Steering Committee.

These were our finances in 2017:

To get an overview of the most important actions we took last year, please click on the timeline below:


16 May 2018

A glimpse of our 15th Anniversary


In 2018, European Digital Rights (EDRi) turned 15. We gathered to celebrate a decade and a half of our network’s existence and its digital rights victories. The anniversary party took place on 12 April 2018 in Brussels.

In between keynote presentations, a fireside chat with our founders, and a final word from our Executive Director, we managed to capture snippets of our bash. They are now ready for your eyes (and ears).

Birthday Photo Album

Birthday Videos