13 Jun 2018

Censorship – don’t look left or right. Look ahead, look behind!


There is discussion about arbitrary censorship of our freedom of expression in every possible policy area these days. While the issue is intensely political, it is crucial to understand that arbitrary censorship is not a matter of left-wing or right-wing politics, but a threat to democracy as a whole.


Human rights law in Europe and internationally foresees that there are conditions where restrictions on freedom of expression can justifiably be implemented. However, such restrictions need to be genuinely necessary and provided for by an accessible law that can be challenged in an independent court.

We can see how this principle works in practice in the US and Europe. The US Supreme Court ruled that speech that tends “to incite an immediate breach of the peace” is not protected by free speech rules. Similarly, in the well-known Handyside v. the United Kingdom case, the European Court of Human Rights ruled that the applicant’s free speech rights had not been breached by a fine imposed for the publication of an “obscene” book, but it did set a high bar for such restrictions to be imposed.

The need for protection of freedom of expression represents a long-standing democratic consensus about safeguarding our fundamental freedoms from abuses, regardless of the political motivation of the abuse. Without freedom of expression, there is less political accountability. Less accountability means more abuse and more corruption.

Arbitrary censorship: An issue of human rights, not left and right.

We know from history that oppressive regimes and ideologies, whether they claim to be from the left or from the right, have always sought to undermine freedom of expression. This is done overtly, through censorship laws, and/or insidiously, through intimidation of the media.

Arbitrary censorship has resulted in pro-choice channels being repeatedly blocked on YouTube.
Arbitrary censorship has resulted in LGBT channels being “hidden” by YouTube.
Arbitrariness threatens the weakest.
Arbitrariness threatens democracy and accountability.

Looking back – history teaches us all we need to know.
Looking forwards – we must never repeat the mistakes of the past.

Fighting arbitrary censorship is about decency, equality and truth, not politics.

(Contribution by Joe McNamee, EDRi Executive Director)



13 Jun 2018

Answering guide for European Commission’s “illegal” content “consultation”


The European Commission has published a short “consultation” on countering “illegal” content online, with a deadline of 25 June to respond. In order to ensure at least a little balance in the outcome of the consultation, EDRi has prepared an answering guide to help you respond as an individual. We suggest opening the consultation in one browser tab and our answering guide in another, as the most user-friendly way of availing of our guide.*

Responding should take about 15 to 20 minutes and could have a long-lasting impact on European policies on anti-racism, hate speech, child protection, counter-terrorism, freedom of expression and privacy, among other important topics.

The consultation follows increasingly frequent demands from the European Commission for arbitrary, unaccountable policing of the internet by service providers, including a Communication in September 2017 and a Recommendation in March 2018. Now, France and Germany are demanding legislation to impose still further restrictions – in the total absence of any evidence that this is necessary, proportionate… or even that it wouldn’t be counterproductive.

Techdirt provided some good background on the consultation in an article entitled “EU Commission asks public to weigh in on survey about just how much they want the Internet to be censored”.

Click below to access the guide. Each response counts – please play your part.


Read more:

ENAR and EDRi join forces for diligent and restorative solutions to illegal content online (21.06.2018)

Commission’s position on tackling illegal content online is contradictory and dangerous for free speech (28.09.2017)

EU Commission’s Recommendation: Let’s put internet giants in charge of censoring Europe (28.09.2017)

*We would prefer to use frames to make it easier to see both the consultation and the guide at the same time. This is currently not possible due to the way the consultation is coded. We have asked the Commission to change this.

11 Jun 2018

EU Censorship Machine: Legislation as propaganda?


The European Parliament’s Legal Affairs Committee will vote on 20 June on a proposal which will require internet companies to monitor and filter all uploads to web hosting services.

The provisions are so controversial that supporters in the European Parliament have resorted to including purely political – and legally meaningless – “safeguards” in the text as a way of getting the proposal adopted.

For example:

“the measures referred to in paragraph 1 should not require the identification of individual users and the processing of their personal data.”

The proposal requires internet companies to provide an “effective and expeditious complaints and redress mechanism”. It is logically impossible to have a filtering system that neither identifies the users nor processes their personal data but still, when content is removed, allows them to complain. What do they complain about when there is no record of the content uploaded by that specific person being deleted?

“ensure the non-availability”

This is simply a more complicated and harder-to-understand way of saying “upload filtering”.

“1.b Members States shall ensure that the implementation of such measures shall be proportionate and strike a balance between the fundamental rights of users and rightholders”.

The Charter of Fundamental Rights applies to governments and the European Commission. The “agreements” to block and filter content would be commercial decisions and therefore outside the reach of fundamental rights legislation.

The Parliament and Member States already agreed (in the recently concluded Audiovisual Media Services Directive) to reject proposals for specific laws to protect fundamental rights in this field.

“and shall in accordance with Article 15 of Directive 2000/31/EC, where applicable not impose a general obligation on online content sharing service providers to monitor the information which they transmit or store”

Article 15 of Directive 2000/31/EC prohibits Member States from imposing a general obligation on internet companies to monitor information that they store. This text suggests upload filters indirectly, in order to circumvent the Charter and EU courts. The reasoning behind this is that an obligation to enter into “voluntary” commercial agreements between two private parties “to prevent the availability” of online content will respect EU legislation, while the practices derived from its implementation can only lead to de facto general monitoring of uploads.

“The definition on online content sharing service providers under this directive does not cover services acting in a non-commercial purpose capacity such as online encyclopaedia, and providers of online services where the content is uploaded with the authorisation of all concerned rightholders, such as educational or scientific repositories.”

The fact that they had to include this text proves how wide the effects of Article 13 can be. The problem with this carve-out is in the details: what does “acting in a non-commercial purpose capacity” mean for foundations that accept donations? Similarly, how could future uses of such services be monetised without ceasing to be “non-commercial”? Furthermore, these carve-outs (allegedly targeting individual organisations like Wikipedia and GitHub) are written so vaguely that they may not leave sufficient room for them – depending on how each court in each Member State interprets this – or for future similar services.

The vote is on 20 June. If you want to have your say and tell Parliamentarians what you think about this, go to www.saveyourinternet.eu to find out how.

07 Jun 2018

EDRi’s leadership transition: Back to the future…

By Kirsten Fiedler

One and a half months ago, Joe and I shared plans for the upcoming leadership transition in EDRi and launched the search for a new Executive Director. We are proud how far EDRi has come and are now looking at the many challenges and opportunities that lie ahead.

One thing is certain: Challenges to digital rights will not decrease – on the contrary, the speed with which technology is integrating into every part of our lives is making it increasingly difficult to ensure that rights and freedoms are respected from the outset. And in many ways, future battles are already here…

Don’t let the wrong one in

Many experts share the view that the upcoming European elections might bring yet another drastic change to the European political environment. It is therefore not unlikely that this political shift will lead to an increased proportion of parliamentarians from the extremes of the political spectrum that hold views that are contrary to Europe’s core democratic values.

This is why the EDRi network decided in 2018 to strengthen campaigning and cooperation between national digital rights groups – especially in contexts where the existing space for civil society is shrinking.

While policy and advocacy remain without any doubt EDRi’s key strength and focus area, campaigning is the most important new direction for us to develop. Past victories have highlighted how essential it is to mobilise broadly and to press for necessary changes in order to shape European policies.

Lost in translation

However, many digital rights groups have experienced the difficulties of communicating complex digital rights issues. Unlike traditional human rights issues, digital rights often lack powerful imagery, so we will need to be creative to alert and mobilise. Matters such as algorithmic decision-making need special attention, as such systems are increasingly used by states and companies, with the potential for serious impact on our daily lives. This issue will only gain in importance as more and more devices are connected to the internet.

On top of this, we’ll continue to be confronted with the dominance of a small number of major private sector actors, their extraordinary financial power and ability to shape the policy environment. The threat from lobbying was highlighted in a recent CEO report about the ePrivacy lobby bandwagon.

United in diversity

EDRi’s strength and uniqueness lie in the fact that we are a membership organisation: our network includes many renowned tech and legal experts, we can speak for a number of organisations across Europe, and we can mobilise at the national level to act jointly.

For the EDRi network, a lot of work lies ahead as regards the implementation of Europe’s new data protection legislation (aka the GDPR), which we fought for over more than seven years (from the initial Commission Communication in November 2010 until implementation in mid-2018). Our member organisations have their hands full explaining the GDPR, ensuring that (old and) new rights are well enforced and have concrete, practical meaning for individuals, and countering wild scare stories spread in the media about what the GDPR actually means.

On the one hand, EU policies have a direct impact at the national level, and with the Parliament becoming a more and more difficult point of engagement with the EU institutions, there is a great need to strengthen the network’s impact at the EU Council level. On the other, national political developments also have Europe-wide impact. There is therefore a growing need to strengthen advocacy groups across Europe, both in their work at the national level and in their cooperation, in order to amplify their voice and impact.

The night of the living dead

In the coming years, EDRi will continue to put all its efforts into ensuring that human rights are respected in all upcoming relevant policy areas from the outset.

Some issues keep reappearing, no matter how many times they appear to have been concluded. For instance, Member States are currently discussing ways to impose mandatory data retention, despite two rulings explaining in great detail why this is illegal.

We’re also right in the middle of the fight against broad filtering and monitoring mechanisms for user content being uploaded to online platforms, and against a chaotic new “ancillary copyright” measure that will make it harder to link to and quote from news sources. This also follows two rulings from the EU’s highest court against such policies.

Finally, technology will continue to aggravate another problem from the pre-digital era: human rights violations committed in the name of national security and counter-terrorism, as Member States often see national security as being in competition with and outweighing the right to privacy or free expression. An important bargaining chip in this false “privacy versus security” dichotomy, which will be decided in the coming years, is access to “e-evidence”. The idea behind the Commission’s “e-evidence” initiative is that national judicial or administrative bodies can ask a service provider based in another EU Member State to produce data for the investigation or prosecution of a crime. This means that Facebook, Google, Microsoft, providers of messaging services, and other companies that collect and store the data of millions of EU citizens would be obliged to provide this data to foreign authorities. It is essential for EDRi to continue to fight the proposal to turn service providers into judicial authorities.

The time is now!

As one of EDRi’s founders said during our 15th anniversary celebration:

We have arrived in the midst of society. This is the hour!

The fight for digital rights is nowhere near over and there is a growing mountain of issues to be tackled – but the good news is that victories are not impossible, especially with a network that continues to pull together and make the most of its strengths.

If you are passionate about these issues and want to join the fight for your digital rights in Europe, take a moment to consider whether you know a candidate who might be a good fit for this position, help us spread the word about our Executive Director search – or consider applying!

07 Jun 2018

LEAK: France & Germany demand more censorship from internet companies

By Joe McNamee

On 12 April 2018, the Interior Ministers of Germany (Horst Seehofer) and France (Gérard Collomb) wrote to European Commission Vice-President Andrus Ansip and four other European Commissioners, to put pressure on the EU to enact legislation that would legally require online platforms like Facebook – but also small companies – to engage in more and quicker privatised and unaccountable censorship. In cases of failure to comply with obligations to remove reported content within one hour, France and Germany ask for the legislation to impose sanctions à la NetzDG. EDRi has obtained a copy of the letter (French/German, PDF) and the response sent by Mr Ansip.

The Ministers demand that internet companies develop the “necessary” tools to “identify and remove” terrorist content and other illicit content – automatic censorship by algorithm. They also want them to exchange “hash values” to enable each other to filter uploads of that content.
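To make concrete what such a hash-sharing scheme involves – and why exact matching is both blunt and trivially evaded – here is a minimal, purely hypothetical sketch (our own illustration; the letter names no specific system). A cryptographic digest only matches byte-identical files, so changing a single byte defeats the filter; deployed systems therefore tend to rely on fuzzy “perceptual” hashes, which bring their own accuracy problems.

```python
import hashlib

# Hypothetical sketch of the kind of hash-based filtering the Ministers describe
# (our own illustration, not any real platform's system): providers share a list
# of digests of already-identified content and block uploads whose digest matches.

shared_hash_db: set[str] = set()

def register_known_content(data: bytes) -> str:
    """Compute the SHA-256 digest of a known item and add it to the shared list."""
    digest = hashlib.sha256(data).hexdigest()
    shared_hash_db.add(digest)
    return digest

def upload_is_blocked(data: bytes) -> bool:
    """Exact-match filtering: an upload is flagged only if its digest is listed."""
    return hashlib.sha256(data).hexdigest() in shared_hash_db

register_known_content(b"previously identified content")
print(upload_is_blocked(b"previously identified content"))   # True: exact copy
print(upload_is_blocked(b"previously identified content."))  # False: one byte added
```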

Why? No evidence is referenced, or even mentioned. The role of states in fighting illegal content is not mentioned.

According to the Ministers, companies should process uploads and make a decision about censoring them very quickly. They propose that internet providers put in place systems (no such systems exist in the real world) in order for them to filter, process, identify, remove and prevent “illicit” content from being further uploaded within one hour.

Why? Because sixty minutes is a magical number. Again, no evidence to this end is referenced, or even mentioned. Again, the role of states in fighting illegal terrorist content is not mentioned.

As for transparency, they ask for statistics on the speed of removal, including the average speed of response, and “information” about the hash database. However, they do not ask for statistics on accuracy, nor about the volume of legal content being removed, nor about investigations of such content by states, nor about automated processes disrupting police investigations, nor for any statistics to identify and rectify counterproductive impacts.

The Ministers explain that they have urged big online platforms to give logistical support to smaller platforms, because “smaller” providers are not able to remove content “fast”.

Why? Again, no evidence is referenced or even mentioned. What does “fast” mean? What is “logistical support”? Do they want all companies to behave like Facebook or Google?

They choose to forget that the European framework, the E-Commerce Directive, permits Member States to hold providers liable for failing to act expeditiously to remove illegal content. It appears that holding providers accountable for breaching existing legislation is more difficult than turning Google or Facebook into an unaccountable internet police force.

What is the metric for assessing current procedures as being “too slow”? Where is the data that suggests that the response is inadequate? Where is the evidence that shows that Member States cannot use the liability provisions already in EU law? It appears that neither Interior Minister has any lawyers working for him. The letter goes on to explain that the non-binding Recommendation issued by the European Commission in March is an “attempt” to permit a more rigorous implementation of the already existing and already binding E-Commerce Directive that is already available to them. What was possible before that Recommendation was adopted is still possible now, and vice versa.

What is surprising is that both Germany and France seem to be pushing for rule of law violations in the name of upholding the law. The Ministers seem to know that providers will never develop technologies that can process all uploads and make reliable, objective or accountable judgments on the illegality of certain kinds of speech. The Ministers know that, in reality, terms of service will be used to remove anything that creates a risk of liability for the providers (a risk which would be very high if decisions were needed within 60 minutes and if all breaches of that time limit were sanctioned, as the Ministers demand). Therefore, the Ministers demand that the law push these measures into the providers’ terms of service, moving them outside the traditional scope of human rights law (which is binding on states, not on private companies).

The Ministers’ letter ends with a bewildering comment that, as there is a consultation on illegal content (which lasts until 25 June 2018), the Commission could prepare draft legislation in June. This would leave five working days after the closing of the consultation for the Commission to carry out an impact assessment and analyse the results of the consultation, for the relevant Commission services to draft the legislation, and for the other relevant parts of the Commission to review the legislation, give their input and suggestions, and have that input taken into account by the Commission services in charge. Unless, of course, the European Commission is already working on giving the Ministers what they asked for.

In sum, the five-page letter contains not a single reference to the role and responsibilities of the state, to transparency about investigations and prosecutions, to the dangers of counter-productive effects, to review mechanisms, to freedom of expression, personal data protection and privacy, nor to the dangers for democracy of placing such unaccountable censorship powers in the hands of private companies, some of which are already accused of unfairly influencing elections.

What did the Commission have to say about all this?

On 31 May 2018, Commission Vice-President Ansip responded. Ansip pointed out and praised the fact that there are already three initiatives in this regard: a hash-sharing consortium, the Europol Internet Referral Unit and the Radicalisation Awareness Network. Still, he seemed to agree that “more” needs to be done, without any specific evidence to back up this position.

He forgot to mention that there is virtually no meaningful transparency, no review process, no statistics and no accountability around these initiatives. He also seems to have forgotten to mention any evidence backing the claim that, despite all of the initiatives listed, further (unspecified) measures might be necessary. On a more positive note, he explained that the Commission is working on a “comprehensive” impact assessment, which will contain all relevant “data and facts” and which will list the various policy options in detail.

This is definitely a reassuringly more sober and more evidence-based approach than the rather frantic, populist approach of the two Ministers. However, as there is no data being collected on, for example, how many of the referrals from the Europol Internet Referral Unit even refer to illegal content or how many (if any) of the examples of allegedly terrorist content are ever investigated or prosecuted, this “comprehensive” analysis will be less comprehensive than it sounds.

Finally, we would like to recall that all Commissioners took an oath of office when starting their jobs, an oath which also mentions the obligations on states in their interactions with the Commission. Some elements appear to have been forgotten by certain actors in this process:

“I solemnly undertake:
– to respect the Treaties and the Charter of Fundamental Rights of the European Union in the fulfilment of all my duties;
– to be completely independent in carrying out my responsibilities, in the general interest of the Union;
– in the performance of my tasks, neither to seek nor to take instructions from any Government or from any other institution, body, office or entity;
– to refrain from any action incompatible with my duties or the performance of my tasks.

I formally note the undertaking of each Member State to respect this principle and not to seek to influence Members of the Commission in the performance of their tasks.

I further undertake to respect, both during and after my term of office, the obligation arising therefrom, and in particular the duty to behave with integrity and discretion as regards the acceptance, after I have ceased to hold office, of certain appointments or benefits.”


31 May 2018

Xnet: Opposing guarded access to institutional information

By Xnet

In their fight for free access to information and data protection, Spanish EDRi member Xnet contacted the Spanish Data Protection Agency (AEPD). As the AEPD is the institution responsible for the implementation of the General Data Protection Regulation (GDPR) in Spain, Xnet brought up questions about the compliance of the agency’s work with the new regulation.

You can read Xnet’s letter below:

“To whom it may concern,

We have two questions:

1. In order to offer information that should be publicly accessible, you ask for all the personal details of those requesting said information. Does this not clash with article 5 of the GDPR, which states that only the data necessary for performance of the task should be obtained? It is our understanding that for the task of offering information on topics in the public domain, you do not need any data.

As a specific example, in order to ask you this very question, we had to write to you using our electronic certificate, which means you have access to our personal data. Since, as we understand it, your task is to answer these questions for whomever asks, it should have been possible to ask them without you needing to know who we are. Is this not the case? However, you offer no information by email or by telephone, only in response to communications using electronic certificates.

If our understanding is not correct, we would kindly ask you to send us the legal articles that corroborate your interpretation.

2. We do not understand why the Spanish Data Protection Agency, which as previously mentioned is highly demanding with individuals, does not use HTTPS (Hypertext Transfer Protocol Secure) by default in its digital spaces. This leaves the data of those who access your websites vulnerable. We would like to know the reason for this.

Thank you for your attention.”



(Contribution by Xnet, EDRi member)

Read more:

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)

30 May 2018

Belgian Constitutional Court decision on the concept of incitement to terrorism

By Maria Roson

On 15 March, the Constitutional Court of Belgium issued judgement 31/2018 on the action for the annulment of the law of 3 August 2016 containing various provisions on the fight against terrorism (III), introduced by the NGO Ligue des Droits de l’Homme (Human Rights League) with the Council of Ministers as the defendant. Since the applicant raised objections exclusively against articles 2 and 6 of the law of 3 August 2016, the Court considered the appeal admissible only in so far as it was directed against these articles, and not against the entire law.


Concerning article 2 of the Law of 3 August 2016, the applicant’s complaint was based on the modification of the previous text of article 140bis of the Criminal Code. The modifications of this article had deleted the requirement that an action pose a real risk to society in order to be considered an incitement to terrorism. With the removal of this requirement, the article left a wide margin of interpretation, making it impossible to assess the true impact of the action, since it no longer needed to pose a real risk. This modification also added the possibility of an action inciting “directly or indirectly” the commission of a terrorist act, an expression the applicant considered too general, feeling that it generated great uncertainty about what might or might not be considered an incitement to terrorism. These two modifications could also lead to the criminalisation of less serious offences, without the minimum sentence being reduced.

For these reasons, Ligue des Droits de l’Homme claimed that these modifications violated the principle of legality and the principle of proportionality, since people could be accused of committing a crime without any proof, based on a potential risk determined without objective grounds. These modifications would deeply affect freedom of expression, freedom of association and freedom of movement, leaving citizens uncertain about what could be said or done, since an action would need neither to pose a real risk to public safety nor to directly incite the commission of a terrorist offence in order to be considered as such.

As for article 6 of the Law of 3 August 2016, the modification authorises preventive detention, in cases of absolute necessity for public security, for terrorist offences for which the maximum applicable penalty exceeds five years’ imprisonment, whereas for other offences for which the maximum penalty does not exceed 15 years’ imprisonment, preventive detention is only possible if there are serious reasons to fear that the accused, if left at liberty, would commit new crimes or offences, evade justice, attempt to destroy evidence or collude with third parties. The applicant alleged that the classification as a terrorist offence is not an objective criterion to justify the difference made in relation to other offences.

For its part, the defendant party claimed that the modifications to both articles were founded on the protection of citizens and on the necessity of making the measures and actions to fight terrorist offences more efficient.

The Constitutional Court carried out a thorough examination of articles 2 and 6 of the law in order to assess their compliance with the Belgian constitution, going through legal grounds including not only the Belgian constitution itself, but also international instruments ratified by Belgium (such as the European Charter of Fundamental Rights, the International Covenant on Civil and Political Rights, the Council of Europe Convention on the Prevention of Terrorism or the European Convention on Human Rights), and also decisions of the Council of the European Union in the fight against terrorism and the case law of the European Court of Human Rights.

In its judgement, the Court decided to annul article 2, 3°, of the law of 3 August 2016, after considering several legal obligations, the most relevant being one included in Directive 2017/541/EU of the European Parliament and of the Council of 15 March 2017 on combating terrorism, which says that “Member States shall take the necessary measures to ensure that it is punishable as a criminal offence, when committed intentionally, the dissemination or any other form of making available to the public by any means, whether online or offline, of a message with the intention of inciting the commission of one of the offences listed in Article 3(1)(a) to (i), where such conduct incites, directly or indirectly, for example by glorifying terrorist acts, to commit terrorist offences, thereby creating the risk that one or more of those offences may be committed.” Other paragraphs of the directive also mention the need to create a risk, such as article 10, which states that “such behaviour should be punishable when it creates the risk that terrorist acts could be committed.”

Therefore, based on the absence of this requirement in the modified article 140bis, the Court proceeded to annul article 2.3 of the Law of 3 August 2016.

As for article 6, the Court did not consider it unconstitutional and agreed with the Council of Ministers, holding that the Legislator does not disproportionately infringe the rights of the people concerned, given the special circumstances of terrorist offences, which may require stronger preventive measures than would apply to other criminal offences.

(Contribution by Maria Roson, EDRi Intern)

Read more:

Terrorism Directive: Document

Can we ensure EU terrorism policies respect human rights? (24.01.2018)



30 May 2018

Join the coordinated calls against EU’s Censorship Machine

By Andreea Belu

Several organisations in different European countries have picked up their phones and mobilised against Article 13, which introduces automated filters for user content being uploaded to online platforms. The measure is part of the EU’s proposal for a new Copyright Directive. It poses huge threats to individuals’ rights and freedoms, and obliges online platforms like Google and Facebook to monitor our communications and become the internet police.

The goal of our calls is to convince the key undecided MEPs in the Legal Affairs Committee (JURI) of the European Parliament to oppose a censored internet, filtered by automated algorithms under the control of internet giants.

Why now?

JURI is set to vote on its opinion on the 20th-21st of June. Once JURI concludes its position, it will hold secret talks with the Council of the European Union – which has already adopted its negotiating position – to reach an agreement.

The bad news? The Council of the European Union decided to support mass monitoring and filtering of internet uploads on the 25th of May. The good news? Some MEPs in the JURI Committee oppose the measure, and some are undecided. Unfortunately, the number of MEPs against Article 13 is not yet enough to block the proposal. If we do not convince just a few more to oppose the article too, we will lose the JURI vote. We cannot allow this to happen if we want to keep the internet free from censorship machines.

What now?

While sending emails to MEPs takes less effort than calling them, emails can also be less effective, as they can easily be ignored or deleted by spam filters. Phone calls cannot be ignored, and are therefore the most efficient way to (literally) make our voices heard. Besides, when was the last time you talked to a Member of the European Parliament?

Join the movement against the Censorship Machine!

In the past weeks, organisations in Romania, the Netherlands and the UK have phoned the key undecided MEPs. Next week, campaigns will be launched in Spain, and phone call meetings will take place in Bulgaria. The EDRi office will mobilise and spend an hour on Tuesday the 5th of June contacting targeted MEPs. The SaveYourInternet movement is also planning a call storm on the 12th of June. Are you going to join us in calling the crucial undecided JURI MEPs? Here’s what you can do:

  • Grab your phone and call! Here are some tips on best practices, a suggested call script, and a free calling tool.
  • Gather NGOs and friends for a meeting. We have prepared guidance for organising and conducting meetings. Easy-peasy, trust us!

Get in touch at andreea.belu(at)edri.org for more materials and coordination tips or if you want to share how your call session went.

Read more:

Save Your Internet

Proposed internet filter will strip citizens of their rights: Your action is needed! (28.03.2018)

#CensorshipMachine – How will the decision be taken? (19.03.2018)

5 Devastating Effects of the EU’s Copyright Proposal (29.03.2018)

Copyright Reform: Document Pool


30 May 2018

Gesellschaft für Freiheitsrechte: Legal challenge against Bavarian Police Act

By Gesellschaft für Freiheitsrechte

EDRi observer Gesellschaft für Freiheitsrechte (GFF) is preparing a joint constitutional complaint against the newly passed Bavarian Police Act (PAG), to be brought before the German Constitutional Court, and has started a crowdfunding campaign for the case. In recent weeks, Germany has seen major protests against the Bavarian Police Task Act (#noPAG) – but the law was nevertheless passed by the Bavarian state parliament on 15 May and entered into force on 25 May.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

GFF sees the law as a massive threat to civil rights in Bavaria. Critics have seized especially on a definition shift in the Christlich-Soziale Union (CSU) government’s law, which lowers the threshold for police intervention from “imminent danger” (konkrete Gefahr) to “looming danger” (drohende Gefahr). “Not only does the police get a whole new set of competences to restrict civil rights, but they can also act much earlier. Previously, there were clear requirements as to when the police should be allowed to act, and police action could be tested by administrative courts. In the future, it can hardly be reviewed whether a given situation actually presents a ‘looming danger’”, explains Ulf Buermeyer, chairman of Gesellschaft für Freiheitsrechte. “Now, the police are in fact almost free to intervene at their own discretion”. The law will impact digital rights on a whole range of issues, such as the use of drones to take images of public events, including the specific identification of individuals, and lower legal requirements for the police to use wiretapping or to obtain user data from third parties.

Consequently, the loosening of legal requirements to a mere “looming danger” will be one of the main issues in GFF’s constitutional complaint. A group of lawyers and civil rights groups is preparing the complaint and is currently examining the law for other infringements of civil and human rights. “There are several options for challenging the law in the courts. For us it is important to act as thoroughly as possible. We need a brilliant complaint to be successful”, Buermeyer adds.

GFF and other critics fear that the Bavarian law is only the beginning of a nation-wide change in police legislation, since the newly appointed Federal Minister of the Interior, Horst Seehofer, was previously Bavarian prime minister and is a prominent member of the CSU – the party that drafted and pushed the new law. Accordingly, Seehofer considers the new police powers in Bavaria a blueprint for the rest of the country.

GFF has prepared a synopsis (available only in German) containing the four different versions of the Bavarian Police Act: the one in force before 01.08.2017, the one in force since 01.08.2017, the draft of January 2018, and the changes proposed by the CSU, which entered into force on 25 May.

(Contribution by Gesellschaft für Freiheitsrechte, EDRi member, Germany)

Read more:

GFF: Bavarian Police Act Synopsis (available only in German)

Bavarians protest against vastly extended police powers (16.05.2018)



30 May 2018

Your ePrivacy is nobody else’s business

By Maria Roson

The right to privacy is a fundamental right of every individual, enshrined in international human rights treaties. It is particularly threatened by political and economic interests, with a deep impact on freedom of expression, democratic participation and personal security. The recent Facebook-Cambridge Analytica scandal is a perfect example of the risks that privacy breaches pose to individuals’ rights.


Under the pretext of providing customers with “a better service”, companies often ask, unnecessarily, for permission to exploit users’ communications data and to track them online. In practice, these “requests” often leave users without any real possibility of refusing, as refusal would mean not being allowed to use the service. This is what EDRi member Bits of Freedom calls “tracking walls”. To protect citizens from this and other abusive practices, rules have been developed at EU level, namely the ePrivacy Directive. The Directive was adopted in 2002 and revised in 2009. Now, a new proposal for an ePrivacy Regulation is on the table.

The protection of the right to privacy online in the ePrivacy Regulation should be at the centre of the EU’s priorities. It is therefore important to be aware of the most sensitive issues around ePrivacy, in order to identify when citizens’ rights could be at risk:


Consent

Consent is one of the ways to allow your data to be used legally. Through free and informed consent, users agree that a company may access specific personal information for a specific purpose. Consent drives the trust that new services need, but it has to be meaningful: freely given, specific, informed and explicit, and not the only available choice. For example, accepting abusive permissions “required” by an app, when the only alternative is not using the app at all, is not a valid way of obtaining consent.

Legitimate interest

“Legitimate interest” means that, under exceptional circumstances, it is legal to access personal data without the user’s consent. Communications data – your emails, calls over the internet, chats, and so on – must be treated as sensitive data, as the Court of Justice of the European Union (CJEU) has stated. The “legitimate interest” exception allows only the use of non-sensitive data – such as an email address or a telephone number – so communications data cannot, logically or legally, be processed under this exception. For this reason, companies should under no circumstances be allowed to monetise or otherwise exploit sensitive communications without specific permission.

Given that the scope of the ePrivacy Regulation deals with sensitive data, the legitimate interest exception has no place in it. Any such exception would fatally undermine users’ control over this information. Moreover, it would affect freedom of expression, as users would fear having their communications monitored by companies without their consent.

Offline tracking

Offline tracking is a highly intrusive practice in which you are tracked through your electronic device. The location of your device can be used for unlawful purposes involving sensitive data, revealing personal information about users, particularly when they are in the vicinity of – or inside – certain services or institutions. The European Commission has proposed to allow this offline tracking as long as the individual is notified. However, obtaining information by tracking individual citizens poses severe privacy risks and possibilities for abuse, including the risk of mass surveillance by commercial or law enforcement entities. For these reasons, every update of the ePrivacy rules must consider less intrusive ways to obtain location-based information.

Privacy by design and by default

In the same way that you expect to use a microwave oven without having to think about the risk of it setting your house on fire, your connected devices should protect your privacy by design and by default. Privacy by design is the principle that a high level of user privacy protection is incorporated at every stage of a device’s creation. Privacy by default means that our devices are set to protect our data from the outset, with options to change this if we wish to do so. As the ePrivacy Regulation will be the main framework protecting your communications online, it is important that hardware and software (not only browsers) be designed, at all stages, to protect the privacy of individuals by default, not as an option.

The ePrivacy Regulation is currently being revised in the Council of the European Union, and an aggressive lobbying campaign is trying to shape the Regulation so that big business can exploit personal data more easily. If the lobbyists succeed, the Regulation will become less favourable for protecting citizens and their privacy online – its very purpose. Among the arguments promoted by the lobbyists are that ePrivacy is bad for democracy and for media pluralism, and that it prevents the fight against illegal content. (None of these arguments is actually linked to protecting privacy.) We have busted these myths, as well as the rest of the most common misconceptions about ePrivacy. You can read more about it here: edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

Being aware of what is at risk is the best way to fight back against lobby campaigns that threaten citizens’ rights.

(Contribution by Maria Roson, EDRi Intern)

Read more:

Mythbusting – Killing the lobby myths that are polluting the preparation of the e-Privacy Regulation

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)

Cambridge Analytica access to Facebook messages a privacy violation (18.04.2018)
