Privatised law enforcement

A measure that would be illegal if implemented by a government should also be illegal if implemented by industry as a “voluntary” measure, whether as a result of government pressure, for public relations, or for anti-competitive reasons. However, while key international legal instruments, such as the European Charter of Fundamental Rights and the European Convention on Human Rights, as well as national constitutions, are binding on states and governments, they are not directly applicable to other entities, such as private companies. As a result, there is a major trend towards governments persuading or coercing companies to impose restrictions on fundamental freedoms under the guise of “self-regulation”, thereby circumventing legal protections.

11 Dec 2018

The EU Council’s general approach on Terrorist Content Online proposal: A step towards pre-emptive censorship


On 6 December 2018, the EU Council published its general approach on the proposed Terrorist Content Online Regulation. The Council’s position poses serious risks of violating individuals’ fundamental rights. The approach follows a pattern of rushing to introduce new measures without an appropriate evaluation of their effectiveness or of their consequences for fundamental rights such as privacy and freedom of expression.

“The proposed ‘proactive measures’ are an inevitable step towards pre-emptive censorship,”

said Diego Naranjo, Senior Policy Advisor at EDRi.

“Furthermore, the debate has been rushed, even though there is no evidence that this Regulation is actually needed. For nearly two years, the Council was unable to move the negotiations forward. Now, in only eight weeks, it has achieved a consensus to introduce measures that threaten the open internet and our freedom of expression.”

In its general approach, the Council seems to ignore the fact that the proposed “proactive measures” to tackle terrorist content online lack adequate safeguards for fundamental rights, and that the overbroad definitions in the Regulation’s text create risks of arbitrary removal of online content.

In practice, these measures will mean automated detection tools and upload filters. The Impact Assessment for the Regulation admits that there is “rich literature (…) on the need for specific safeguards for algorithmic decision making, where biases and inherent errors and discrimination can lead to erroneous decisions”. Yet the Regulation contains no specific safeguards that would actually work, since platforms are likely to rely on their terms of service – and not the law – when disabling access to content. This leaves citizens without access to remedies for restoring legal content online. It is likely to affect human rights defenders, investigative journalists, terrorism victims, NGOs researching war crimes and many other individuals based on their perceived activism, political affiliation, gender, race or origin.

We regret that the Council has not taken the time to improve the many issues that need to be reformed in the proposal, and we hope that the EU Parliament resists the pressure from the Commission and Member States, and adopts legislation that protects fundamental rights.


On 12 September 2018, the European Commission proposed yet another attempt to empower the same big tech companies it claims are already too powerful: a draft Regulation on preventing the dissemination of terrorist content online. The proposal encourages private companies to delete or disable access to “terrorist content”.

The Council, while unable in over two years to move forward the negotiations to protect the privacy of EU citizens, has reached in only eight weeks a consensus to introduce “proactive measures” leading to general monitoring obligations.

The debate on the Regulation has been rushed through, despite the lack of evidence of its necessity. Furthermore, the proposal ignores the still incomplete implementation of the existing Directive addressing the spread of terrorist content online.

Read more:

Proposal for a Regulation on preventing the dissemination of terrorist content online – general approach (03.12.2018)

Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online

Press Release: EU Terrorism Regulation – an EU election tactic (12.09.2018)

Commission’s position on tackling illegal content online is contradictory and dangerous for free speech (28.09.2017)


05 Dec 2018

Digital rights as a security objective: Fighting disinformation

By Yannic Blaschke

Violations of human rights online, most notably the right to data protection, can pose a real threat to electoral security and fuel societal polarisation. Yet, we are still missing a comprehensive understanding that commercial and political misuse of digital tools are two sides of the same coin – and that we can neither afford to weigh digital rights against economic benefits, nor to encroach on citizens’ liberties through ever-new ways of filtering and monitoring. In this series of blogposts, we’ll explain the relationship between digital rights and security objectives. The first part of the series focuses on the role of strong privacy regimes in resilience against disinformation.


The Cambridge Analytica scandal, which exposed the use of information from millions of Facebook users for a perverse form of granular political micro-targeting, revealed that the tracking, monitoring and assessment of citizens’ online behaviour could be used for more than stalking people with commercial advertising. In a much more sinister twist, we had to realise that targeting individuals with pinpoint displays of advertising, based on inferences from their personal data, opened the door for election interference and disinformation campaigns on an unprecedented scale.

Moreover, motivating one’s most dedicated followers with glorifying information about one’s own candidacy or organisation, while simultaneously targeting people more inclined to support another movement in order to suppress their voting intention, suddenly came into the realm of the technically possible. To a great extent, such practices exploit and amplify the already worrying tendency of social networks and internet platforms to create “echo chambers”, in which citizens are only presented with news and content that mirrors their own political views and does not reflect the diversity of opinions within societal debate. When such echo chambers are targeted with dedicated political messages, their already explosive societal potential is set aflame.

In October 2018, the European heads of state called on the European Commission to “protect the Union’s democratic systems and combat disinformation, including in the context of the upcoming European elections, in full respect of fundamental rights”. Ironically, many of these same governments are currently blocking, in the negotiations in the Council of the European Union, the ePrivacy proposal – the exact Regulation that was designed to prevent the most pervasive forms of commercial surveillance, such as online tracking and snooping on emails and chat messages. Even big technology corporations are starting to realise this dangerous potential. However, there seems to be remarkably little reflection in EU policy circles that it is ultimately the commercialisation of personal data on the internet that has brought about these developments. Everyone seems to agree that disinformation should be tackled, yet the same online tracking, clickbait news and social media profiling that are used to create disinformation and radicalisation should apparently not be regulated more tightly, so as not to harm the “European” (in fact mostly US-based) data economy.

Tracking cookies are one of the primary tools of behavioural targeting on the internet, including for political purposes, and metadata constitutes one of the most sensitive and easiest-to-process forms of data. Not dealing with these issues while pretending to care about disinformation is absurd.

To stop medicating the symptoms and tackle the root cause of the problem instead, a strong privacy regime must thus finally be considered a measure of individual and public safety.

Council continues limbo dance with the ePrivacy standards (24.10.2018)

Five Reasons to be concerned about the Council ePrivacy draft (26.09.2018)

EU Council considers undermining ePrivacy (25.07.2018)

Your ePrivacy is nobody else’s business (30.05.2018)

e-Privacy revision: Document pool (10.01.2017)

(Contribution by Yannic Blaschke, EDRi intern)



05 Dec 2018

Civil society calls on Council to adopt ePrivacy now


EDRi has joined a letter from 30 representatives of civil society and the online industry, addressed to the Ministers in the Telecoms Council, to express the wide support for the ePrivacy Regulation. The letter describes the clear and urgent need to strengthen the privacy and security of electronic communications in the online environment, especially in the wake of repeated scandals and practices that undermine citizens’ right to privacy and their trust in online services.

The support from privacy-friendly businesses such as Qwant, Startpage, Startmail, TeamDrive, Tresorit, Tutanota, ValidSoft and WeTransfer shows the positive implications that ePrivacy will have for a dynamic and innovative European internet industry. The collaboration between organisations defending citizens’ rights and industry representatives underlines that both EU citizens and privacy-friendly business models have much to gain from a strong ePrivacy Regulation.

EDRi wholeheartedly supports the coalition’s call on the Council of Ministers to finally move the ePrivacy discussion forward, so that a compromise with the European Parliament can be found before the elections in May 2019. If this is achieved, European citizens will benefit from a strong privacy regime and a less intrusive, more dynamic and more innovative EU data economy.

You can find the letter here.

Open letter to EU member states from consumer groups, NGOs and industry representatives in support of the ePrivacy Regulation (03.12.2018)

ePrivacy review: document pool

Council continues limbo dance with the ePrivacy standards (24.10.2018)



05 Dec 2018

Complaints: Google infringes GDPR’s informed consent principle

By Yannic Blaschke

On 27 November 2018, seven members of the European Consumer Organisation (BEUC) launched complaints with their national Data Protection Authorities (DPAs) about Google potentially infringing the General Data Protection Regulation (GDPR). The complaints address an abusive business practice that is unfortunately widespread on the internet: designing websites and interfaces in a way that makes turning off privacy-intrusive settings much harder than turning them on.


In the case of Google’s processing of location data, BEUC has, for instance, identified the hiding of rejection options in remote corners of the settings, and the way the click flow guides customers through the setup of a Google product, as infringements of the GDPR’s “informed consent” principle. The design of so-called “choice architectures” – the way in which different choices are presented to an individual – is, in fact, not unique to the digital world. Originating in behavioural science, the so-called “nudging” of people towards certain (ideally benign) behaviours is, on the contrary, frequently applied in very different spheres, ranging from architecture to social policy. But while the ethics of nudging are already heatedly debated when it comes to practices that supposedly benefit individuals, choice architectures designed to lure citizens into agreeing to terms they are not fully informed about seem like nothing but intentional deception.

What makes this all worse in the online context is that surreptitiously guiding individuals to accept invasive privacy settings, without informing them of their options to object, is only the first nudge of many: from online shopping to political debates, a vast set of economic and political actors has a deep interest in subtly pushing what citizens buy, do, see and say online. Being misled in our privacy choice architectures is therefore only the start of being misled by more and more nudges that are often, without our agreement, personalised to us.

Cases such as Google’s practices regarding location data show that the invasiveness and micro-targeting of the online tracking industry begin with the lack of transparency and the asymmetry of the information and choices presented to individuals on the internet. Citizens’ anxieties and fears of manipulation through malicious practices will ultimately drag responsible and irresponsible businesses down alike: in recent Eurobarometer surveys, 67% of internet users were concerned that the personal data people leave on the internet is used to target the political messages they see, and 40% of citizens avoid certain websites because they are worried about their activities being monitored.

If trust in the Digital Single Market is not restored through fair and respectful business models that rely on informed and meaningful consent, the European data economy will stay below its potential. It is therefore high time, and of crucial importance, that the Data Protection Authorities that received BEUC’s complaints now send a clear signal that, under the GDPR, manipulation is not, and can never be, informed consent.

NCC publishes a report on tech companies’ use of “dark patterns” (27.06.2018)

My Data Done Right launched: check your data! (07.11.2018)

The GDPR Today – Stats, news and tools to make data protection a reality (25.10.2018)

GDPR explained

(Contribution by Yannic Blaschke, EDRi intern)



05 Dec 2018

Net Neutrality vs. 5G: What to expect from the upcoming EU review?


Since 2016, the principle of net neutrality has been protected in the European Union (EU). Net neutrality is a founding principle of the internet. It ensures the protection of the right to freedom of expression, the right to assembly, the right to conduct business, and the freedom to innovate on the internet. These protections came about in no small part due to the work of civil society. A coalition of 23 NGOs worked together for over three years to convince politicians and regulators of the importance of net neutrality. This victory may be called into question when the Body of European Regulators for Electronic Communications (BEREC) reviews its net neutrality guidelines in 2019.


The net neutrality protections in the EU consist of two layers: a legal one and a regulatory one. The legal basis for the protections is part of an EU Regulation, which takes precedence over national law and is directly applicable in all Member States as well as in the three further countries of the European Economic Area (EEA): Norway, Iceland and Liechtenstein. This Regulation gives the independent national telecom regulators the power and the mandate to protect net neutrality in their respective countries. To ensure that these 31 independent regulators apply the Regulation uniformly throughout the EU and EEA, they must take “utmost account” of the net neutrality guidelines issued by BEREC, the European umbrella organisation of all telecom regulators. These BEREC net neutrality guidelines give very detailed recommendations on what net neutrality actually means in Europe.

The Regulation prescribes that the European Commission has to submit an evaluation report by 30 April 2019. For this purpose, it has outsourced part of the work to the consultancy firm Ecorys and the law firm Bird & Bird, which is known for assisting the telecom industry in resisting net neutrality protections. This has led to the peculiar situation that regulators and civil society have to answer questions about the strengths and weaknesses of the Regulation to the same office of Bird & Bird that EDRi member Bits of Freedom faces in court in a case based on that very Regulation. Several NGOs, including EDRi and others, have sent an open letter to the Commission pointing out this conflict of interest, but the Commission has not fully addressed these concerns.

It seems the European Commission will not reopen the Regulation, considering the elections to the European Parliament in May 2019 and the following reshuffle of political power in the European Union. On the other hand, BEREC conducts itself transparently and has already announced on several occasions that it intends to review its net neutrality guidelines. Since the release of BEREC’s draft work programme for 2019, we also know that this review is planned to start in 2019 and will lead to new draft guidelines, followed by a public consultation process in late September 2019.

What is to be expected of this review? The telecom industry has made clear what they want to talk about: 5G. The new mobile network standard has not even been fully specified, but it is already the biggest talking point of the telecoms industry and is used to call into question existing net neutrality protections around the world. With the US having stepped away from their 2015 Open Internet rules, Europe is now the first major world region that tries to bring 5G in line with net neutrality. This debate has a technological and political side. Technologically, 5G brings a new option for telecom operators to deepen their control over the information flow. It is called “network slicing” and it brings differentiated Quality of Service policies to the radio access network. The scenarios range from preferential treatment for premium subscribers at the expense of everyone else to a complete segmentation of the internet with granular control of the network over every application.

Whether the application of this new industry standard has to follow existing telecom law or whether the law should adapt to the standard ought to be an easy question to answer, but this might not be the case. For example, at the recent global Internet Governance Forum (IGF) in Paris, representatives of telecom giants Vodafone and AT&T strongly argued for loosening existing net neutrality rules in order to make a 5G-rollout more economically viable.

5G offers providers far more control when it comes to giving preferential treatment to individual applications or internet subscribers, but it also brings interesting new features like specifying a low energy network slice, which could be used for example by solar powered Internet of Things (IoT) devices. BEREC has taken it upon itself to tackle this issue upfront. This means that regulators will decide if the strong protections against the abuse of exceptions to the net neutrality in the Regulation (so called “specialised services”) will be upheld.

If Europe follows the push of the telecom industry to water down the implementation and enforcement of its net neutrality rules and allows a two-tiered internet system built on a sliced up 5G network, this could have serious ripple effects in the rest of the world. Should the US and Europe allow 5G to become the exception to net neutrality, the end of the open internet will become a question of the roll-out of the next mobile network technology.

The final guidelines in Europe come quite close to the US 2015 Federal Communications Commission (FCC) Open Internet rules. Particularly regarding the issue of zero-rating, the guidelines do not offer a so-called “bright-line rule”. This means that the final decision on the legality of commercial offers that discriminate between applications based on price is left to the regulator. While zero-rating offerings are now on the market in all but one European country, not a single regulator has prohibited such an offer. Low-income and young internet users in particular are affected by this strong incentive to only use well-established internet services.

The reform of the net neutrality guidelines should tackle this issue. The Regulation clearly states that there are cases in which regulators have to intervene against zero-rating. While a bright-line rule banning zero-rating would be the best possible outcome, at the very least more guidance has to be given to regulators when it comes to different forms of economic discrimination that are clearly harmful to end-user rights. EDRi member will publish a report on the net neutrality situation in Europe in early 2019, including a mapping of zero-rating offers in the European market.

This article is an adaptation of an article previously published by EDRi member

BEREC Guidelines on the Implementation by National Regulators of European Net Neutrality Rules

How we saved the internet in Europe (04.10.2016)

(Contribution by EDRi member, Austria)

05 Dec 2018

EDRi members in joint protest against “surveillance zone” in Saxony

By Digitalcourage

A new proposal for a surveillance law in the German state of Saxony is threatening to lead to abhorrent consequences on a stretch of Germany’s international border. The draft law is part of a wave of drastic police law reforms that are being discussed in 15 of the 16 federal states of Germany.


The police law proposed by the state government of Saxony is particularly concerning, with plans such as increasing the use of police informers, allowing communications interception and inhibition, and arming the police with hand grenades and machine guns. With the stated aim of controlling organised crime, Saxony also intends to allow video surveillance with biometrics technology – such as automatic face recognition – along the two international sections of the state’s border, neighbouring the Czech Republic and Poland. This would not only apply at the border itself, but anywhere up to 30 kilometres away from it. The Pirate group at the University of Dresden, in the Saxon capital, computed that this zone would cover just over half of Saxony, not just a quarter or a third as proponents of the measure had claimed.

Protests against the draft police law are being led by a local alliance of organisations and individuals. EDRi member Digitalcourage supports this alliance and has teamed up with fellow EDRi members Iuridicum Remedium (IuRe) from the Czech Republic and Panoptykon Foundation from Poland to voice their concerns. Digitalcourage submitted a detailed response to the draft law to Saxony’s state parliament, describing the surveillance plans as “a statement of mistrust against our Czech and Polish neighbours” and pointing to the state’s constitution which in article 12 “calls for cross-border regional cooperation – and not preemptive, automatised surveillance.” Digitalcourage “observes with great concern that with these changes, the Saxon police and judiciary will take on characteristics of a ‘preemptive state’. The plans for preemptive telecommunications and video surveillance and data processing powers will shift the focus of police work from investigation to surveillance.”

IuRe and members of the Czech Pirate Party have raised their concerns with Czech Foreign Minister Tomáš Petříček. IuRe lawyer Jan Vobořil wrote in a press release that “as Czech citizens [IuRe] perceives the plans for camera systems along the border and 30 km into Germany as a threat […] It is impossible to ignore that this is a comprehensive incursion into the basic rights of every cross-border traveller.”

Wojciech Klicki, legal analyst at Panoptykon Foundation, wrote in support of Digitalcourage’s statement: “The use of face recognition technology is a sign of treating everyone as a potential suspect. This proposal also demonstrates a lack of trust between respective Polish and German police forces. Collecting personal data can be justified only in exceptional, specific cases. In other cases, it is essentially a tool of mass surveillance of local communities and incomers/commuters from Poland.”

Digitalcourage is calling on the Saxon government to stop the legislative process for this proposal, which has disregarded necessary assessments – including a privacy impact assessment of the border surveillance – and which carries clear risks of fundamental rights violations.


Saxon police law: Czech, Polish and German criticism on planned facial recognition on the border (only in German)

Press release: Saxon police law: Czech, Polish and German criticism on planned facial recognition on the border (only in German)

Comments submitted to Parliament (only in German)

Comments submitted to Saxon parliament by Amnesty International, Saxony branch (only in German)

Cross-border criticism of facial recognition in Saxony (only in German, 12.11.2018)

Press release: Saxon police planning to install facial recognition cameras to the border with the Czech Republic (only in Czech, 26.10.2018)

Smart cameras at the Polish-German border (only in Polish, 13.11.2018)

(Contribution by Sebastian Lisken, EDRi member Digitalcourage)



05 Dec 2018

Germany: New police law proposals threaten civil rights

By Digitalcourage

The number of new police laws in Germany has increased in recent months. Several states have introduced changes to their police laws that all follow the same line: the police need more means and powers to combat terrorism. Advocacy around these laws is complicated by the country’s federal system, in which each federal state has its own police force and governing laws.


These new law proposals are introduced under the pretext of increasing security, but they may in fact result in stripping away important protections and safeguards against governmental repression. With the exception of one federal state, all regional governments are pushing for drastic surveillance measures: governmental hacking via state trojans, more video surveillance, more police controls in public spaces, lifelong restraining orders, weeks of imprisonment without legal aid, and arming the police forces with machine guns. The biggest threat to our rights and freedoms, though, is a paradigm shift underlying all these new police laws: the required condition for surveillance measures is changing from a concrete suspicion to an “impending threat”. This takes away a fundamental principle of the rule of law: people will no longer know by which behaviour they can avoid being targeted by police measures. The vagueness of the terms and the lack of any requirement for reasonable suspicion increase the risks of arbitrariness in the use of police force.

All across Germany alliances have formed to stop these infringements on our rights and freedoms. EDRi member Digitalcourage has taken an active role in several of these alliances, with contributions to state parliament hearings, online petitions and other supporting work. Tens of thousands of people have protested in the state capitals of Munich (Bavaria), Düsseldorf (North Rhine-Westphalia) and Hanover (Lower Saxony). In Lower Saxony and North Rhine-Westphalia the governments’ plans have been postponed. Next steps include legal action. The protest alliances are supported by various civil society organisations and movements: antifascists and liberals, data protection activists and football fans, environmentalists and lawyers. It is an unusually broad alliance of groups that are aware how deeply these laws would affect our everyday lives.


Digitalcourage: articles on police law reforms in Germany (only in German)

“Against police laws and interior armament” – blog article (only in German)

“Against police laws and interior armament” – petition form (only in German)

(Contribution by Kerstin Demuth and Sebastian Lisken, EDRi member Digitalcourage, Germany)



05 Dec 2018

Growing concerns on “e-evidence”: Council publishes its draft general approach


On 30 November 2018, the Council of the European Union published a draft text for its general approach on the proposal for a regulation on European Production and Preservation Orders in criminal matters – also known as “e-evidence”. The text is to be adopted by EU Member States, represented in the Council.

The initial proposals of the European Commission already raised concerns regarding the fundamental rights to privacy and the protection of personal data, and these concerns are growing with the new text. For instance, the text reveals a severe deterioration of the few provisions that were meant to safeguard these fundamental rights (see, for example, the deletions in Recital 55 and Articles 14.4.f and 14.5.e). Consequently, on 5 December 2018, 18 civil society organisations sent a letter urging EU Member States to oppose the adoption of the draft general approach and seriously reconsider the position of the Council.

You can read the letter here (pdf) and below:

Civil society urges Member States to seriously reconsider its draft position on law enforcement access to data or “e-evidence”

Dear Madam/Sir,

We are writing on behalf of 18 civil society organisations from across Europe and beyond. In view of the upcoming Council meeting regarding the draft Regulation on European Production and Preservation Orders, we urge you to oppose and seriously reconsider the draft general approach. We join the eight Member States that wrote to the European Commission and the Austrian Presidency asking to take into account input from stakeholders, including civil society.

The “compromises” presented by the Austrian Presidency fail to solve the fundamental concerns of the “e-evidence” proposals. For example, the text

  • greatly reduces the possibility for enforcing authorities to refuse recognition and enforcement of an order on the basis of a violation of the Charter of Fundamental Rights;
  • wrongly assumes non-content data is less sensitive than content data, contrary to case law of the Court of Justice of the European Union (CJEU) and the European Court of Human Rights (ECtHR) – notably the CJEU Tele 2 judgment (cf. para.99) and the ECtHR’s case Big Brother Watch and others v. UK (cf. para.355-356);
  • contemplates the possibility to issue orders without court validation, disregarding what the CJEU has consistently ruled, including in its Tele 2 judgment (para. 120);
  • does not provide legal certainty; and
  • undermines the role of executing states, thereby undermining judicial cooperation.

Civil society is not alone in raising serious concerns. Similar views have been expressed by the European Data Protection Board (EDPB), judges’ associations such as the German Association of Judges, companies such as internet service providers, academics, Bar Associations, and the Meijers Committee, among many others.

We value the role of law enforcement to protect society and understand the need for law enforcement authorities to perform their duties effectively. However, efficiency should not be achieved at the expense of weakening fundamental rights, legal safeguards and judicial cooperation.

We thank you in advance for your time and consideration.

Kind regards,

European Digital Rights (EDRi)
Access Now
Centre for Democracy and Technology – CDT
Chaos Computer Club (Germany)
Council of Bars and Law Societies of Europe – CCBE (Sweden)
Electronic Frontier Foundation – EFF
Electronic Frontier Norway – EFN (Norway)
Fair Trials
Förderverein Informationstechnik und Gesellschaft e. V. – FITUG (Germany)
Fundamental Rights European Experts Group – Free Group
Homo Digitalis (Greece)
IT-Pol (Denmark)
La Quadrature du Net (France)
Privacy International
Vrijschrift (Netherlands)
X-net (Spain)

Civil society letter urging Member States to seriously reconsider their draft position on law enforcement access to data or “e-evidence” (05.12.2018)

Draft Council general approach on “e-evidence” (30.11.2018)

Letter of eight Member States to the European Commission and the Austrian Council Presidency on “e-evidence” (20.11.2018)

EU “e-evidence” proposals turn service providers into judicial authorities (17.04.2018)

Independent study reveals the pitfalls of “e-evidence” proposals (10.10.2018)

(Contribution by Chloé Bérthélemy, EDRi intern, and Maryant Fernández Pérez, EDRi)



05 Dec 2018

Terrorist Content Regulation: Civil rights groups raise major concerns


On 4 December 2018, a coalition of 31 civil society organisations published a letter that raises significant concerns regarding the proposal for a Regulation to prevent the dissemination of terrorist content online. The letter was addressed to the EU Member States’ Home Affairs Ministers, ahead of their meeting on 6 December.

While the undersigned organisations support the fight against terrorism as an important and legitimate goal for public policymakers, the measures contained in the Regulation risk introducing arbitrariness and upsetting the balance needed to safeguard civil liberties. In particular:

  • The definitions in the proposal are too broad and should be brought in line with the current Terrorism Directive.
  • The proposed proactive measures are not transparent and lack accountability.
  • The “competent authorities” who will have the right to issue removal orders need to be clearly defined to ensure their independence.
  • The proposed referrals risk undermining the rule of law by relying on the vague terms of service of internet companies.

As the Regulation in its current state risks seriously undermining the EU fundamental rights framework, EDRi and the undersigned organisations call for a far-reaching reform of the proposal.

Letter on the proposal for a Regulation to prevent the dissemination of terrorist content online (04.12.2018)

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)

EU’s flawed arguments on terrorist content give big tech more power (24.10.2018)

Joint Press Release: EU Terrorism Regulation – an EU election tactic (12.09.2018)

EU Parliament’s anti-terrorism draft Report raises major concerns (10.10.2018)


04 Dec 2018

Serbian Data Protection Commissioner: NGOs call for transparency

By SHARE Foundation

Today, on 4 December, eight digital rights organisations from across Europe sent a letter to the National Assembly of Serbia, asking for a transparent process for selecting the country’s new Data Protection Commissioner. The mandate of the current Commissioner for Information of Public Importance and Personal Data Protection of Serbia expires soon. Given that the newly adopted Law on Personal Data Protection will begin to apply in August 2019 and that the Law on Free Access to Information of Public Importance is being reformed, it is of high importance that the new Commissioner be appointed as soon as possible, through a transparent process in accordance with the law, and that the best candidate be given the position.

The letter invites the Culture and Information Committee of the National Assembly of Serbia to:

  • start the procedure for the election of a new Commissioner as soon as possible;
  • make the procedure for selecting the best candidate for the position transparent;
  • determine the legal conditions for selection, giving priority – in addition to general expertise and experience in the protection and promotion of human rights – to candidates with specific expertise and experience in freedom of information and personal data protection;
  • conduct interviews with the best candidates at a session open to the public, in order to deliver a reasoned decision on the proposal to the National Assembly;
  • justify the proposed decision on the choice of the best candidate against each of the set conditions.

The organisations called upon the National Assembly, which appoints the Commissioner, to ensure the highest standards in the selection and appointment of the new Commissioner, in order to respect the foundations of a free, innovative and open digital society and to deliver the best data protection standards in Serbia, in line with the European Union General Data Protection Regulation (GDPR) and Convention 108 of the Council of Europe.

General Data Protection Regulation: Document pool

Letter to the National Assembly of Serbia (04.12.2018)

(Contribution by EDRi member SHARE Foundation, Serbia)