30 Apr 2019

Two years of net neutrality in Europe – 31 NGOs urge the EU to guarantee non-discriminatory treatment of communications

By Kirsten Fiedler

Today, on 30 April 2019, two years after the new net neutrality rules came into force, the EU Commission published its Report to evaluate their implementation in Europe.

Unfortunately, the Report does not give the in-depth analysis of the situation that one could have hoped for. It ignores the fact that respect for net neutrality protections has deteriorated across the EU in recent years, as shown by a recent study carried out by EDRi member Epicenter.works.

31 human and civil rights organisations therefore addressed an open letter to the EU Commission and the Body of European Regulators for Electronic Communications (BEREC).

The letter points out that it is now more critical than ever for the EU institutions to ensure that users and online businesses in Europe benefit from equal and non-discriminatory treatment of traffic in the provision of internet access services. The upcoming review of BEREC’s net neutrality guidelines will be an opportunity to work towards a true Digital Single Market that protects and promotes an open, neutral and non-discriminatory access to the internet.

EDRi, along with the other signatories of the letter, will continue to closely monitor and contribute to the review process of the guidelines for a harmonised implementation, which will start in the second half of 2019.

You can download the letter here (pdf) or read it below.

European Commission
DG for Communication Networks, Content and Technology
Directorate B: Electronic Communications Networks and Services
Unit B2: Implementation of the Regulatory Framework
1049 Bruxelles/Brussel

Body of European Regulators for Electronic Communications (BEREC)
Zigfrīda Annas Meierovica bulvāris № 14, 2nd floor
LV-1050 Rīga
Republic of Latvia

30 April 2019

Joint statement on the publication of the European Commission’s evaluation report on Europe’s net neutrality rules

Today, the European Commission published its implementation report on the EU’s net neutrality rules contained in Regulation (EU) 2015/2120 laying down measures concerning open internet access.

Net neutrality is one of the central reasons for the success of the internet and the foundation of its technological structure. Net neutrality is crucial for innovation, competition and for the free flow of information by allowing internet traffic to move freely without discrimination. Most importantly, a neutral and open access, as safeguarded in Article 3 of the Regulation, gives the internet its ability to generate new means of exercising fundamental rights such as the freedom of expression and the right to receive and impart information – without interference by telecom companies.

The undersigned organisations therefore welcome the European Commission’s decision to uphold the EU’s net neutrality legislation. While we welcome certain positive elements in the report, such as the Commission’s stated aim to protect European internet users and the release of the underlying study by Bird & Bird, many obvious problems have been ignored or overlooked: the market entry barriers created by class-based zero-rating offers, which particularly affect cross-border content and application providers in the Digital Single Market; the ongoing throttling of certain applications by telecom operators; and the complete lack of dissuasive and proportionate penalty provisions in several member states.

The undersigned organisations feel that, regardless of the brief evaluation period, the report falls short of an in-depth analysis and we are disappointed that the Commission did not put more efforts into a substantive, evidence-based report.

New barriers to enter the market

The most significant issue, which is unfortunately not covered in the report, is the lack of enforcement as regards commercial practices such as differential pricing offers (zero-rating) which undermine the rights of end-users.

A recent study by EDRi member epicenter.works found that since the EU’s net neutrality rules came into effect, the discriminatory practice of zero-rating has spread to all but two EU countries, with a total of 186 cases in Europe. Among the top 20 applications and services that receive preferential treatment, 15 are from the US and only three are based in the EU. Because applications and services from EU Member States other than the country where the telecom service is offered are rarely zero-rated, telecom companies have created new market entry barriers1; the implementation has thus weakened European applications and services and led to further fragmentation of the European Digital Single Market.

Lack of harmonisation

The undersigned organisations firmly believe that an in-depth analysis of the state of harmonisation as regards the implementation of Regulation (EU) 2015/2120 is necessary.

Three serious shortcomings were highlighted to the Commission, BEREC and other experts but unfortunately not covered by the Commission’s implementation report:

  1. A majority of national regulators have not implemented effective and dissuasive penalties (BG, CY, DE, DK, EE, ES, FI, GR, HR, IE, IT, LU, LV, NO, PT, SE, SI)2 as required by Article 6 of the Regulation;
  2. There have been contradicting decisions by national regulators (notably regarding congestion management as well as port blocking, which is critical for consumers to deploy self-hosted and decentralised email servers and service providers that rely on the digital single market);
  3. Many national regulators are not compliant with their annual reporting obligations and, most crucially, only eight regulators report figures on the development of internet speeds that should meet increasing demands3.

Finally, we remain hopeful that BEREC’s work to review the Guidelines will lead to a more efficient and harmonised implementation. An essential part of this work will be to offer further guidance to national regulators when assessing differential pricing practices and their effect on material infringements of end-user rights and on the cross-border provision of online services. Most importantly, Europe will lead the way in clarifying which changes, if any, are needed in the net neutrality framework to accommodate the upcoming 5G mobile network standard. An evidence-based discussion will be of utmost importance in this matter.

We remain at the Commission’s and BEREC’s disposal for any support and expertise that we can provide to work towards a true Digital Single Market that protects and promotes an open, neutral and non-discriminatory access to the internet.


AccessNow (International)
Alternatif Bilisim (AIA, Turkey)
Asociația pentru Tehnologie și Internet (ApTI, Romania)
Bits of Freedom (Netherlands)
Chaos Computer Club (CCC, Germany)
Chaos Computer Club Wien (C3W, Austria)
Code4Romania (Romania)
Defesa dos Direitos Digitais (D3, Portugal)
Digitalcourage (Germany)
Digitale Gesellschaft e. V. (Germany)
Digital Rights (Ireland)
Electronic Frontier Norway (EFN, Norway)
Epicenter.works (Austria)
European Digital Rights (EDRi, Belgium)
Fitug e. V. (Germany)
Föreningen för digitala fri- och rättigheter (DFRI, Sweden)
Frënn vun der Ënn (Luxemburg)
Hermes Center (Italy)
Homo Digitalis (Greece)
IT-Pol (Denmark)
Iuridicum Remedium (IuRe, Czech republic)
Liga voor Mensenrechten (Belgium)
Net Users’ Rights Protection Association asbl (NURPA, Belgium)
OpenMedia (International)
Open Rights Group (United Kingdom)
Reporters Without Borders (International)
quintessenz (Austria)
SHARE (Serbia)
vibe.at (Austria)
Wikimedia Deutschland e. V. (Germany)
Xnet (Spain)

1 Report: The Net Neutrality Situation in the EU, pages 21-29: https://epicenter.works/document/1522

2 Report: The Net Neutrality Situation in the EU, pages 13-15: https://epicenter.works/document/1522

3 Report: The Net Neutrality Situation in the EU, pages 9-13: https://epicenter.works/document/1522

Net neutrality wins in Europe! (29.08.2016)

Commission report on open internet

A study evaluates the net neutrality situation in the EU (13.02.2019)

Net Neutrality vs. 5G: What to expect from the upcoming EU review? (05.12.2019)

24 Apr 2019

What the YouTube and Facebook statistics aren’t telling us

By Bits of Freedom

After the recent attack against a mosque in New Zealand, the large social media platforms published figures on their efforts to limit the spread of the video of the attack. What do those figures tell us?

Attack on their reputation

Terrorism presents a challenge for all of us – and therefore also for the dominant platforms that many people use for their digital communications. These platforms had to work hard to limit the spread of the attacker’s live stream, if only to limit the damage to their reputation.

And that, of course, is why companies like Facebook and YouTube published statistics afterwards: to show that the task was very complex, but that they had done their utmost. YouTube reported that a new version of the video was uploaded every second during the first hours after the attack. Facebook said that it blocked one and a half million uploads in the first 24 hours.

Figures that are virtually meaningless

Those figures might look nice in the media but without a whole lot more detail they are not very meaningful. They don’t say much about the effectiveness with which the spread of the video was prevented, and even less about the unintended consequences of those efforts. Both platforms had very little to say about the uploads they had missed, which were therefore not removed.

In violation of their own rules

There’s more the figures do not show: How many unrelated videos have been wrongfully removed by automatic filters? Facebook says, for example: “Out of respect for the people affected by this tragedy and the concerns of local authorities, we’re also removing all edited versions of the video that do not show graphic content.” This is information that is apparently not in violation of the rules of the platform (or even the law), but that is blocked out of deference to the next of kin.

However empathetic that might be, it also shows how much our public debate depends on the whims of one commercial company. What happens to videos of journalists reporting on the events? Or to a video by a victim’s relative, who uses parts of the recording in a commemorative video of her or his own? In short, it’s very problematic for a dominant platform to make such decisions.

Blind to the context

Similar decisions are already taken today. Between 2012 and 2018, YouTube took down more than ten percent of the videos of the Syrian Archive account. The Syrian Archive is a project dedicated to curating visual documentation relating to human rights violations in Syria. The footage documented those violations as well as their terrible consequences. YouTube’s algorithms only saw “violent extremism”, and took down the videos. Apparently, the filters didn’t properly recognise the context. Publishing such a video can be intended to recruit others to armed conflict, but can just as well be documentation of that armed conflict. Everything depends on the intent of the uploader and the context in which it is placed. The automated filters have no regard for the objective, and are blind to the context.

Anything but transparent

Such automated filters usually work on the basis of a mathematical summary of a video. If the summary of an uploaded video is on a list of summaries of terrorist videos, the upload is refused. The dominant platforms work together to compile this list, but they’re all very secretive about it. Outsiders do not know which videos are on it. Of course, that starts with the definition of “terrorism”. It is often far from clear whether something falls within that definition.

The definition also differs between countries in which these platforms are active. That makes it even more difficult to use the list; platforms have little regard for national borders. If such an automatic filter were to function properly, it would still block too much in one country and too little in another.
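As a rough illustration, the list-based matching described above can be sketched in a few lines of Python. The blocklist entry and placeholder payload below are hypothetical, and real systems use perceptual hashes designed to survive re-encoding; the cryptographic hash used here for simplicity changes completely if a single byte of the file changes, which is exactly why slightly edited re-uploads can slip past such filters.

```python
import hashlib

# Hypothetical blocklist: the shared "list of summaries" of known videos.
# (This entry is the SHA-256 of the placeholder payload b"test".)
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def summarize(video_bytes: bytes) -> str:
    """Compute the 'mathematical summary' of an upload (here: SHA-256)."""
    return hashlib.sha256(video_bytes).hexdigest()

def is_blocked(video_bytes: bytes) -> bool:
    """Refuse the upload if its summary appears on the shared list."""
    return summarize(video_bytes) in BLOCKLIST
```

In this sketch, the exact payload `b"test"` is refused, while any altered copy produces a different summary and passes; perceptual hashing narrows that gap, but the matching still has no notion of who uploaded the file or why.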

Objecting can be too high a hurdle

As mentioned, the published figures don’t say anything about the number of videos that were wrongfully removed. Of course, that number is a lot harder to measure. Platforms could be asked to provide the number of objections to a decision to block or remove content, but those figures would say little. That’s because the procedure for such a request is often cumbersome and lengthy, and often enough, uploaders will just decide it’s not worth the effort, even if the process would eventually have let them publish their video.

One measure cannot solve this problem

It’s unlikely that the problem could be solved with better computers or more human moderators. It just isn’t possible to serve the whole world with one interface and one moderation policy. What is problematic is that we have allowed an online environment to emerge that is dominated by a small number of platforms which today hold the power to decide what gets published and what doesn’t.

What the YouTube and Facebook statistics aren’t telling us (18.04.2019)

What the YouTube and Facebook statistics aren’t telling us (only in Dutch, 08.04.2019)

(Contribution by Rejo Zenger, EDRi member Bits of Freedom; translation to English by two volunteers of Bits of Freedom, one of them being Joris Brakkee)

24 Apr 2019

Facebook Custom Audience illegal without explicit user consent

By Guest author

Online shops and marketers routinely share customer data with Facebook to reach them with targeted advertising. It turns out that in many cases this is illegal. In a ground-breaking decision, a German Data Protection Authority (DPA) recently ruled that matching customers’ email addresses with their Facebook accounts requires their explicit consent.

Cold medicine when you catch the flu, outdoor clothing when you want to go hiking, diapers after you searched for baby care – targeted advertising on Facebook is everywhere. What many users don’t understand is how exactly advertisers target them on Facebook.

Facebook’s Custom Audience tool is one of many ways in which advertisers can find specific audiences on the platform. The tool allows them to get their message through to people they already know, such as clients from their online shops or subscribers of their newsletters. It is one of the foundations of Facebook’s billion-dollar advertising business. It is also illegal, the way it is often used today.

Here’s how Custom Audience works: Advertisers upload a list with customer contact information like email addresses or phone numbers. Facebook then matches these with its own data to identify the desired audience. “In none of the cases we investigated, had companies informed their users, subscribers or customers that their contact information will be shared with Facebook”, explained Kristin Benedikt, head of the internet division at the Bavarian Data Protection Authority, in an interview with netzpolitik.org. Her office recently banned advertisers from using the tool and uploading people’s data to Facebook without explicit user consent. The Higher Administrative Court of the federal state of Bavaria upheld the decision in late 2018, after an online shop had appealed it.

“We are certain that Facebook obtains additional information about users from matching email addresses, regardless of whether a person is already registered with Facebook. At the very least, custom audience data shows Facebook that a user is also a customer of a particular company or online store. This may seem harmless in many cases, but we have observed insurance companies that have uploaded email addresses, as well as online shops for very specific products. When an online pharmacy or an online sex shop shares their customer list with Facebook, we cannot rule out that this reveals sensitive data. The same applies when someone visits the online shop of a political party or subscribes to one of their newsletters. In all of these instances, custom audiences reveal granular insights. Facebook adds this information to existing profiles and continues to use it, without notifying users or giving them a chance to object,” Benedikt elaborated.

Wide-ranging implications for other Facebook tools

Defenders of the tool, such as the data broker Acxiom, point to the fact that the data matching only happens after the data has been hashed. Hashing is a popular pseudonymisation technique that turns the advertisers’ customer data, such as email addresses or phone numbers, into short fingerprints before they are matched by Facebook, which does the same with its own data. In our interview, Kristin Benedikt explains that from a data protection perspective this doesn’t change anything: “When one of the partners in the process can translate the hash code, the procedure cannot be anonymous. The whole purpose of Custom Audience is to find and address selected users.”
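A minimal sketch, with hypothetical example addresses, of why hashed matching is not anonymous to the matching party: whoever holds the raw emails on the other side can compute the same fingerprints and resolve them straight back to accounts.

```python
import hashlib

def fingerprint(email: str) -> str:
    # Normalise, then hash - the usual shape of ad-matching pipelines.
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Advertiser side: customer emails are hashed before upload.
advertiser_upload = {fingerprint(e) for e in ["alice@example.org", "bob@example.org"]}

# Platform side: the platform hashes its OWN users' emails the same way,
# so the "pseudonymous" fingerprints map directly back to known accounts.
platform_users = {fingerprint(e): e for e in ["alice@example.org", "carol@example.org"]}

matched = [account for fp, account in platform_users.items()
           if fp in advertiser_upload]
# matched now contains the shared customer, alice@example.org: the hash
# hides nothing from a party that can hash its own data and compare.
```

This is the point Benedikt makes: the hash is a stable identifier, not an anonymiser, as soon as one side can recompute it over its own records.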

Benedikt argues that the decision has implications for the use of other Facebook tools, such as Lookalike Audience and the Facebook Pixel, even though the regulator only looked at the use of the specific version of Facebook Custom Audience that relies on contact lists. The Lookalike Audience tool allows advertisers to reach out specifically to people who have similar data profiles to those in their existing databases. The Facebook Pixel allows them to target people on Facebook who have previously used their websites and apps.

“In our opinion usage of the pixel method also requires user consent in order to be permissible. Data processing under the pixel method is particularly extensive, tracking users across different websites and devices. This also applies to non-Facebook users. For users visiting a website, tracking is neither expectable nor recognisable. Only those who are technically sophisticated can detect data processing in the background. This is neither transparent nor does the user have a real choice here,” said Benedikt.

Other European DPAs are showing interest

The case was decided under the federal German data protection law, before the General Data Protection Regulation (GDPR) came into force in the EU in May 2018. “Nevertheless, we think that the relevant principles still hold under the GDPR”, Benedikt explained. She stressed that her office rules out that advertisers could rely on another legal basis for the data transfer. “At most, there would be the so-called balancing of interests. But in a case like this, in which the processing is opaque, the interests of data subjects in the protection of their data clearly outweigh the companies’ interest in advertising and sales.”

German Data Protection Authorities are organised between the 16 federal states and the federal government. Benedikt explained that the Bavarian enforcement action has been coordinated with other German DPAs, giving reason to believe that this interpretation of the law is not unique to the Bavarian DPA.

According to Benedikt, DPAs in other European countries have also expressed interest in the court’s decision, “and asked us for the basis of our prohibition of using Custom Audiences. So far we only received encouraging feedback. From our perspective, it actually is a very clear matter anyhow.”

After netzpolitik.org published the interview, a PR agency that represents Facebook reached out to them and pointed them towards the following section in the terms and conditions for Facebook’s Custom Audience tool:
“Facebook will not give access to or information about the custom audience(s) to third parties or other advertisers, use your custom audience(s) to append to the information we have about our users or build interest-based profiles, or use your custom audience(s) except to provide services to you, unless we have your permission or are required to do so by law” (emphasis added). While this passage can give the impression that Facebook would not add Custom Audience data to existing profiles, it leaves more than enough room for exceptions and shifts responsibility to advertisers (“unless we have your permission”).

Netzpolitik.org has asked Facebook’s PR agency to explain how Facebook actually uses Custom Audience data, and specifically comment on claims that Facebook adds the data it obtains from advertisers to existing user profiles. Facebook declined repeatedly to answer.

This article was originally published at https://netzpolitik.org/2019/facebook-custom-audience-illegal-without-explicit-user-consent-bavarian-dpa-rules/

(Contribution by Netzpolitik.org, Germany)

24 Apr 2019

Protecting personal data world wide: Convention 108+

By Diego Naranjo

Almost one year after the General Data Protection Regulation (GDPR) entered into force in the European Union (EU), the question often arises of what other countries around the world could do to protect their citizens’ personal data. Although some countries have data protection laws in place, many still do not, or have laws that are only partially adequate.

The need for a global data protection

Given the existing (and increasing) data flows, having different degrees of data protection in different regions is a threat to those countries and regions with advanced legislation (such as the EU, Uruguay, Argentina, and Japan). Harmonisation is also key to ensuring that enforcement is equally strong everywhere and that companies have no possibility to engage in “forum shopping”.

Currently, the closest thing to a global standard for data protection is the updated Convention 108 (“Convention 108+”). This Convention, even though it was developed by the Council of Europe, can be signed and ratified by any country around the world. The modernised Convention 108 brings a number of improvements to the previous text:

  • Any individual is covered by its protection, independently of their nationality, as long as they are within the jurisdiction of one of the parties who have ratified the Convention.
  • Definitions are updated, and the scope of application includes both automated and non-automated processing of personal data.
  • The catalogue of sensitive data has been extended to include genetic and biometric data as well as trade-union membership or ethnic origin.
  • There are now requirements to notify without undue delay any security breaches.
  • Data subjects are granted new rights, notably the right not to be subject to a decision which significantly affects them and is based solely on automated processing.

How to get there

While working to improve data protection at national or regional levels, an additional effort should be made to ensure that signing and ratifying Convention 108+ is part of any agenda. On 9 April 2019, the European Council adopted a decision that authorises EU Member States to ratify Convention 108+. This should be done without undue delay. At the same time, the possibilities Convention 108+ offers for a global data protection campaign will be discussed with activists from around the world during the RightsCon 2019 conference.

Modernised Convention for the Protection of Individuals with Regard to the Processing of Personal Data – Consolidated text

The modernised Convention 108: novelties in a nutshell

Explanatory Report to the Protocol amending the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data

(Contribution by Diego Naranjo, EDRi)

24 Apr 2019

Strategic litigation against civil rights violations in police laws

By Gesellschaft für Freiheitsrechte

Almost every German state has expanded or is preparing to expand police powers. The police authorities are now more often allowed to interfere with civil rights, even before a specific danger has been identified. They are also given new means to conduct secret surveillance online. EDRi member Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights) is taking legal action against all changes in police powers that violate civil rights. GFF has already lodged constitutional complaints against the police laws in the states of Bavaria and Baden-Württemberg.

In Germany, police powers are defined at the state level, not the federal level. At the moment, there is a clear trend to expand these powers across nearly all German federal states. The development has been pioneered by Bavaria, where in May 2018 the police were endowed with powers nearly comparable to those of secret services. The amendment in question introduced the notion of “impending danger”, meaning that the police are allowed to encroach on civil rights in various ways when merely assuming that a dangerous situation could develop – which can virtually always be justified. The police can thus use far-reaching measures like online searches and telecommunications surveillance as preventive instruments.

Trend towards expanded police powers

While Bavaria is the most blatant example, several other states have subsequently introduced police laws that encroach on civil rights. Baden-Württemberg, Saxony-Anhalt, Rhineland-Palatinate, Hesse, Mecklenburg-Western Pomerania, North Rhine-Westphalia, and Brandenburg already amended their police laws.

The amendments differ, but all of them introduce questionable measures that police authorities may now use. Many federal states introduced online searches and telecommunication surveillance. This is an unprecedented encroachment on the fundamental right to confidentiality and integrity of information technology systems. At the same time, it means that police authorities may exploit security gaps and thereby undermine IT security in general.

Other new police powers include the use of electronic shackles and bodycams, the extension of video surveillance in public places, the possibility of extended DNA analysis, the extension of maximum detention periods and the technical upgrading of the police (including hand grenades, stun guns and drones).

Legal action against excessive expansion of police powers

GFF and its partners have already filed constitutional complaints against the new police laws in Bavaria and Baden-Württemberg and are currently investigating possible action against the changes in the police laws of the states of North Rhine-Westphalia and Hesse. GFF is also critically involved in the reform debates in the other state parliaments and plans to take legal action against the further expansion of police powers in Germany.

Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)

Germany: New police law proposals threaten civil rights (05.12.2018)

Overview of police law changes in the German states prepared by Amnesty International and GFF (only in German)

(Contribution by EDRi member Gesellschaft für Freiheitsrechte, Germany)

24 Apr 2019

Will Serbia adjust its data protection framework to GDPR?

By SHARE Foundation

After a process that took more than five years, the National Assembly of Serbia finally adopted a new Law on Personal Data Protection in November 2018. The law closely follows the EU’s General Data Protection Regulation (GDPR), to the point that some parts of the text are almost literal translations into Serbian. That was to be expected, given Serbia’s EU membership candidacy. However, it seems it will be very difficult to implement the new legislation in practice – and thereby actually make a difference – as there are numerous flaws that were overlooked when the law was drafted and enacted.

There is not a high level of privacy culture in Serbia and therefore the majority of people are not aware of how the state and the private sector are collecting and handling their personal data. The recent affair with new high-tech surveillance cameras in Serbia’s capital city Belgrade, which were supplied by Huawei and have facial and vehicle license plate recognition capabilities, shows that little thought is invested in how intrusive technologies might impact citizens’ privacy and everyday lives. The highest-ranking state officials for internal affairs, the Minister of Interior and the Director of Police, have announced in the media that these cameras are yet to be installed in Belgrade, while a use case study on Huawei’s official website claimed that the cameras were already operational. Soon after EDRi member SHARE Foundation, a Serbian non-profit organisation dedicated to protecting and improving human rights in the digital environment, published an article with information found in Huawei’s “Safeguard Serbia” use case, the study miraculously disappeared from the company website. However, an archived version of the page is still available.

Considering that the adaptation period provided in the law is only nine months after its coming into force – compared to two years under the GDPR – the general feeling is that both the public and the private sector will have many difficulties in adjusting their practices to the provisions of the new law.

In the past years, we have witnessed many cases of personal data breaches and abuse, the largest one undoubtedly being the case of the now defunct Privatization Agency, when more than five million people – almost the entire adult population of Serbia – had their personal data, such as names and unique master citizen numbers, exposed on the internet. The agency was ultimately shut down by the government, and no-one was held accountable, as the legal proceeding was not completed in time (see PDF of Commissioner’s report, 2017, p. 59).

Although the Serbian law contains key elements of the GDPR, such as the principles relating to the processing of personal data and data subjects’ rights, its text is very complicated to understand and interpret, even for lawyers. One of the main reasons for this is the fact that the law contains provisions related to matters in the scope of EU Directive 2016/680, the so-called “Police Directive”, which deals with the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and with the free movement of such data. The law also fails to cover video surveillance, a particularly important aspect of personal data processing. The Commissioner for Information of Public Importance and Personal Data Protection, Serbia’s Data Protection Authority, and civil society organisations have pointed out these and other flaws on several occasions (see, among others, the comments of Serbia’s former Commissioner), but the Ministry of Justice ignored these comments.

In addition to filing a complaint with the Commissioner, citizens are also allowed under the law to seek court protection of their rights, creating a “parallel system” of protection which can lead to legal uncertainty and uneven practice in the protection of citizens’ rights. Regarding data subjects’ rights, the final text of the law includes an article limiting these rights, but omits the requirement that they may only be restricted by law. In practice, this means that state institutions or private companies processing citizens’ personal data may arbitrarily restrict their rights as data subjects.

To make matters even more complicated, the Serbian National Assembly still hasn’t appointed the new Commissioner, the head of the key institution for personal data protection reform. The term of the previous Commissioner ended in December 2018, and the public is still in the dark as to who will be appointed and when. There are also fears, among civil society and experts on the topic, that the new Commissioner might not be up to the task in terms of expertise and political independence.

New and improved data protection legislation, adapted to a world of mass data collection and processing via artificial intelligence technologies, is a key component of a successful digital transformation of society. In Serbia it is, however, usually regarded as a procedural step to join the EU. A personal data protection framework which meets the high standards set in the GDPR is of great importance for the digital economy, particularly for Serbia’s growing IT sector. If all entities processing personal data can demonstrate that they are indeed GDPR-compliant in their everyday practices, and not just “on paper”, there will be more opportunities for investments in Serbia’s digital economy and for Serbian companies to compete in the European digital market.

It will take a lot of effort to improve data protection standards in Serbia, especially with a data protection law that will be difficult to implement in practice. It is therefore of utmost importance that the National Assembly appoints a person with sufficient expertise and professional integrity as the new Commissioner, so that the process of preparing both the private and public sector for the new rules can be expedited. The application of the new Law on Personal Data Protection, which starts in August 2019, should be regarded as just the beginning of a new relationship with citizens’ data, one that will require a lot of hard work. Otherwise, the law will remain just a piece of paper with no practical effect.

This article was originally published at https://policyreview.info/articles/news/will-serbia-adjust-its-data-protection-framework-gdpr-practice/1391

SHARE Foundation

Law on Personal Data Protection (only in Serbian, 13.11.2018)

Outgoing Serbia’s Commissioner warns of data protection law (23.10.2018)

Serbian Data Protection Commissioner: NGOs call for transparency (04.12.2018)

(Contribution by Bojan Perkov, EDRi member SHARE Foundation, Serbia)

17 Apr 2019

Press Release: EU Parliament deletes the worst threats to freedom of expression proposed in the Terrorist Content Regulation


Today, 17 April 2019, the European Parliament (EP) adopted its Report on the proposed Terrorist Content Regulation. Although it has been questioned whether this additional piece of law is necessary to combat the dissemination of terrorist content online, the European Union (EU) institutions are determined to make sure it sees the light of day. The Regulation defines what “terrorist content” is and what the take-down process should look like. Fortunately, Members of the European Parliament (MEPs) have decided to include some necessary safeguards to protect fundamental rights against overbroad and disproportionate censorship measures. The adopted text follows suggestions from other EP committees (IMCO and CULT), the EU’s Fundamental Rights Agency, and UN Special Rapporteurs.

“The European Parliament has fixed most of the highest risks that the original proposal posed for fundamental rights online,”

said Diego Naranjo, Senior Policy Advisor at EDRi.

“We will closely follow developments in the next stages, since any change to today’s Report could pose a threat to freedom of expression under the guise of unsubstantiated ‘counter-terrorism’ policies,”

he added.

European Digital Rights (EDRi) and Access Now welcome the improvements to the initial European Commission (EC) proposal on this file. Nevertheless, we doubt that the proposal’s objectives will be achieved, and point out that no meaningful evidence has yet been presented on the need for a new European counter-terrorism instrument. Across Europe, the inflation of counter-terrorism policies has had a disproportionate impact on journalists, artists, human rights defenders, and innocent members of groups at risk of racism.

“The proposed legislation is another worrying example of a law that looks nice, politically, in an election period, because its stated objective is to prevent horrendous terrorist content from spreading online. But the law runs the severe risk of undermining freedoms and fundamental rights online without any convincing proof that it will achieve its objectives,”

said Fanny Hidvegi, Europe Policy Manager at Access Now.

“During the rest of the process, the very least the EU co-legislators must do is to maintain the basic human rights safeguards provided by the European Parliament’s adopted text,”

she added.

The next step in the process is the trilogue negotiations between the European Commission, the European Parliament, and the Member States. Negotiations are expected to start in September or October 2019.

Read more:

Terrorist Content Regulation: Successful “damage control” by LIBE Committee (08.04.2019)

CULT: Fundamental rights missing in the Terrorist Content Regulation

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)

Terrorist Content Regulation: Document pool

16 Apr 2019

EDRi is looking for an interim Executive Director (6 months maternity cover)


European Digital Rights (EDRi) is an international not-for-profit association of 42 civil society organisations. We defend and promote human rights and freedoms in the digital environment, such as the right to privacy, freedom of expression, and access to information.

We are looking for an interim Executive Director to replace our current Executive Director during her maternity leave (6 months from mid-July 2019 to mid-January 2020).

The Executive Director provides overall leadership and management of the strategy, policy, resources, operations, and communications of EDRi. The Executive Director is responsible for the management of the organisation and all aspects of its operations. While the Interim Executive Director is not expected to be a specialist in specific operations (campaigns, fundraising, HR, administration, finance, etc.), s/he has a sufficient grasp of all domains to ensure that staff members can achieve their objectives and that they and the EDRi members can work well together to achieve the organisation’s mission.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive to have a diverse and inclusive working environment. We encourage individual members of groups at risk of racism or other forms of discrimination to apply for this post.

Job title: Interim Executive Director
Start and end dates: 15 July 2019 – 15 January 2020
Reports to: Board of Directors (human resources task force)
Line-manages: policy, advocacy, campaigning, communications, fundraising, and organisational support teams
Scope: 10 staff members, annual budget of approx. 830k euro


1. Leadership, organisation mission and strategy

  • steer the consultation phase of the strategic planning process
  • provide leadership and management for the organisation
  • implement the annual work plan and ensure rigorous evaluation
  • start preparations for the 2020 general assembly
  • support the Board, and prepare quarterly financial and narrative reports
  • represent the organisation at events as necessary
  • support development of policy strategy and taking of tactical decisions

2. Financial sustainability and oversight

  • prepare the yearly budget, oversee expenditure
  • oversee and contribute to the raising of funds from foundations, corporations, and individual donors
  • maintain good relations with donors and oversee reporting to them
  • oversee fiscal management, operating within the approved budget
  • ensure that sound book-keeping and accounting procedures are followed
  • ensure that the organisation complies with relevant legislation and grant contracts

3. Organisation operations

  • ensure the implementation of Board decisions
  • ensure that the Board is made aware of all matters requiring a Board decision
  • inform the Board of all developments of major significance to the organisation
  • oversee internal human resources policies and ensure staff retention
  • provide oversight of all staff and organise weekly meetings with staff
  • foster effective teamwork and establish a positive work environment
  • evaluate the individual objectives with staff members
  • undertake regular one to one meetings with all staff
  • sign contracts and other agreements on behalf of EDRi
  • give or refuse final approval for any unforeseen use of resources


Qualifications, skills and experience:
  • senior management experience preferably in a non-governmental organisation
  • solid, hands-on financial and budget management skills
  • strong organisational abilities, especially for planning, delegation and project management
  • ability to convey the vision of EDRi’s strategic future to staff, Board, network and donors
  • ability to build trusted relationships with, and to collaborate with and oversee all staff
  • knowledge of EU policy-making processes
  • knowledge and/or experience in understanding the NGO sector
  • awareness and knowledge of the EU’s political environment
  • knowledge of the human rights and digital rights field and affinity with EDRi’s values and mission
  • knowledge and/or experience in the field of human resources management
  • knowledge and/or experience in fundraising in the nonprofit sector
  • knowledge and/or experience in conflict resolution
  • public speaking skills
  • ability to interface with and engage EDRi’s main stakeholders


Personal characteristics:
Passionate, idealistic, enduring, team player, diplomatic, discreet, patient, mission-driven, self-directed, and committed to knowledge-sharing and high-integrity leadership.


Other requirements:
  • fluency in written and spoken English
  • strong written and verbal communication skills
  • budgeting (oversight, presenting, monitoring)
  • knowledge of free and open source operating systems and software is a plus


To apply please send a maximum one-page cover letter and a maximum two-page CV (only PDFs are accepted) by email to applications[at]edri.org. Closing date for applications is 30 April 2019. Interviews with selected candidates will take place around mid-May, with a start date of (ideally) 15 July.

15 Apr 2019

EU Member States give green light for copyright censorship

By Diego Naranjo

Today, on 15 April 2019, European Union Member States gave their final approval to the text of the copyright Directive as it was adopted by the European Parliament on 26 March. This vote in the Council of the EU was the last procedural requirement in the EU law-making process. Now the Directive, once translated and published in the Official Journal of the EU, will become law.

19 Member States voted in favour of the Directive, effectively ignoring the hundreds of thousands of people who took to the streets across Europe to protest against upload filters and a petition signed by five million people. Six Member States (Finland, Italy, Luxembourg, the Netherlands, Poland, and Sweden) voted against the text, while three (Belgium, Estonia, and Slovenia) abstained, showing how controversial it is. You can find the full results of the vote on the Save Your Internet campaign website.

Member States will now have two years to implement the Directive in their legislation. The only way to prevent, in practice, upload filters for copyright purposes in the EU is to influence the national level implementation. To do this, we encourage you to support civil rights groups working to defend digital rights in your country!

Read more:

Filters Incorporated (19.04.2019)

Censorship machine takes over internet (26.03.2019)

Copyright reform: Document pool

12 Apr 2019

Cross-border access to data for law enforcement: Document pool


In April 2018, the European Commission proposed a Regulation on cross-border access to and preservation of electronic data held by service providers, and a Directive to require service providers to appoint a legal representative within the EU. Since then, the legislative process to adopt them has been fast-tracked, which has prevented any proper assessment of these measures from being carried out.

On 7 December 2018, the Council of the European Union reached a general approach on the text, that is to say, a political agreement on a negotiating position for the negotiations with the European Parliament. The Parliament’s Civil Liberties, Justice and Home Affairs (LIBE) Committee decided to first conduct a thorough assessment of the Commission’s proposal before adopting its position. It first published an introduction identifying a number of questions for discussion, to be followed by topical working documents.

The United States adopted its own piece of legislation in a rush: the US Clarifying Lawful Overseas Use of Data (CLOUD) Act. The Act allows US authorities to access data stored outside US territory while bypassing the legal safeguards of traditional international cooperation frameworks.

In parallel, the Council of Europe (CoE) has also been preparing a new protocol to the Convention on Cybercrime (also known as the “Budapest Convention”) on cross-border access to data by law enforcement authorities. This Second Additional Protocol is expected to be finalised by the end of 2019.

EDRi has been sending submissions to all of these institutions, asking for human rights to be respected. In this document pool, you will find the relevant information, documents, and analyses on the “e-evidence” proposals. We will keep updating this document pool as the process advances.

Last update: 25 April 2019.

EDRi’s analysis and recommendations
Legislative documents
EDRi’s blogposts and press releases

EDRi’s analysis and recommendations:

Legislative documents:

More information in EUR LEX (EU Database on preparatory acts) and OEIL (European Parliament’s Legislative Observatory)

EDRi’s blogposts and press releases:

RightsCon session on cross-border access to e-evidence – key interventions (10.04.2017)
Access to e-evidence: Inevitable sacrifice of our right to privacy? (14.06.2017)
Cross-border access to data: EDRi delivers international NGO position to Council of Europe (18.09.2017)
Cross-border access to data has to respect human rights principles (20.09.2017)
CLOUD Act: Civil society urges US Congress to consider global implications (19.03.2018)
Nearly 100 public interest organisations urge Council of Europe to ensure high transparency standards for cybercrime negotiations (03.04.2018)
EU “e-evidence” proposals turn service providers into judicial authorities (17.04.2018)
Independent study reveals the pitfalls of “e-evidence” proposals (10.10.2018)
Growing concerns on “e-evidence”: Council publishes its draft general approach (05.12.2018)
EU Council’s general approach on “e-evidence”: From bad to worse (19.12.2018)


Joint Civil society letter to the Members of the US Congress on the US CLOUD Act (19.03.2018)
Joint Civil society letter to the Secretary General of the Council of Europe on the draft Second Additional Protocol to the Convention on Cybercrime (03.04.2018)
Joint Civil Society Response to Discussion Guide on a 2nd Additional Protocol to the Budapest Convention on Cybercrime (28.06.2018)
European Parliament Research Service’s assessment of the Commission’s proposals on electronic evidence (09.2018)
Joint Civil society letter to Member States about their draft position on “e-evidence” (05.12.2018)

Legislative process: