A measure which would be illegal if implemented by a government should also be illegal if implemented by industry as a “voluntary” measure, whether as a result of government pressure or for public relations or anti-competitive reasons. However, as key international legal instruments, such as the Charter of Fundamental Rights of the European Union and the European Convention on Human Rights, as well as national constitutions, are binding only on states and governments, they are not directly applicable to other entities, such as private companies. As a result, there is a major trend towards governments persuading or coercing companies to impose restrictions on fundamental freedoms under the guise of “self-regulation”, thereby circumventing legal protections.

02 May 2018

New EU proposals foresee mandatory biometrics in national ID cards

By Statewatch

The European Commission has proposed a host of new measures aimed at “denying terrorists the means to act” which include the mandatory inclusion of two biometrics – fingerprints and a facial image – in all ID cards and residence documents for European Union citizens and their family members issued by Member States.

----------------------------------------------------------------- Support our work with a one-off donation! -----------------------------------------------------------------

According to the Commission’s proposal: “Up to 370 of the 440 million citizens in 26 Member States (DK [Denmark] and UK do not issue ID cards – Ireland is unsure whether it issues obligatory ID cards or not) could hold national ID cards”, although “identity card ownership is common and compulsory in 15 Member States” and there are five other Member States in which citizens “are obliged to hold a non-specific document for identification purposes. In practice this very frequently is an identity card”.

The measure essentially aims at fingerprinting the majority of EU citizens – which will complement the fingerprinting of non-EU citizens as required by the Visa Information System (VIS) for those who require a visa to enter the bloc, and as foreseen by the Entry/Exit System, which will hold the fingerprints of almost all non-EU nationals exempt from visa requirements.

A document released alongside the proposals states:
It is estimated that 80 million Europeans currently have non-machine readable ID cards without biometric identifiers. As many of the EU’s security measures rely on secure travel and identity documents – such as the systematic checks carried out at the external borders on all citizens using the Schengen Information System – this creates a security gap, because of the increased risk of falsification and identity fraud. It also leads to practical difficulties for citizens when travelling or moving to another Member State.

The Commission is therefore proposing measures to strengthen the security features of ID cards and residence documents of EU citizens and their non-EU family members. More secure documents will enhance EU external border management, increase the protection against falsification and document fraud and make it more difficult to misuse or copy such documents. This will benefit the security of all citizens, public authorities and businesses.

Proposals for new rules on national ID cards have been put forth alongside proposed measures to ease cross-border access to financial information for law enforcement authorities; to make the acquisition of explosives precursors more difficult; and for stricter controls on the import and export of firearms.

On 16 April, the Commission also published a proposal for new rules allowing easier cross-border access to data (referred to as “e-evidence” for political purposes) for police and judicial authorities, with the measures “maximising risks for fundamental rights violations”.

Alongside the new proposals came the latest progress report on the Security Union: the Fourteenth progress report towards an effective and genuine Security Union (COM(2018) 211 final).

Some official documentation (such as impact assessments) is currently unavailable.

This article was originally published by EDRi member Statewatch at

Security Union: Commission presents new measures to deny terrorists and criminals the means and space to act (17.04.2018)

Frequently Asked Questions: Security Union – Denying terrorists the means to act (17.04.2018)

Proposal for a Regulation on strengthening the security of identity cards of Union citizens and of residence documents issued to Union citizens and their family members exercising their right of free movement (COM(2018) 212 final) (17.04.2018)

Impact assessment (SWD(2018) 110 final) (17.04.2018)

(Contribution by EDRi member Statewatch, the United Kingdom)



02 May 2018

Xnet: Petition on net neutrality guideline violation in Spain

By Xnet

In November 2015, the “Telecommunications Single Market Regulation”, which includes provisions on net neutrality, was adopted. In August 2016, the Body of European Regulators for Electronic Communications (BEREC) published its guidelines on the implementation of the net neutrality rules.


Since then, the Spanish EDRi member Xnet has been deeply concerned about the lack of transparency in the implementation of the Regulation in Spain. As an exception in Europe, in Spain the National Regulatory Authority (NRA), the National Commission for Markets and Competition (CNMC), is not responsible for overseeing the correct implementation of net neutrality; it is only responsible for resolving conflicts between operators. Instead, the body in charge of supervising the implementation of the net neutrality rules is the Secretariat of State for the Information Society and the Digital Agenda, which is not an independent body and which seems, so far, to be failing to do this correctly. This is worrisome, since the largest telecommunications companies have a close relationship with the government. It is also an eminently predictable problem, which is why the EU has traditionally insisted that national regulatory authorities be independent of government.

The situation is worrisome as:

    • Zero-rating is increasingly becoming a common practice and it is deeply restricting fair access to the internet;
    • Transparency requirements on the connection speeds advertised in the contracts to the public are not respected;
    • Transparency requirements on traffic management practices of providers and their implementation are not respected;
    • The obligation of setting up a complaint channel is not met.

The mere fact that the Ministry of Energy has not opened a complaint channel as required by the Regulation makes it a bureaucratic nightmare to notify the authorities of net neutrality infringements. Xnet has been trying to get in contact with the Ministry, so far without success. This is why Xnet has turned to the European institutions with the petition “Violation of the Net Neutrality Guidelines in Spain”. The European Parliament’s Petitions Committee registered the petition under number 0210/2018.

Inaction of Ministry of Energy, Tourism and the Digital Agenda with regard to its obligation to apply necessary net neutrality rules guaranteeing a free and open internet (16.11.2017)

Net Neutrality (only in Spanish)

(Contribution by Simona Levi, EDRi member Xnet, Spain)



02 May 2018

EU Member States fight to retain data retention in place despite CJEU rulings

By IT-Pol

EU Member States are still working to adopt their position on the ePrivacy Regulation proposed by the European Commission in January 2017. A number of draft compromise texts have been published by the Council Presidency before discussions in the Working Party on Telecommunications and Information Society (WP TELE).


Unfortunately, the Council’s transparency in publishing those documents does not extend to the part of the ePrivacy Regulation that concerns data retention. This mainly means Article 11, which allows Member States to restrict the rights to data protection and confidentiality of electronic communications under certain conditions, in a similar way to Article 15(1) of the current ePrivacy Directive. This part of the ePrivacy Regulation is being discussed jointly by WP TELE and the Working Party on Information Exchange and Data Protection – Friends of the Presidency on Data Retention (DAPIX FoP), which is also tasked with analysing the implications of the Tele2 judgment (joined cases C-203/15 and C-698/15) of the Court of Justice of the European Union (CJEU).

Documents from these discussions are marked “LIMITE” and are therefore not generally available to the public. An incomplete picture of the work is available through a combination of Freedom of Information (FOI) requests and leaked documents. It is known that DAPIX FoP has developed the concept of “restricted data retention”: a deliberately crafted attempt to circumvent the Tele2 ruling of the CJEU, the highest court of the European Union, with a data retention scheme that is, in reality, general and undifferentiated (and therefore illegal), while officially claiming not to be.

Recently, the working document WK 11127/2017 of 10 October 2017 was released in full through a FOI request by Corporate Europe Observatory. This document provides another piece of the puzzle regarding the secret data retention discussions in Council working groups by outlining two different strategies for storage of electronic communications metadata for law enforcement purposes.

The first strategy is based on data retained by providers of Electronic Communication Services (ECS) for business purposes. Article 6(2)(b) of the Commission proposal for the ePrivacy Regulation allows ECS providers to process electronic communications metadata for purposes of billing, calculating interconnection payments as well as stopping fraudulent or abusive use of ECS. The working document proposes to expand Article 6(2)(b) to include “illicit use” of ECS, which would allow processing for a broader purpose than abuse or fraudulent use of the communications service itself. Potentially, “illicit use” could include any crime or illegal behaviour committed by the subscriber with the assistance of the electronic communications service, even if the ECS provider is not the victim of the offence (such as through fraudulent use of the service). The working document further proposes a minimum six-month retention period for electronic communications data processed under the broadened purposes of Article 6(2)(b).

In effect, this is mandatory blanket data retention disguised as storage of communications data processed for voluntary business purposes, like billing. When ECS providers process communications data for business purposes, the processing, and in particular any storage of personal data, should be limited to the duration necessary for that purpose. Setting a minimum mandatory retention period for communications data processed under Article 6(2)(b) would weaken the level of protection guaranteed under the General Data Protection Regulation (GDPR), which is not only unacceptable but also contradicts the ePrivacy Regulation’s role as lex specialis to the GDPR. If Member States want to “ensure” the availability of electronic communications data for law enforcement, this should be done by appropriately restricting the rights to data protection and confidentiality of communications in accordance with Article 11 of the ePrivacy Regulation and, in particular, in accordance with the CJEU case law, which prescribes targeted rather than blanket data retention.

The second proposal considered in working document WK 11127/2017 is to exclude processing for law enforcement purposes from the scope of the ePrivacy Regulation via Article 2(2). Under the current ePrivacy Directive, both the retention of electronic communications data and access to retained data by competent authorities are within the scope of the Directive. The working document suggests that excluding processing for law enforcement purposes from the scope of the ePrivacy Regulation could “bring more clarity to the legal context of data retention”. This would put national legislation on mandatory data retention outside the scope of the ePrivacy Regulation, and possibly even outside the scope of EU law, which would be very dangerous for fundamental rights. On the other hand, the activity would arguably not be (fully) outside the scope of EU law, as data retention could be regarded as an exception to the GDPR. So much for “clarity”.

The current ePrivacy Directive provides legal clarity for the retention of electronic communications data and access to the retained data since both types of processing are covered by Article 15(1) of the Directive. Furthermore, CJEU case law provides specific conditions for retention and access to electronic communications data, which ensure appropriate safeguards for fundamental rights. Excluding processing for law enforcement purposes from the scope of the ePrivacy Regulation would bring less legal clarity, not more. In addition, a Regulation aimed at protecting personal data and confidentiality of electronic communications would be deprived of its purpose if certain types of processing (such as “processing for law enforcement purposes”) are completely excluded from its scope. This was also noted by the CJEU in paragraph 73 of the Tele2 judgment.

On 25 April 2018, EDRi member Statewatch published a recent document from the Bulgarian Council Presidency on data retention. Working document WK 3974/2018 looks at the “renewable retention warrant” (RRW). The intention is that competent authorities can issue data retention orders (warrants) to ECS providers under certain conditions. The legal basis for issuing RRWs will have to be national law as no EU legal basis currently exists. It is suggested by the Presidency that ECS providers could appeal the warrant, which would give private companies the job of safeguarding citizens’ fundamental rights. Even though the data retention requirements for RRWs could differ among ECS providers, the Presidency notes that the RRW would be rendered ineffective for law enforcement purposes if not all providers are covered. This will make the RRW approach identical to blanket data retention for all practical purposes and, therefore, a clear circumvention of CJEU rulings.

The patchwork of Council documents (only some of which are available) from DAPIX FoP on data retention shows that some Member States’ governments are exploring every possible option to uphold their current data retention requirements, despite two very clear CJEU rulings, in 2014 and 2016, that blanket data retention is illegal under EU law. These efforts often take place behind closed doors in Council working groups, and the discussions only receive input from Member States’ governments and EU institutions in the law enforcement area, such as Europol and the EU Counter-Terrorism Coordinator. The European public, civil society organisations and data protection authorities are excluded from most of the critical discussions on data retention. In the past, this approach has repeatedly produced legislation, such as the Data Retention Directive, that was later overturned by the CJEU.

After working document WK 11127/2017 was published in full, European Digital Rights and EDRi members Access Now, Privacy International and IT-Pol Denmark, sent an open letter to EU Member States on the ePrivacy reform. The letter calls upon EU Member States to ensure privacy and reject data retention.

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)

Freedom of Information request by CEO for WP TELE ePrivacy documents (17.04.2018)

“Renewable retention warrants”: a new concept in the data retention debate, Statewatch (25.04.2018)

EU Member States plan to ignore EU Court data retention rulings (29.11.2017)

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)



02 May 2018

Big Brother Awards – tips and materials for organisers

By Iuridicum Remedium

In October 2018, we will celebrate the 20th anniversary of the first Big Brother Awards (BBA) event, held in the UK. The Big Brother Awards is an event which seeks to highlight violations of our privacy, especially with regard to new methods of surveillance associated with the development of technology. Since 1998, the Big Brother Awards have been organised in a number of countries around Europe – in some countries, the Awards are a new initiative, while in many others, a solid tradition has been established and the BBA has become an annual event. Thanks to the BBA events, information about the most striking privacy violations is shared with the broader public.


The “father of the Big Brother Awards” Simon Davies summarises the meaning of BBA: “It has been a great honour to be the founder of the Big Brother Awards. In the 20 years since the first edition, the world has become sensitised to privacy, and the awards have been instrumental in that. People across the world are sensitive to privacy issues. They have become activists in every domain, large and small. Governments and corporations are terrified. We have created a real movement that will shape the future of human rights.”

Iuridicum Remedium (IuRe), with support from the Digital Rights Fund, has prepared materials for current and especially future organisers of the Big Brother Awards. This BBA package consists of tips and tricks, experiences, inspiration and recommendations from various experienced European BBA organisers. IuRe has also designed a fresh new visual style for the BBA, including graphic elements, a graphic manual and merchandise designs.

“When creating the uniform visual design of Big Brothers Awards, I was inspired by vintage propaganda materials. I found its straightforward style and sense of urgency were a good fit. Using irony, I tried to shift the negative aspect of propaganda to a much more playful mode – so that we all stay alert and do our best to block manipulation,” said Tomas Vovsik, the designer of the BBA visual style, about his concept.

All these materials and graphics are ready for free use by BBA organisers or digital rights NGOs, and IuRe hopes they will be helpful for other digital rights defenders and their campaigns.

Big Brother Awards – Tips and tricks

Big Brother Awards – Graphic Manual and designs

Big Brother Awards – Graphic design user manual

European fund for digital rights launched (08.02.2017)

Czech BBA for Ministry of Industry and Trade for data retention (07.03.2018)

Europe’s governments win the Big Brother Awards 2017 for opening the pandora’s box of surveillance (13.10.2017)

Dutch mass surveillance law receives two BBA nominations (29.11.2017)

Finnish Big Brother Award goes to intrusive loyalty card programme (07.09.2017)

BBA Germany 2017: Espionage, threats, tracking, provoking cyber wars (17.05.2017)

(Contribution by Jan Vobořil, EDRi member Iuridicum Remedium – IuRe, Czech Republic)



02 May 2018

Are GDPR certification schemes the next data transfer disaster?

By Foundation for Information Policy Research

The General Data Protection Regulation (GDPR) encourages the establishment of data protection certification mechanisms, “in particular at [EU] level” (Art. 42(1)). But the GDPR also envisages various types of national schemes, and allows for the approval (“accreditation”) of schemes that are only very indirectly linked to the national data protection authority.


On 6 February 2018, the Article 29 Working Party (WP29) adopted Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261). On 16 February, it issued a call asking for comments on these draft guidelines. Why can this seemingly technical issue have major implications, in particular in relation to transfers of personal data to third countries without “adequate” data protection (such as the USA)?

The GDPR stipulates that, in relation to several requirements (consent, data subject rights, etc.), a data protection seal (issued at national or EU level) can be used as “an element by which to demonstrate” the relevant matters. This makes such seals useful and valuable, but still allows the data protection authorities to assess whether a product or service for which a seal has been issued really does conform to the GDPR.

However, in one context this is different: in relation to transfers of personal data to third countries without adequate data protection. Such transfers are in principle prohibited, subject to a limited number of exceptions, including where “appropriate safeguards” are provided by the controller or processor (Art. 46). In this regard, the GDPR stipulates that such appropriate safeguards “may be provided for” inter alia by:
an approved certification mechanism pursuant to Article 42 together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights (Art. 46(2)(f)).

In other words, in relation to transfers of personal data to countries without adequate data protection, certifications are conclusive: they provide, in and by themselves, the required safeguards. Indeed, the article adds that certifications can achieve this “without requiring any specific authorisation from a supervisory authority” (leading sentence to Article 46(2)).

In the highly sensitive context of data transfers, it is therefore crucial that certification schemes will ensure that certifications can and will only be issued in cases in which they really provide cast-iron safeguards, “essentially equivalent” to those provided within the European Union and the European Economic Area (EEA) by the GDPR. Otherwise, the very same problems and challenges will arise as arose in relation to the discredited “Safe Harbor” scheme and the not-much-less contestable (and currently contested) “Privacy Shield”.

Unfortunately, the GDPR does not directly guarantee that certification schemes must be demanding and set high standards. Rather, Member States can choose from three types of arrangement: the relevant national data protection authority (DPA) issuing seals; the national DPA accrediting other bodies to issue seals; or leaving it to national accreditation bodies to accredit other bodies to issue seals. In the last case, the seal-issuing bodies are therefore twice removed from the DPAs. Moreover, national accreditation bodies normally accredit technical standards bodies, for example for medical devices or toys – they are unsuited to approving mechanisms that are supposed to uphold fundamental rights. This could lead to low-standard seal schemes, in particular in countries that have always been lax in terms of data protection rules and enforcement, such as the UK and Ireland.

The only safeguard against the creation of weak certification schemes lies in the criteria for accreditation of certification schemes, applied by the relevant accrediting body (which, as just mentioned, need not be the country’s DPA): those criteria must be approved by the relevant national DPA, subject to the consistency mechanism of the GDPR, which means that ultimately the new European Data Protection Board (created by the GDPR as the successor to the Article 29 Working Party) will have the final say on those criteria. But this is still rather far removed from the actual awarding of certifications.

Surprisingly, the Draft Guidelines on the accreditation of certification bodies, released by the WP29, do not include the very annex that is to contain the accreditation criteria.

To the extent that the WP29 says anything about them, it plays them down: the WP29 says that the as-yet-unpublished guidelines in the not-yet-available annex will “not constitute a procedural manual for the accreditation process performed by the national accreditation body or the supervisory authority”, but rather will only “provide […] guidance on structure and methodology and thus a toolbox to the supervisory authorities to identify the additional requirements for accreditation” (p. 12).

As pointed out in a letter to the WP29, “the WP29 Draft Guidelines therefore fail to address the most important issues concerning certification”. The letter calls on the WP29 to:

urgently provide an opinion on the ways in which it can be assured that certification schemes will really only lead to certifications at the highest level, and in particular to ensure that certifications will not be used to undermine the strict regime for transfers of personal data from the EU/EEA to third countries that do not provide “adequate” (that is: “essentially equivalent”) data protection to that provided by the GDPR –

[and to]

urgently move towards the accreditation of (a) pan-EU/EEA certification scheme(s) at the highest level, and adopt a policy that would require controllers and processors involved in cross-border processing operations within the EU/EEA and/or data transfers to third countries without adequate data protection to seek such pan-EU/EEA certifications for such cross-border operations, rather than certifications issued by national schemes.

Draft Guidelines on the accreditation of certification bodies under Regulation (EU) 2016/679 (WP261)

Letter to the Article 29 Working Party

General Data Protection Regulation (GDPR)

(Contribution by Douwe Korff, EDRi member Foundation for Information Policy Research – FIPR, United Kingdom)



02 May 2018

Facebook: Unanswered questions

By Joe McNamee

On 9 April 2018, EDRi received an invitation from Facebook to attend a meeting to discuss the loss of trust in Facebook following the Cambridge Analytica scandal. The meeting was proposed for 26 April.

It struck us that, if Facebook wanted an honest exchange, it would be happy to answer some of the most obvious outstanding issues.


Encouragingly, Facebook said it would welcome the questions and said that they would still also like the meeting on 26 April.

The questions were sent on 16 April and… we never heard from Facebook again…

Here they are:

1. Facebook’s new policy is based on opt-in for facial recognition being applied to inform Facebook users of their faces appearing on photos uploaded by other users. Does this mean that Facebook will index all facial profiles on any photo uploaded, regardless of any consent by any person depicted? Please answer with “yes” or “no” and explain.

1b. More specifically, will Facebook refrain from analysing any photograph uploaded by any user for biometric data about persons depicted on those photos until it has received an opt-in by every person depicted on those photos? Please answer with “yes” or “no”.

2. You state the following: “Second, we’ll ask people who’ve previously chosen to share their political, religious, and “interested in” information in their profile to check that they want to continue to share it.”

Does the above mean that any of the above data will be deleted if Facebook does not receive an explicit consent to retain it? Please answer with “yes” or “no”.

If “yes”, what will be the cut-off date before Facebook starts deleting such data?

2b. If by “sharing” it is meant that the scope of the discontinuation is limited to sharing with other Facebook users and/or Facebook affiliates, how does Facebook consider that this complies with the requirements of Article 9 of the GDPR for processing these special categories of data?

3. Privacy International created a new Facebook profile to test default settings. By default, everyone can see your friends list & look you up using the phone number you provided. This is not what proactive privacy protection looks like. How does this protect users by design and by default?

4. According to your notification, a “small number of people who logged into ‘This Is Your Digital Life’ also shared their own News Feed, timeline, posts and messages which may have included posts and messages from you”. Why was this not notified to the appropriate national authorities immediately? Are other apps also able to share / receive messages from me?

5. If a similar situation to the one involving Cambridge Analytica were, despite your efforts, to arise again, who would be responsible, Facebook Inc or Facebook Ireland?

6. Why do privacy settings continue to only focus on what friends can & can’t see? If the recent FB scandal has shown one thing, it is that FB’s ad policies have far-reaching consequences for users’ privacy. When are you going to treat ad settings as privacy settings?

7. The GDPR includes new provisions on profiling and automated decision-making. How are you going to change your ad targeting practices to be compliant?

8. The Economist recently reported on how difficult it is for Europeans to download their personal data from Facebook, and Mark Zuckerberg’s testimony described your systems as more transparent than they actually are. How and when, if at all, do you plan to address these issues?

9. You claim to offer a way for users to download their data with one click. Can you confirm that the downloaded files contain all the data that Facebook holds on each user?

9b. You claim to offer a single place to control your privacy. This does not seem to include ways to opt out of ad targeting or to avoid being tracked outside Facebook. Will you offer a single place where users can control every privacy aspect of Facebook, even for people who have no Facebook account?

10. The GDPR gives individuals the right to access and verify their profiles, including marketing profiles based on so called derived data (data that were not disclosed by the user but interpreted from his/her behaviour). Is Facebook going to give its users full access to their marketing profiles? Please answer with “yes” or “no” and explain.

11. Speaking about derived data and marketing profiles, does Facebook process for marketing purposes any data that reveal (directly or indirectly) political opinions of its users? Please answer with “yes” or “no” and explain.

12. Do Facebook apps use smartphone microphones in any way, without this being made clear to the user? If this were to happen, would you consider that lawful?

13. Facebook has voluntary agreements with the Swedish intelligence services to share data. How do you reconcile that with the GDPR?

We are expecting Facebook’s answers any day now…maybe not today, maybe not tomorrow, but soon. If not, we’ll always have Cambridge.

(Contribution by Joe McNamee, EDRi)



02 May 2018

Leak: The “copyright troika” launches another censorship machine attack

By Joe McNamee

On 27 April, a two-hour discussion was held on the Copyright Directive in the Council of the European Union. The meeting discussed text and data mining, restrictions on quoting from and linking to news articles and the infamous “censorship machine” – mandatory upload filters for European web hosting companies. No deal was reached and new discussions will happen on 4 May.

At the meeting, France, Spain and Portugal (joined by Italy this time) once again tabled extreme measures that would restrict freedom of expression, undermine the ability of European Internet companies to conduct a business, and create huge collateral damage for fundamental rights worldwide. France, Spain and Portugal previously tabled a similar lobbyist-driven proposal in October 2017.

This time, the suggestions include:

Article 2.5 and recital 37a

  • Broadening the scope of the companies covered by the proposals in the Directive to any company whose main activity is to provide hosting of web content, regardless of profit motive. The proposal leaves intact the chaotic approach of excluding certain (badly defined) services, such as “not-for-profit open source software developing platforms”, which attempts – but fails – to exclude services like GitHub.
  • Broadening the scope of the companies covered by the proposals in the Directive by removing a proposed limitation to only cover companies with a profit motive and, again, specifying that certain badly-defined services can be exempted if their activity is not “for profit making purposes”.

Rationale: Ironically, the countries justify this new, chaotic, language on the basis that it “creates legal certainty”.

Article 13.4

As a rather hilarious concession, the Member States suggest that services that impose upload filtering cannot be subject to criminal sanctions (or damages). In short, if European companies:

  • implement upload filters
  • implement a “notice and takedown” system, which also prevents future uploads and
  • ensure that all measures have been “agreed upon by rightsholders,”

…then (and only then) can they be protected from the legal chaos that the Directive creates.

A remarkable compromise, indeed.

Article 13.5

Here, the four countries propose an incomprehensible obligation to “take into account” the nature and size of the rightsholders that need to agree to the “effective and proportionate” measures that internet service providers are required to implement.

Article 13.8

Member States are encouraged to establish the “necessary mechanisms” to assess the appropriateness (but not proportionality) of the measures being implemented by the companies. In short, once the internet companies have finished agreeing with big and small rightsholders about the upload filtering technologies that European providers will need to pay for and implement, in order to filter out audio, audiovisual, visual and text works and other subject matter, those technologies should then also be subject to review by Member States.

Leak: Three EU countries join forces for restrictions & copyright chaos (26.10.2017)

Copyright reform: Document pool

Copyright reform: State of play (10.01.2018)

(Contribution by Joe McNamee, EDRi)


27 Apr 2018

15 organisations ask the European Parliament not to weaken net neutrality enforcement

By Maryant Fernández Pérez

On 27 April 2018, EDRi and 14 other organisations sent a letter to the European Parliament’s rapporteur on the European Electronic Communications Code (EECC), Ms. Pilar del Castillo, the parliamentarian in overall charge of negotiating a political agreement on behalf of the European Parliament. We are concerned about the direction of the current trilogue negotiations between the European Parliament, the European Commission and the Council.

In the letter, we urge the European Parliament to defend its mandate, not to weaken the enforcement of EU net neutrality rules, and to reaffirm the role of national regulatory authorities (NRAs) and the Body of European Regulators for Electronic Communications (BEREC) in enforcing those rules in a coherent and coordinated manner.

You can read the letter here and below.

27 April 2018

Re: European Electronic Communications Code trilogue negotiations on Article 5

Dear Ms. Del Castillo,

We, the undersigned organisations, are writing to you to express our profound concern about the way that the currently proposed trilogue agreement on Article 5 of the Electronic Communications Code would weaken the role of independent telecommunications regulatory authorities in Europe.

In particular, we are alarmed that the European Parliament delegation appears not to be strongly defending the Parliament’s mandate. The Parliament’s position is quite clear that NRAs should be responsible for “ensuring compliance with rules related to open internet access in accordance with Regulation (EU) 2015/2120” and for “ensuring consumer protection and end-user rights in the electronic communications sector within the remit of their competences under the sectorial regulation, and cooperating with relevant competent authorities wherever applicable”. Any change to this approach can only serve to create legal uncertainty and weaken the enforcement of the Regulation previously approved by the Parliament.

It is entirely unacceptable for the Council to try to undermine an absolutely fundamental element of ensuring a competitive, innovative and open electronic communications market.

Caving in to the demands of the Council, driven by a very small number of Member States, will risk:

  • undermining the crucial independence of NRAs;
  • moving us towards a situation more similar to the United States, which facilitated short-sighted political decisions on net neutrality that undermined crucial independent, evidence- and expert-driven policies supporting the open Internet;
  • undermining coherent and coordinated enforcement of the rules outlined in Regulation (EU) 2015/2120.

According to Article 5(1) of Regulation (EU) 2015/2120, National Regulatory Authorities (NRAs) are the competent authorities to enforce open internet rules. Article 5(4) of this Regulation provides for the possibility for NRAs to conduct additional tasks. The European Electronic Communications Code must not – and cannot – contravene this Regulation, which is directly applicable. We urge the European Parliament to ensure that the European Electronic Communications Code clearly reaffirms the role of NRAs and BEREC in ensuring that the net neutrality rules are enforced efficiently and consistently.

We remain at your disposal for any further information.

We thank you for your time and consideration.

Kind regards,

European Digital Rights (EDRi), a coalition of 39 civil and human rights organisations
Access Now, International NGO, member of EDRi
AFUL, French speaking users of Libre and Free Software NGO
Alsace Réseau Neutre, French local non-profit Internet Access & Service Provider
Aquilenet, French local non-profit Internet Access & Service Provider, member of EDRi
FAImaison, non-profit Internet Service & Access Provider based in Nantes, France
Fédération FDN, federation of local & non-profit ISPs
FFII France, a French NGO fighting against software patents
Frënn vun der Ënn, Luxembourg, NGO, defending privacy and human rights on the internet
Ilico, French local non-profit Internet Service Provider
Illyse, French local non-profit Internet Access Provider
La Quadrature du Net, French NGO defending rights and freedom on the Internet
Midway’s Network, French local non-profit internet service provider based in Belfort

(Contribution by Maryant Fernández Pérez, EDRi)


26 Apr 2018

Let’s stop the Censorship Machine!

By Andreea Belu

We have to make sure our representatives in the European Parliament oppose Article 13 during their vote in the JURI Committee on the proposed Copyright Reform. The dangers have been pointed out repeatedly, yet they have been ignored. We therefore decided to send the message in different languages, hoping Parliamentarians will relate better this time. Support our fight against the #CensorshipMachine and spread the word!

Stop the #CensorshipMachine! (10.04.2018)

Proposed internet filter will strip citizens of their rights: Your action is needed! (28.03.2018)

Copyright reform: Document pool

5 Devastating Effects of the EU’s Copyright Proposal (29.03.2018)


26 Apr 2018

Press Release: “Fake news” strategy needs to be based on real evidence, not assumption


Today, 26 April 2018, the European Commission adopted a Communication on “tackling online disinformation”. European Digital Rights (EDRi), the Civil Liberties Union for Europe (Liberties) and Access Now will respond by issuing a joint shadow report in the coming weeks.

“Good policy is based on evidence. For the moment, we have different initiatives from the European Commission that do not even agree on how to define the problem being addressed”,

said Maryant Fernández Pérez, Senior Policy Advisor at European Digital Rights (EDRi).

“First we have to understand the problem we face: the real effect of fake news. For that, we need research and data. Liberties urges policy makers to refrain from placing disproportionate limits on free speech and privacy. Doing so will not solve the problem of fake news, but make the situation worse,” said Eva Simon, advocacy officer for freedom of expression at Liberties.

“Policy makers should move away from generic and misleading actions under the false umbrella term of ‘fake news’. Access Now urges all actors to adopt, strengthen and respect enforceable privacy rules around online tracking, which can solve challenges in the information ecosystem, including the spreading of misinformation and profiling of users,” added Fanny Hidvégi, European Policy Manager at Access Now.

We urge the European Commission not to rush into taking binding measures regarding “fake news” or “online disinformation”, but rather to take the expertise of civil liberties and digital rights experts into account. Liberties, EDRi and Access Now highlight that any and all measures aimed at addressing online disinformation should:

  • have a clear and narrow problem definition;
  • be based on clear empirical data of actual harms that are of a scale that merits intervention;
  • fully respect international human rights law on freedom of expression, personal data protection and privacy;
  • have clear benchmarks;
  • be subject to rigorous ongoing review to prevent counterproductive effects for freedom of expression, privacy and the public policy goals of the measures;
  • not lead to harmful consequences for the technical functioning of the Internet; among others, they should avoid its fragmentation and ensure that its security, stability and resiliency remain intact; and
  • avoid any measure, such as ancillary copyright, which would serve to make access to quality journalism more difficult and make it even easier to spread disinformation.

Liberties, EDRi and Access Now are working on issuing a shadow report in the coming weeks to provide a thorough human rights assessment of current policy considerations and make constructive recommendations. In the meantime, our full position is in our responses to the Commission’s public consultation.


On 13 November 2017, the European Commission launched a public consultation on “fake news” and “online disinformation”, which did not include a clear definition of “fake news”.

On 13 November 2017, the European Commission announced plans for a “high level expert group” on “fake news” without defining the subject or subjects in which the experts were expected to be experts.

On 12 January 2018, the European Commission appointed 39 people to the Group, which included seven TV broadcaster representatives but neither the United Nations Special Rapporteur on Freedom of Expression and Opinion nor any digital rights-focused organisation.

On 12 March 2018, the European Commission published a Eurobarometer opinion survey in which individuals were asked about their views on “fake news”. Respondents, however, were told that “news or information that misrepresent reality or that are even false” are called “fake news”. On the same date, the “High-Level Expert Group” presented its final report on the topic. The report would have benefited from more diversity among the Group’s membership. For example, we are concerned that the definition of “disinformation” it provides is too broad and relies on the intent rather than the actual effect of the “disinformation”. However, the report raises several key points that we welcome:

  • The High-Level Expert Group cast doubt on the methodology of the Eurobarometer survey, pointing out that “research has shown that citizens often associate the term ‘fake news’ with partisan political debate and poor journalism broadly, rather than more pernicious and precisely defined forms of disinformation.” This clearly indicates that asking about “fake news” in a survey very probably produced unreliable outputs.
  • The report warns unequivocally against “censorship and online surveillance and other misguided responses that can backfire substantially, and that can be used by purveyors of disinformation in an ‘us vs. them’ narrative that can de-legitimize responses against disinformation and be counter-productive in the short and long run. Attention should also be paid to lack of transparency and to the privatization of censorship by delegation to specific bodies/entities or private companies”.

On 18 March 2018, the European Data Protection Supervisor (EDPS) published an opinion on online manipulation and personal data which rightly points out that “fake news” is a “symptom of concentrated, unaccountable digital markets, constant tracking and reckless handling of personal data”.

On 26 April 2018, the European Commission published a Communication on fake news and online disinformation. As with previous initiatives on illegal or unwelcome content online, the European Commission fails to:

  • recognise that measures can backfire;
  • collect data to get early warnings of any such counterproductive effects; or
  • plan for measures to respond to any counterproductive effects.

Read more:

EDRi’s response to the public consultation for legal entities – “Fake news and online disinformation” (22.02.2018)

EU Could Kill Free Speech in Fight Against Fake News (12.03.2018)