10 Jan 2018

Proposal to revoke data retention filed with the Czech Court

By Iuridicum Remedium

On 20 December 2017, EDRi member Iuridicum Remedium (IuRe) filed a request with the Constitutional Court of the Czech Republic to revoke the Czech data retention related legislation.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

The filing of the request was achieved in close cooperation with the Czech Pirate Party, whose 22 deputies were elected to the Chamber of Deputies of the Czech Parliament for the first time in October 2017. Apart from the Czech Pirate Party, the proposal also won the support of Members of Parliament from five other parties represented in the Chamber of Deputies. Altogether, 58 signatures were gathered.

The proposal was also made possible by funding granted by the Digital Rights Fund. It builds on a similar, successful proposal filed by IuRe with the Constitutional Court of the Czech Republic in 2011. In 2012, a new data retention system was adopted, implementing the EU Data Retention Directive that was in force at the time. The recent proposal aims at revoking this new law.

The proposal challenges, in particular, the Electronic Communication Act, the Police Act and the Criminal Procedure Act as well as the implementing legislation which defines the range of data to be kept. Currently, operational and localisation data on electronic communications are stored for six months. Apart from the police and other law enforcement bodies, intelligence agencies, as well as the Czech National Bank, may use the data. According to the Czech Telecommunication Office, for example, mobile phone data were requested in over 470 000 cases in 2016 alone.

The complaint to the court considers the principle of general and indiscriminate data collection a fundamental problem. It relies on two key decisions of the Court of Justice of the European Union (CJEU) – in the cases Digital Rights Ireland and Tele2/Watson – both of which rejected this measure. The proposal also explains that Czech and German statistical data demonstrate that the absence of data retention affected neither the level of criminality nor the number of criminal cases solved. The proposal further suggests revoking selected sections of the Police Act that allow data to be requested without court permission, as well as selected parts of the Code of Criminal Procedure, which do not sufficiently limit data requests to cases involving serious crimes.

Based on IuRe’s experience from 2011, the decision of the Constitutional Court of the Czech Republic can be expected in approximately one year’s time.

IuRe and Pirate Party send complaint on general surveillance of citizens to the Constitutional Court (only in Czech, 20.12.2017)
http://www.iure.org/15/pirati-iure-podali-navrh-na-zruseni-plosneho-sledovani-obcanu-ustavnimu-soudu-cr

Czech Republic: Data retention – almost back in business (01.08.2012)
https://edri.org/edrigramnumber10-15czech-republic-new-data-retention-law/

Czech Constitutional Court rejects data retention legislation (06.04.2011)
https://edri.org/edrigramnumber9-7czech-data-retention-decision/

Czech Parliament – close in implementing data retention directive (04.06.2008)
https://edri.org/edrigramnumber6-11czech-data-retention/

European fund for digital rights launched (08.02.2017)
https://edri.org/european-fund-for-digital-rights-launched/

(Contribution by Jan Vobořil, EDRi member Iuridicum Remedium, Czech Republic)

10 Jan 2018

EU Parliament criticises incompetent Commission work on child abuse

By Joe McNamee

The European Commission proposed its badly drafted “Directive on combating sexual abuse, sexual exploitation of children and child pornography” in 2010. In 2011, it was finally adopted by the Council of the European Union and the European Parliament.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

Under the Directive, the European Commission was legally required to publish an implementation report by 18 December 2015. The Commission ignored its legal obligation and published its report a year late, on 16 December 2016. It published one report on the whole Directive and one on the implementation of Article 25 of the Directive, on internet blocking. Despite taking an extra year to collect information, the 13-page document is almost entirely devoid of useful data.

On 14 December 2017, the European Parliament adopted a Resolution which details the shocking failure of the European Commission to take the issue seriously. In particular, the report lists the data that should have been, but was not, collected. Key criticisms include:

“D. whereas the Commission’s implementation report does not provide any statistics on the take-down and blocking of websites containing or disseminating images of child sexual abuse, especially statistics on the speed of removal of content, the frequency with which reports are followed up by law enforcement authorities, the delays in take-downs due to the need to avoid interference with ongoing investigations, or the frequency with which any such stored data is actually used by judicial or law enforcement authorities;”

Very unusually, the Parliament uses the word “deplores” to describe both the lateness of the reports and the absence of meaningful data in them.

“3. Deplores that the Commission was not able to present its implementation reports within the deadline set out in Article 28 of Directive 2011/93/EU and that the two evaluation reports presented by the Commission merely documented transposition into national law by Member States and did not fully assess their compliance with the Directive; requests the Member States to cooperate and forward to the Commission all of the relevant information on the implementation of the Directive, including statistics;”

A further failing identified by the European Parliament concerned hotlines. Despite the European Commission having used taxpayer money to fund national hotlines for over a decade, no assessment of the efficiency of the hotline system was mentioned. The European Parliament said that it:

“5. Considers it regrettable that the Commission’s implementation report fails to mention whether it assessed the efficiency of the INHOPE system when it transfers reports to counterparts in third countries;”

In an important step forward, the Parliament recognised that “blocking” is not one thing, but a range of technologies, with varying degrees of effectiveness and intrusiveness. It therefore criticised the Commission’s incompetent references to “blocking” which do not provide any meaningful insights. It also praised the fact that the implication of the Commission’s report is that it has abandoned the support for “blocking” (whatever it understands by that word, if anything):

“6. Considers it regrettable that the Commission has failed to collect data on the types of blocking that have been used; considers it regrettable that data has not been published on the number of websites on blocking lists in each country; considers it regrettable that there is no assessment of the use of security methods, such as encryption, to ensure that blocking lists are not leaked and thereby become seriously counterproductive; welcomes the fact that, having promoted mandatory blocking in 2011, the Commission has explicitly abandoned this position;”

“42. Regrets the fact that the Commission has neither assessed the security of blocking lists, the technologies used for blocking in those countries that have implemented the measures, the implementation of security measures, such as encryption, for the storage and communication of blocking lists, nor carried out any meaningful analysis of the effectiveness of this measure;”

The Parliament resolution concludes with a demand for the Commission to produce another report, this time with the data that should have been provided in the first place:

“53. Calls on the Commission to continue keeping Parliament regularly informed on the state of play in relation to compliance with the Directive by the Member States, by providing disaggregated and comparable data on the Member States’ performance in preventing and combating child sexual abuse and exploitation offline and online; calls on the Commission to present a more comprehensive report on the implementation of the Directive, which should include additional information and statistics on take-down and blocking of websites containing child sexual abuse material, statistics on the speed of removal of illegal content beyond a period of 72 hours and on the follow-up by the law enforcement authorities to the reported offences, delays in take-downs as a result of the need to avoid interference with ongoing investigations, information on the use of the stored data by judicial and law enforcement authorities and on the actions undertaken by hotlines after informing the law enforcement authorities to contact the hosting providers; instructs its relevant committee to hold a hearing on the state of play in relation to implementation and possibly consider adopting an additional report on the follow up given to the implementation of the Directive;”

Included in the Commission proposal in 2010 was a demand for mandatory internet blocking, which was indicative of the superficial, political and indifferent attitude of the Commission to the issues at hand. It was accompanied by possibly the worst “impact assessment” that the European Commission has ever produced, which included analysis that suggested that internet blocking that is not based on a specific law definitely is legal, definitely is not legal and may or may not be legal. An internal review of the impact assessment said that it needed “to be significantly improved” and even the “baseline scenario” needed to be “reworked”. Seven working days later, the Directive and Impact Assessment were published. It is not clear whether any improvements were made to either the Directive or Impact Assessment in the course of those seven days.

Report from the Commission assessing the implementation of the measures referred to in Article 25 of Directive on combating the sexual abuse and sexual exploitation of children and child pornography (16.12.2016)
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52016DC0872

European Parliament resolution on the implementation of Directive on combating the sexual abuse and sexual exploitation of children and child pornography (14.12.2017)
http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//TEXT+TA+P8-TA-2017-0501+0+DOC+XML+V0//EN&language=EN

Impact assessment: Council Framework Decision on combating the sexual abuse, sexual exploitation of children and child pornography, repealing Framework Decision 2004/68/JHA
http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52009SC0355&from=EN

Internal review of the impact assessment
http://ec.europa.eu/smart-regulation/impact/ia_carried_out/docs/ia_2009/sec_2009_0357_en.pdf

(Contribution by Joe McNamee, EDRi)

10 Jan 2018

EU-Japan trade agreement not compatible with EU data protection

By Vrijschrift

The EU and Japan have announced the conclusion of the final discussions on a trade agreement, the EU-Japan Economic Partnership Agreement (EPA).

Regarding cross-border data flows and data protection, the European Commission’s press release states that recent reforms of their respective privacy legislation offer new opportunities to facilitate data exchanges, including through a simultaneous finding of an adequate level of protection by both sides.

But this is not the full story. Besides the possibility to adopt adequacy decisions, the EPA contains explicit data flow commitments in the financial section, implicit data flow commitments in the services chapter, and a review clause. The implicit data flow commitments, in particular, do not seem compatible with the fundamental right to the protection of personal data.

In addition, a form of investor-to-state dispute settlement (ISDS/ICS) may be added later. All published trade agreement texts are subject to legal scrubbing (adjustment by legal services).

Adequacy decisions

Allowing cross-border data flows through an adequacy decision is, in principle, the correct way to approach this issue. Such decisions are EU decisions (from the EU point of view). If data protection in Japan deteriorates, or if the EU rejects mass surveillance in Japan, the EU can revoke the adequacy status – in principle.

The approach may not work out in practice. It remains to be seen whether the European Commission would really revoke the adequacy status. But should the EU award Japan adequacy status at all? Graham Greenleaf argues that Japan has serious issues to overcome: a weak definition of personal information; a carve-out for “anonymously processed information”; a cross-border privacy rules back-door for onward transfers to the US; no record of enforcement; trivial or missing remedies; and carve-outs for big data and de-identification. (Update: article)

Will the EU take an independent adequacy decision? Consider this formulation in the press release:

“This offers new opportunities to facilitate data exchanges, including through a simultaneous finding of an adequate level of protection by both sides.”

The formulation “simultaneous finding of an adequate level” suggests a negotiated compromise where fundamental rights are traded against economic rights.

Implicit cross-border data flow commitments

Many people overlook implicit data flow commitments. Chapter 8 Section C Cross-Border Trade in Services, contains National treatment and Most-favoured-nation treatment clauses.

Cross-border services imply cross-border data flows. (1) We find a safeguard in Section G Exceptions, Article X1 General exceptions, paragraph 2. It is a GATS Article XIV kind of exception, with many conditions. Such safeguards are insufficient; see Kristina Irion, Svetlana Yakovleva, and Marija Bartl.

The implicit cross-border data flow commitments do not have sufficient safeguards. This is not compatible with the EU Fundamental rights framework.

Explicit data flow commitment

Chapter 8 Section E Sub-section 5 Financial Services, article 6, Transfers of Information and Processing of Information, contains a cross-border data transfer commitment regarding financial data. Paragraph 2 contains a safeguard:

“2. Nothing in paragraph 1 restricts the right of a Party to protect personal data, personal privacy and the confidentiality of individual records and accounts so long as such right is not used to circumvent the provisions of this Article.”

The strength of the exception is limited by a condition (“so long as …”). (2) The safeguard seems stronger than the one used in the financial sections in the agreements with Korea, Singapore, Vietnam and Ukraine. It is also stronger than the general exception in Section G Exceptions, mentioned above. The safeguard is based on the 1994 Understanding on Commitments in Financial Services (article B. 8).

Marija Bartl and Kristina Irion noted:

“The formulation used in CETA is likely more prudent compared to the language proposed in the Agreement with Japan. From the outset, it lays down a better division of labor between trade law and domestic data protection law. Given that the EU trade negotiators tend to work on blueprints of their earlier agreements reverting to the language of the 1994 Understanding on Financial Services and the text of the earlier EU – Singapore Free Trade Agreement 16 would mean a regressive development for the safeguards on data privacy.” (3)

It would seem open to debate which one is stronger, as the formulation in the EU-Canada CETA has weaknesses as well. More importantly, neither safeguard respects the European Parliament’s demands for TTIP and TiSA. (4)

Review clause

The draft EPA contains a review clause. Chapter 8 Section F Electronic Commerce Article 12 Free Flow of Data reads:

“The Parties shall reassess the need for inclusion of an article on the free flow of data within three years of the entry into force of this Agreement.”

There is a lot of discussion on free flow of data commitments – while the draft EPA already contains (often overlooked) implicit data flow commitments. The review clause may act as a distraction.

ISDS

The Commission’s press release notes that negotiations continue on investment protection standards and investment protection dispute resolution. Adding ISDS/ICS or an investment court could have a negative impact on data protection. See here and here.

After writing the first version of this blog, someone alerted me to this analysis, from October 2017: Marija Bartl and Kristina Irion, The Japan EU Economic Partnership Agreement: Flows of Personal Data to the Land of the Rising Sun. Recommended reading.

This article was originally published at https://blog.ffii.org/eu-japan-trade-agreement-not-compatible-with-eu-data-protection/

(Contribution by Ante Wessels, EDRi member Vrijschrift, the Netherlands)


Footnotes:

(1) See page 1 (after the Roman numerals) of Kristina Irion, Svetlana Yakovleva, and Marija Bartl.

(2) See, in general (leaving aside, for now, the part on ISDS), here.

(3) Note there is an important difference between the 1994 Understanding and the EU-Singapore FTA texts. Singapore, 8.54 (2): “Each Party shall, adopt or maintain appropriate safeguards to protect privacy and personal data, including individual records and accounts, as long as these safeguards is not used to circumvent the provisions of this Agreement.” 1994 Understanding, article B.8: “(…) Nothing in this paragraph restricts the right of a Member to protect personal data, personal privacy and the confidentiality of individual records and accounts so long as such right is not used to circumvent the provisions of the Agreement.” The first is a commitment, the second an exception to a commitment.

(4) TTIP 2 (b) (xii); TiSA 1 (c) (iii) reads: “(…) to incorporate a comprehensive, unambiguous, horizontal, self-standing and legally binding provision based on GATS Article XIV which fully exempts the existing and future EU legal framework for the protection of personal data from the scope of this agreement, without any conditions that it must be consistent with other parts of the TiSA; (…)” The articles in the agreements are not unambiguous, and do not fully exempt (…) from the scope of the agreement.

10 Jan 2018

ePrivacy proposal undermined by EU Member States

By EDRi

The discussions on the ePrivacy Regulation continue in the European Union (EU) legislative process. They were on hold for a few weeks because of ongoing negotiations on the European Electronic Communications Code (EECC) – another big “telecoms” file that the Council of the European Union is working on.

On 5 December 2017, the Estonian Presidency of the Council proposed new compromises on key articles. This latest proposal for amendments is related to Articles 6, 7 and 8 of the draft ePrivacy Regulation, which concern permitted processing (Art. 6), storage and erasure of communications data (Art. 7) and the protection of stored communications in users’ devices (Art. 8).

Permitted processing

The provisions on permitted processing cover the circumstances under which electronic communications data may be processed.

The Estonian Presidency text suggests a few adaptations to bring the Regulation in line with the General Data Protection Regulation (GDPR), by including the legal ground of vital interest in Article 6(2)(d) and a new recital 17a, as well as provisions for accessibility in Article 6(3)(aa) and the new recital 19a. As currently designed, these additions should not create new privacy risks.

Much more concerning is the addition, in Article 6(2)(e) and a recital 17b, of a legal ground for scientific research and statistical purposes, similar to the one in Article 9(2)(j) of the GDPR (research, unlike archiving, need not be “in the public interest”). The text of the recital and the Article state that this “type of processing should be subject to further safeguards to ensure privacy of the end-users by employing appropriate security measures such as encryption and pseudonymisation.” The use of “such as” means that these are mere possibilities, not requirements. On top of that, a lot of flexibility would be given to Member States, since these measures must be “based on Union or Member State law, which shall be proportionate to the aim pursued and provide for specific measures”. This creates risks for privacy and security, and forgoes the economic benefits generated by a more predictable, harmonising measure.

Storage and erasure

The provisions on storage and erasure cover what protection should apply to different types of data and the deletion of data that is no longer needed to perform a service.

On storage and erasure, the Estonian Presidency “invites delegations to reflect on the need for” Article 7(1), which ensures protection of communications data at rest (i.e. stored in the provider’s network). Not including the protection of communications data at rest in the ePrivacy Regulation means that an e-mail would be sensitive data subject to the standards of the ePrivacy Regulation while being transmitted, and suddenly, upon arrival in the (online) mailbox of the provider, be subject to the General Data Protection Regulation. This would create the option of processing the content as non-sensitive data under the “legitimate interest” exception in the GDPR, facilitating invasive surveillance of content of the kind previously done by Gmail. Bizarrely, businesses lobby both for clear, predictable rules and for unclear, unpredictable rules like this.

Protection of terminal equipment

The provisions on protection of terminal equipment cover the rule for installing or using data on an individual’s communications device.

As regards terminal equipment, recital 21 adds precision on the use of cookies. Cookies can be used for both tracking and non-tracking purposes. The text recognises “authentication session cookies used to verify the identity of end-users engaged in online transactions” as legitimate, as well as some audience measuring. However, Articles 8(1)(d) and 8(2)(c) authorise audience measuring “by a third party on behalf of the provider of the information society service” and statistical counting without making pseudonymisation mandatory. This would facilitate the kind of cross-platform data processing done by, for example, Google Analytics.

Recital 21 and Article 8(1)(e) also allow for the installation of security updates without the consent of the end-user, provided they are necessary, that the user is aware of them and that the user can delay them. While security updates are particularly important to protect the user from attacks or breaches, consent should remain the sole legal basis for any process involving access to terminal equipment. That way, instead of merely knowing that a “security update” is being installed on your phone, computer or other connected device, you would receive more information on the update and what it is for, as the software provider would have an incentive to be more transparent.

Although not every proposed amendment threatens fundamental rights, the Estonian Presidency proposed to broaden the scope of exceptions in significant ways. It suggested authorising some processing that goes beyond what is strictly necessary, not keeping consent as sole legal basis, and not putting up strong safeguards to limit the impact of this broadening on privacy. This weakening of protections and predictability brings us closer to the kind of security and privacy chaos that the United States is experiencing. It would without doubt create the “chill on discourse and economic activity” that failure to implement privacy and security measures has caused in the US. But at least Facebook and Google will be happy.

Presidency text leaked by Austrian government (05.12.2017)
https://www.parlament.gv.at/PAKT/EU/XXVI/EU/00/43/EU_04355/imfname_10770009.pdf

Presidency text (05.12.2017)
http://data.consilium.europa.eu/doc/document/ST-15333-2017-INIT/en/pdf

e-Privacy: what happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

e-Privacy revision: Document pool
https://edri.org/eprivacy-directive-document-pool/

(Contribution by Anne-Morgane Devriendt and Diego Naranjo, EDRi)

10 Jan 2018

Commission claims that general monitoring is not general monitoring

By Modern Poland Foundation

Will everything we do on the internet be monitored and checked by a non-transparent mechanism that decides what can be published? It is a real threat, and currently it is coming from an area that patently does not require such draconian measures: EU copyright law. This threat is a peculiar one, because there are actually explicit safeguards in existing EU law designed to prevent general monitoring of users’ communications.

Specifically, EU Member States are not allowed to impose a general obligation to monitor the information that users transmit or store, as stated in Article 15 of the e-Commerce Directive (2000/31/EC). So how can a general monitoring obligation be introduced if it is explicitly forbidden? Well, it is currently being done through a mix of sophisticated legal interpretation, a splash of sophistry, and clever legal drafting that is hard for non-experts in the field to comprehend. Below, we try to explain the details in a more easily understandable way.

The starting point is Recitals (explanatory notes) 47 and 48 of the e-Commerce Directive. Recital 47 clarifies that the prohibition of general monitoring leaves room for monitoring in “specific cases”. Recital 48 gives Member States an option to apply “duties of care” in order to detect and prevent certain types of illegal activities.

We are now observing an attempt to fit a de facto general monitoring obligation into the room created by Recital 47 in a misrepresentation of the notion of “duty of care” from Recital 48. The attempt is constructed of two parts: the first is the recently published “Guidance” on certain aspects of “IPR Enforcement Directive” (IPRED) and the second is the mix of Articles 11 (ancillary copyright) and Article 13 (licensing, liability, upload filtering, redress, cooperation) of the European Commission’s Proposal for a Copyright Directive.

IPRED (Directive 2004/48/EC of the European Parliament and of the Council of 29 April 2004 on the enforcement of intellectual property rights), is a legal instrument which requires Member States to ensure that courts may issue injunctions against intermediaries intended to prevent infringements (duplicating similar provisions in the 2000 E-Commerce Directive and the 2001 Copyright Directive). Injunctions are court orders requiring a party to do something or to refrain from doing something, subject to a penalty for non-compliance.

As we already know, it is not legal to impose a general monitoring obligation, and this holds for obligations imposed in injunctions. In fact, the “filtering systems” section of the IPRED Guidance mentioned above starts with that observation. It even stresses that it would be incompatible with fairness and proportionality, and excessively costly. Moreover, it references two Court of Justice of the European Union (CJEU) rulings, Scarlet and Netlog, in which the Court found that general filtering systems are incompatible with EU law. However, it then invokes the possibility of monitoring obligations in “specific” cases (Recital 47), and links it with the newly re-invented “duty of care” in order to detect and prevent infringements (Recital 48). This is an important part, because contemporaneous documents indicate the “duties of care” were meant to refer only to measures to enforce the rules in the Articles of the Directive and were neither introducing nor facilitating the implementation of a separate “duty”. The Commission’s new interpretation of the 2000 Directive is therefore demonstrably misleading.

However, the Guidance ignores the facts and concludes that it is possible to use both Recitals to impose obligations on service providers to prevent the upload of infringing content identified by rightholders and in cooperation with them. In other words, filtering is forbidden by EU law, but also, following this logic, permitted by it.

Such an interpretation is contrary to the most basic principles of legal interpretation. Directives consist of operative “Articles” and explanatory “recitals”. The relevant Article (15) has, as its title, “no general obligation to monitor”. For the Commission’s current interpretation to be correct, this text must be interpreted in a way that permits a general obligation to monitor. It seems absurd to have to explain that this interpretation is not valid. However, for the sake of completeness, Article 31.1 of the Vienna Convention on the Law of Treaties establishes that a “treaty shall be interpreted in good faith in accordance with the ordinary meaning to be given to the terms of the treaty in their context and in the light of its object and purpose”. The (somewhat unsurprising) concept that the “ordinary meaning to be given to the terms” should be used when interpreting law has been carried across into EU law multiple times by the CJEU.

When you put all these parts together, you end up with a system that can be used by publishers to require service providers to take down, and prevent the upload of, virtually any content, using a mechanism that could not and would not be expected to distinguish such exceptions as quotation, critique, parody, panorama, inspiration, or independent creation. The system is, in the ordinary meaning of the words, a “general obligation to monitor” and will, by causing the deletion of content protected by copyright exceptions, impose a restriction on fundamental rights that is not necessary, not proportionate and does not achieve objectives of general interest, as required by the EU Charter.

Nonetheless, this kind of censorship will be virtually impossible to challenge before a court. First, if the content is prevented from being uploaded and published, the user will find it very hard to construct a legal case. Secondly, there are virtually no mechanisms in the copyright laws of Member States which would allow users to assert their rights, mostly because user freedoms are constructed as “limitations and exceptions”, not as independent user rights.

Separately, a court case would not be possible until after the Directive is implemented, and it could, for the reasons described above, only come from a service provider. After going through the national courts, it would take at least a further 18 months to get a ruling against the mandatory filters. At that stage, virtually all providers in Europe would already have had to invest in the filtering systems and would, due to the weakened liability regime, opt to keep them in any case.

Would an injunction requiring a provider to monitor specifically for all such content materially differ from a general monitoring injunction? It probably would not. General monitoring means searching everything. Specific monitoring means searching everything while looking for millions of “specific cases”. It solves no problems, but it does introduce a threat to freedom of expression.

It is still not too late to stop decision-makers from letting this happen. Visit savethememe.net for information on how to have your say!

Vienna Convention on the Law of Treaties
https://treaties.un.org/doc/publication/unts/volume%201155/volume-1155-i-18232-english.pdf

(Contribution by Krzysztof Siewicz, EDRi member Modern Poland Foundation, Poland)

10 Jan 2018

Copyright reform: State of play

By Anne-Morgane Devriendt

In 2016, the European Commission (EC) launched its proposal for a new Directive on Copyright in the Digital Single Market. This reform was supposed to update the previous Directive, to adapt it to the digital world.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Since the previous Directive was adopted in 2001 (after a four-year legislative process), technology and the online ecosystem have changed considerably, so an update was needed. However, far from modernising existing rules, the proposed Directive appears to be designed to create new barriers to fundamental rights, access to culture and the digital single market. The two most controversial issues that have arisen in the proposed text are Articles 11 and 13.

Article 11 proposes to establish an ancillary right for press publishers, in line with failed legislation in Spain and Germany. This is ostensibly to help redirect revenue to press publishers by creating a new right for publishers to restrict the use of news snippets. The Spanish and German examples showed the failures of this system that led to the end of Google News in Spain, to the detriment of smaller publishers, and to Google being exempted in practice from the obligations established by the German law, to the detriment of smaller news aggregators. Despite these failures and despite many academics warning against it, many legislators appear willing to ignore experience and hope that the same policy that was an abject failure and hurt publishers in Spain and gave a competitive advantage to Google in Germany will lead to different outcomes when implemented on an EU level.

Article 13 is supposed to address a so-called “value gap” (a term created by copyright lobbyists and generously re-used by the European Commission) by forcing online platforms to use upload filters to check all content for copyrighted material. This would mean that each platform allowing uploads would have to monitor all material made available by its users. This would involve multiple filters for text, audio, audiovisual, image and other formats. Worse still, as these filters are not very effective, providers would have to guess how much filtering would be enough to defend themselves if they are taken to court. This would create huge uncertainty for European companies and yet another competitive advantage for the biggest platforms.

Not only would this be extremely costly and detrimental to smaller companies and start-ups, but it would also contradict the e-Commerce Directive. As upload filters rely on algorithms and content recognition, we can be certain that they would infringe freedom of expression by taking down content that has been wrongly flagged or that is perfectly legal because it falls under the entirely lawful exceptions to copyright, such as parody.
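To see why such filters mechanically over-block, consider a deliberately simplified, hypothetical sketch of an upload filter (all names and strings below are invented for illustration; real systems use perceptual fingerprinting, but the limitation is the same: matching is purely mechanical, with no understanding of purpose or context):

```python
# Hypothetical blocklist of material "identified by rightsholders".
RIGHTSHOLDER_TEXTS = ["verbatim text of a protected work"]

def upload_allowed(content: str) -> bool:
    """Block any upload containing rightsholder-identified material.

    This is what makes the obligation "general": every single upload
    must be scanned against the whole blocklist.
    """
    return not any(t in content.lower() for t in RIGHTSHOLDER_TEXTS)

# An infringing verbatim copy is blocked, as intended.
assert not upload_allowed("Verbatim text of a protected work")

# But a lawful quotation inside a critique is blocked too: the filter
# matches the material and cannot evaluate copyright exceptions.
assert not upload_allowed(
    'The claim in "verbatim text of a protected work" is wrong because...'
)
```

The filter cannot tell infringement from quotation, parody or critique, because the distinction depends on context and purpose, which no string- or fingerprint-matching system evaluates.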

Other open issues include text and data mining (TDM) for scientific purposes, which the Commission’s proposal restricts to research institutions, and the harmonisation of exceptions, such as the panorama exception. Ironically and incomprehensibly, the existing Directive, introduced as a harmonisation measure, gives Member States literally millions of possible combinations of options to choose between.

After heated discussions, the European Parliament has yet to reach an agreement. The vote in the lead Committee, the Committee on Legal Affairs (JURI), is currently scheduled to take place on 25 January 2018, if compromise agreements are found on the main issues before then. This seems unlikely.

In the Council of the European Union, some Member States questioned the legality of Article 13 with regard to the e-Commerce Directive and previous case law forbidding the untargeted filtering of uploads. After an unclear response from the Council’s legal service, the Estonian Presidency managed to find agreement only on the smaller issues, such as TDM, while producing disastrous proposal after disastrous proposal on the most controversial elements elsewhere in the Directive. These proposals aim to bypass the conflict with the e-Commerce Directive by establishing that online platforms (renamed “online content sharing service providers”, OCSSPs) are making an act of communication to the public, and are therefore directly liable for the content they host. It is now up to the Bulgarian Presidency to work out a common position on the major issues.

After the 2001 Directive aimed to align outdated copyright law with the digital age, the new Directive appears to have a new aim – to align the digital age to outdated copyright law.

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

Censorship machine: Busting the myths (13.12.2017)
https://edri.org/censorship-machine-busting-myths/

Copyright Directive may lead newspapers to become their own censors (13.12.2017)
https://edri.org/copyright-directive-newspapers-become-their-own-censors/

Commission to scientists: Stop ruining our copyright plans with your facts and your research! (22.12.2017)
https://juliareda.eu/2017/12/jrc-paper-copyright/

(Contribution by Anne-Morgane Devriendt, EDRi intern)

10 Jan 2018

2018: Important consultations for your Digital Rights!

By EDRi

Public consultations are an opportunity to influence future legislation at an early stage, in the European Union and beyond. They are your opportunity to help shape a brighter future for digital rights, such as your right to an open internet, a private life, and data protection, or your freedom of opinion and expression.

Below you can find a list of public consultations we find important for digital rights. We will update the list on an ongoing basis, adding our responses and other information that can help you get engaged.


Public consultation on fake news and online disinformation by the European Commission

  • Deadline: 23 February 2018

Feedback on the second data package: the proposal for a Regulation on the free flow of non-personal data


You can find public consultations of importance to digital rights and EDRi’s responses from previous years here:

21 Dec 2017

Have yourself happy privacy friendly holidays!

By Maren Schmid

It’s (nearly) holiday time. And with the holidays come family gatherings, which hold the possibility of awkward or tense moments. But fear not – it is time to take over the conversation and educate your family about digital rights!

We have prepared some talking points that will facilitate the process of bringing your family closer to privacy, security and surveillance related issues. (Yes, this is a fun conversation you definitely want to have!)

1. As soon as the good mood threatens to turn bad, casually change the topic and inform your relatives that this past October the ePrivacy Regulation was approved by the European Parliament – a big win for privacy, citizens’ rights, competition, innovation and security! Explain that this is only the first step, though, and that in 2018, the European Commission, the Council and the Parliament have yet to reach an agreement during the trilogues. (However, we suggest avoiding an in-depth explanation of what this obscure process called “trilogues” is – you don’t want to lose your audience this early!)

2. Your family might be baffled by your sudden change of topic and might not know how to respond – a perfect opportunity for you to continue! Make sure to tell them about another big win, from July, when the Court of Justice of the European Union (CJEU) confirmed that the EU/Canada deal on the indiscriminate collection and sharing of air travellers’ data breaches European law. This was the third time that the European Court had ruled against arrangements for the mandatory storage of personal data. This is great news for EU citizens, as the risks associated with massive and unnecessary databases of sensitive personal data are unacceptable. Blindly collecting data, hoping it will magically protect our society, is bad for security and bad for fundamental rights! It would be better not to mention that the European Commission keeps proposing and defending illegal restrictions on fundamental rights – your family should enjoy the holidays.

3. Ignore any objections your family might have (such as “I have nothing to hide anyway”, see Bullsh*t Bingo below – or ask to read their bank records, text messages and ask for their credit card) and carry on! At this point, you can briefly mention the EU Directive on combating terrorism. (If you do not feel brave enough to use buzzwords like “terrorism” as they might trigger deep-rooted arguments, skip this point and continue at 4.) The European Parliament voted in favour of this worrisome Directive in February 2017. And what makes the Directive so worrisome? Its use of unclear and weak wording. For instance, it criminalises “glorifying terrorism”, without defining what it means. This creates the risk of excessive punishment and censorship. Beware: A lot of people seem to assume that fighting terrorism is inevitably connected to giving up fundamental rights like privacy or freedom of speech (see Bullsh*t Bingo, again). Your family might have similar beliefs, so be prepared to face some opposition. The answer is simple, though: We can’t bravely defend our freedoms by meekly abandoning them.

4. If you followed point 3, you might be met with an awkward silence – make use of this gap in the conversation by expounding on how important digital rights are and why they are under threat. One example is the proposed Copyright Directive, which may actually become the biggest censorship machine ever. The proposed Directive 1) requires internet companies to install filtering technology to prevent the upload of content that has been “identified by rightsholders”, 2) seeks to make internet providers responsible for their users’ uploads, and 3) gives internet users no meaningful protection from the unfair deletion of their creations. Sounds quite alarming, right? Make sure your family understands that the EU will make its decision on this scary Directive in early 2018 and that it is time to speak up now! Tell them to contact their MEPs and ministries asap.

5. Finally, as they become more and more enthusiastic about the defence of digital rights, explain to them that there is a high chance they paid more money for one of their presents than other people did – because of how their data is used to profile them. How? Explain that it is common practice for companies to use personal data to charge different people different prices for the same product. Companies do so in different ways: they might gain information about you from your type of device (Mac vs. Windows), your IP address, or your browser’s cookies. Hilariously, some companies even defend this practice by saying that they are only discriminating in favour of some people, not against the people they are not discriminating in favour of! Sadly, explaining that privacy and freedom online matter because people might end up being charged more than others is very effective: appealing to people’s wallets usually works better than appealing to the loss of their fundamental rights. But yay – if you reached this point, you have successfully convinced your family and collected enough good karma for 2018!
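If your family wants to see how such price differentiation works in practice, here is a deliberately crude, entirely hypothetical sketch (the signals, multipliers and prices are invented for illustration; real systems are more opaque):

```python
BASE_PRICE = 100.0  # hypothetical list price of the product

def quoted_price(user_agent: str, returning_visitor: bool) -> float:
    """Quote a price based on signals the browser leaks about the visitor."""
    price = BASE_PRICE
    if "Macintosh" in user_agent:
        price *= 1.10  # Mac users assumed willing to pay more
    if returning_visitor:
        price *= 1.05  # a cookie shows they came back, so they want it
    return round(price, 2)

# Two visitors, same product, different prices.
print(quoted_price("Mozilla/5.0 (Windows NT 10.0)", False))           # 100.0
print(quoted_price("Mozilla/5.0 (Macintosh; Intel Mac OS X)", True))  # 115.5
```

The same logic works with any signal a site can observe – which is exactly why limiting what can be observed, through privacy rules like ePrivacy, matters for your wallet too.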

More fun activities

Bullsh*t Bingo

Convincing your family of the importance of digital rights might be a long, tiring and rough process… So have some patience and don’t forget to keep track of the battle with our Bullsh*t Bingo! Check off each block when you hear these overused phrases and arguments. And when you get three blocks horizontally, vertically or diagonally, stand up and shout BULLSH*T!


Download a printable PDF!

(Contribution by Maren Schmid, EDRi intern)

21 Dec 2017

Holiday must reads – our best of 2017

By EDRi

The year 2017 has been a busy one. We’ve been fighting the censorship machine, supporting the adoption of a strong ePrivacy Regulation, working towards a balanced approach to law enforcement online and defending data protection in the context of trade negotiations. We also published 221 articles on our website – to sum up the most important events and developments of the year, here is a selection of the most popular ones.

We hope you will enjoy reading them!

Leaked document: EU Presidency calls for massive internet filtering (06.09.2017)
At the end of August, EDRi member Statewatch leaked a document that revealed that Estonia’s EU Presidency had been pushing the other Member States to strengthen indiscriminate internet surveillance and was working hard to push the Commission’s original proposal for the new copyright Directive further into the realms of illegality.
https://edri.org/leaked-document-eu-presidency-calls-for-massive-internet-filtering/

Last-ditch attack on e-Privacy Regulation in the European Parliament (24.10.2017)
In October, some Members of the European Parliament tried to overturn progress made on the e-Privacy Regulation.
https://edri.org/last-ditch-attack-on-e-privacy-regulation-in-the-european-parliament/

UN Rapporteur demands respect for freedom of expression online (14.06.2017)
In June, the United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, released a new report, which gave an overview of the problems for freedom of expression and opinion in the Telecommunications and Internet Access Sector.
https://edri.org/un-rapporteur-demands-respect-for-freedom-of-expression-online

School of Rock(ing) Copyright 2017: (Re-)united to #fixcopyright! (15.11.2017)
In September and October, EDRi, Communia and Wikimedia co-organised a series of copyright-related workshops: School of Rock(ing) Copyright. The goal of the workshops was to engage local activists, researchers and associations interested in copyright to create new spaces of action at the national and EU level.
https://edri.org/school-of-rocking-copyright-reunited

No, you can’t enjoy the music you paid for, says EU Parliament Committee (05.07.2017)
In July, a leaked European Parliament document exposed some of the most bizarre suggestions yet in the debates around the proposed new copyright rules in Europe.
https://edri.org/no-you-cant-enjoy-the-music-you-paid-for-says-eu-parliament/

Recklessly unclear Terrorism Directive creates significant risks for citizens’ security (16.02.2017)
In February, the European Parliament voted in favour of the EU Directive on combating terrorism. Weak, unclear, ambiguous wording in the Directive presents dangers for the rule of law, the right to privacy and freedom of opinion and expression.
https://edri.org/recklessly-unclear-terrorism-directive-creates-significant-risks-citizens-security/

Illegal surveillance against civil society continues in Macedonia (22.02.2017)
At the beginning of the year, Macedonian civil society organisations advocating for human rights and democracy came under increasing pressure by the authorities. They had previously been caught up in use of the state apparatus for massive illegal surveillance, including wiretapping of activists.
https://edri.org/illegal-surveillance-against-civil-society-continue-in-macedonia

AVMS Directive: It isn’t censorship if the content is mostly legal, right? (27.04.2017)
During the first half of 2017, we struggled with the reform of the Audiovisual Media Services Directive (AVMSD). After going through several legislative stages, the AVMSD is now being negotiated in trilogues.
https://edri.org/avmsd-it-isnt-censorship-if-the-content-is-mostly-legal/

PNR: EU Court rules that draft EU/Canada air passenger data deal is unacceptable (26.07.2017)
In July, the Court of Justice of the European Union (CJEU) confirmed that the EU/Canada deal on the collection and sharing of air travellers’ data breaches European law. This was the third time that the European Court has ruled against arrangements for the mandatory and indiscriminate storage of personal data.
https://edri.org/pnr-eu-court-rules-draft-eu-canada-air-passenger-data-deal-is-unacceptable/

Cross-border access to data: EDRi delivers international NGO position to Council of Europe (18.09.2017)
In September, a global coalition of civil society organisations submitted to the Council of Europe its comments on how to protect human rights when developing new rules on cross-border access to e-evidence. The Council of Europe is currently preparing an additional protocol to the Cybercrime Convention.
https://edri.org/cross-border-access-data-edri-delivers-international-ngo-position-council-europe/

EU Commission on FOI request: Incompetence or ill-intent? (31.05.2017)
The European Commission had major problems responding to our freedom of information (FOI) request for access to documents related to industry lobbying in Brussels surrounding the copyright reform.
https://edri.org/eu-commission-on-foi-request-incompetence-or-ill-intent/

20 Dec 2017

EDRi Awards 2017

By Joe McNamee

For the first time and with great solemnity, EDRi presents the first ever 4th edition of our annual awards.

1. The “Humpty Dumpty Award” for the most silly “statistics”

IAB & their silly statistics, with the honourable exception of the real statistics buried deep in the spin.

2. The Mark Zuckerbrot award for WTF

This award goes to a Member of the European Parliament (MEP) who is down with the kids and up with the facts, who explained to us that Facebook’s Mark “Zuckerbrot” “got the message” and now wants to regulate the morals of children online, just as the Audiovisual Media Services (AVMS) Directive demands. The same MEP has attacked EDRi publicly for opposing Google’s lobbying for internet filtering, arguing that we are supporting Google because we oppose them. No, we don’t understand either.

3. The cranial fracture facepalm award

Computer and Communications Industry Association’s (CCIA) insistence to a journalist that they are “not working on e-Privacy”, before immediately launching a lobbying campaign on e-Privacy and then paying a “consultant” for an “independent” study on e-Privacy.


Positive EDRi Awards

On a more serious note, we should also spare a thought for the wonderful people that are doing wonderful work at a difficult time.

4. The “Max Schrems” award

David Carroll for his fight against Cambridge Analytica;

The Court of Justice of the European Union (CJEU) itself, for prohibiting (again, having already done so in 2014) the untargeted mass storage of telecommunications data (in its Tele2 ruling), and for its clear Opinion on the EU/Canada PNR Agreement. Sadly, to “protect us” from lawbreakers, the European Commission and EU Member States continue to break the law in relation to both of these activities, as we reported (read more here & here).

5. The heroes who keep us energised award

We cannot name everybody, including last year’s awardees, but here are six top stars that are worth highlighting:

Tijn, Luca, Nina, Marlou and Joran

Who are they?

The students who organised the collection of 384 000 signatures to require an advisory referendum on the Dutch “dragnet” surveillance law. We need more people like them!

Finally, we want to recognise the amazing work that all of our members and other digital rights activists are doing in Europe and around the world.


Get inspired – dig into our 2017 top reads

Notable publications of the year:

Did you like them? Please, check previous EDRi awards:

EDRi awards 2016
EDRi awards 2015
EDRi awards 2014

(Contribution by Joe McNamee and Maryant Fernández Pérez)
