18 Oct 2017

Success Story: A win on Austrian surveillance legislation

By Epicenter.works

The security debate in many countries shows an alarming trend towards restricting fundamental rights that liberal societies have codified over the past centuries. Particularly in the field of surveillance, recent legislation often goes beyond what courts have deemed constitutional, and lacks any fact-based justification as to how the measures are supposed to prevent or mitigate serious crime. Advocacy groups rarely achieve a victory in this debate. Hence, it is worth taking a closer look at a recent example from Austria, where such legislation was prevented by EDRi member NGO epicenter.works.

In January 2017, the Austrian government agreed to a “security package”. The proposed legislation spanned several seemingly unconnected areas: the legalisation of government spyware, restrictions of the right to assembly, real-time CCTV systems, a centralised database of registered license plates (piggybacked on existing legislation for license plate tolls), IMSI catchers, a prohibition of anonymous pre-paid SIM cards, and the arrest of people who pose “potential threats” before conviction or even charges.
Epicenter.works published an analysis of the government agreement on the day of its release and started a campaign called “surveillance package”. In the course of seven months, 10 out of 12 proposed measures were abandoned. This is how epicenter.works achieved these outstanding results in its campaigning.

Separating the security debate from the surveillance spiral

Terrorism in Europe is much like a wasp trying to destroy a china shop. The wasp alone can never achieve its aim, but if people react with panic, the result will be destruction.

The hard truth is that modern terrorism with cars and self-made bombs can never be completely avoided. Political decisions are rarely guided by facts and by analysis of previous attacks to reduce the chance of future incidents (due to the political need to propose “something” to reassure the population). Instead, they rely on the assumption that increased surveillance automatically leads to more security. In an effort to stress that this is a fallacy, epicenter.works coined the term “surveillance package” and made it clear that these reforms were unlikely to bring about a higher level of security. By the end of the campaign, this wording had been picked up not only by the whole opposition, but also by the media, which was a great success.

The “surveillance package” proposed by the Austrian government had the same central weakness as most recently proposed surveillance measures in Europe: a complete lack of justification based on quantifiable criminological measurement. Part of the “surveillance package” campaign was to define an “objective security package” of measures that are actually adequate and important for security – such as multilingual police and better police training – none of which involve more surveillance powers. This builds on epicenter.works’ attempts to create a scientific framework for evaluating anti-terrorism measures.


When a government restricts the fundamental rights of its citizens, the burden of proof always lies with the legislator to show why those restrictions are effective, necessary, and proportionate. Questioning the security benefits of the proposed laws helped tremendously to expose them as ineffective, purely opportunism-driven surveillance legislation.

Combine jurisprudence, technology and design in one team

From the first draft of the government’s work programme to the proposed legislation, epicenter.works provided in-depth analysis from a legal and technical perspective. The complex analysis was complemented by single-page synopses of important questions and by infographics. Three press conferences, eight demonstrations in six cities, and thousands of tweets, videos, and images, as well as countless discussions with politicians and other stakeholders followed. The outcome was a combination of policy-driven and campaign-driven approaches that had great impact.

Make the many voices heard

For the first iteration of this campaign, epicenter.works developed a new telephone tool that allowed citizens to subscribe to daily calls with their representatives. In a second stage, the campaign publicised the parliamentary consultation on the proposed legislation to a wider public, eclipsing all previous consultations threefold with a total of 18 094 responses and crashing the consultation website. In the final stage, all 6 350 public statements were analysed and visibility was given to the arguments of institutions like the Austrian high court, the lawyers’ association, and the Red Cross.

Conclusion

The main lesson to take away from this is that one size doesn’t fit all. The “surveillance package” campaign was a success thanks to the diversity in methodologies, flexibility to react to new circumstances, and a multidisciplinary team of experts, grassroots campaigners, and media strategists. However, the most important detail that allowed this success was reframing the discussion away from surveillance and towards a broader security debate. In that new framing, privacy advocates will be able to propose more effective and fact-based solutions that do not rely on surveillance.

Surveillance package campaign (only in German)
https://überwachungspaket.at/

Requirements for a comprehensive surveillance footprint evaluation (only in German)
https://epicenter.works/thema/heat

(Contribution by EDRi member epicenter.works, Austria)

18 Oct 2017

Dutch NRA: T-Mobile may continue to violate net neutrality

By Bits of Freedom

EDRi member Bits of Freedom asked the Dutch national regulatory authority, the Authority for Consumers and Markets (ACM), to test mobile operator T-Mobile’s “Data-free Music” subscription against the new European rules that protect net neutrality, to verify whether the service infringes upon the principle of net neutrality. On 11 October 2017, the ACM ruled that T-Mobile is compliant with the law. Bits of Freedom disagrees and will challenge this decision.

The ACM states in its decision that T-Mobile does not discriminate against music services, “as long as they comply with the conditions that T-Mobile imposes” – which is exactly the problem! T-Mobile is imposing all kinds of conditions upon these music services and therefore decides which ones are entitled to preferential treatment. For example, if a music service does not want T-Mobile to use its logo, they are out of luck. If the music service is not able or willing to structure its systems to the whims of T-Mobile, they are left out in the cold.

The ACM also thinks that the freedom of the music service providers or consumers is not being constrained. That is hardly surprising when you realise that the ACM seems to have no regard for the impact of services like Data-free Music on the innovative nature of the internet. One of the powerful properties of the internet is precisely that every computer, and therefore every service and user, is as easy to reach as any other. A service like T-Mobile’s flies in the face of that idea by giving some services preferential treatment.

Based on the text of its decision, the ACM has also not taken into account what other operators will do if this is allowed and how that will impact the freedoms of internet users.


With this decision, the ACM undermines the innovative power of the internet for all of Europe. Because of the leading role that the Netherlands has played for years in the field of net neutrality, everybody in Europe is looking at enforcement in the Netherlands. In the rest of Europe, several supervisors are looking into similar subscriptions at other operators. But they have not ruled yet, waiting for the ACM’s decision. The fact that the ACM did not strictly enforce the principle of net neutrality is a missed opportunity. Bits of Freedom will challenge this decision.

Bits of Freedom has advocated for net neutrality for more than seven years. In 2012, the Netherlands was the first country in Europe to anchor net neutrality into law. Since the end of 2015, similar rules have applied across Europe. Since then, operators have been constantly testing the limits of these rules. T-Mobile did this with the Data-free Music service, among other things. Earlier, the ACM had already ruled that this service was out of line with the Dutch interpretation of these European rules. T-Mobile went to court over this and won. Bits of Freedom then filed an enforcement request. On 11 October 2017, the ACM decided on that request.

If you get access to the internet, you should get access to the entire internet. You, and not your operator, should get to decide what you do on the internet. A service like Data-free Music, a subscription in which music services can provide you with music without eating into your data plan, sounds like a great offer: listening to Spotify on your mobile phone without a risk of running out of data or paying more. And in the short term, it could be. But if operators offer one or more services at a cheaper price than their competitors, this yields an unfair advantage, taking away the opportunity for competitors and newcomers to compete for customers on their own merit.

T-Mobile’s subscription only works with a small number of music services. If a provider can’t or won’t comply with T-Mobile’s conditions, it is left out. Even though T-Mobile claims that every service can participate, the reality is more problematic: months after the introduction, only a handful of music services have joined.

Dutch NRA: T-Mobile may continue to violate net neutrality (11.10.2017)
https://www.bof.nl/2017/10/11/dutch-nra-t-mobile-may-continue-to-violate-net-neutrality/

T-Mobile may continue with Data-free Music (only in Dutch, 11.10.2017)
https://www.acm.nl/nl/publicaties/t-mobile-mag-dienst-datavrije-muziek-de-lucht-houden

(Contribution by EDRi member Bits of Freedom, the Netherlands)

18 Oct 2017

Extending the use of eID to online platforms – risks to privacy?

By Anne-Morgane Devriendt

On 10 October 2017, the European Commission published the “draft principles and guidance on eID interoperability for online platforms” on the electronic IDentification, Authentication and trust Services (eIDAS) observatory. Building on the eIDAS Regulation, the Commission would like to extend the use of eIDs from public services to online platforms. This raises a number of issues, particularly for the protection of privacy.

The eIDAS Regulation, adopted in 2014, is part of the “European eGovernment Action Plan 2016-2020”. It aims at making all Member State-issued eIDs recognisable by all Member States from 28 September 2017. By extending the scope of use of eIDs to “online platforms” in general, and not only public services, the Commission is trying to make authentication easier and more secure, as the eID itself would allow logging in. This would address some of the issues raised by the use of passwords as the main authentication method. It would also be more convenient for users, who could use the same eID across different platforms.

However, as presented in the Commission’s document, the guidelines raise a number of issues, such as the lack of a definition of “online platforms”. As the eIDAS Regulation concerns access to public services throughout the EU with the same, government-approved eID, it appears that “online platforms” refers to the private sector. “Online platforms” are defined, to a certain extent, in the Commission’s Communication on Online Platforms. However, the characteristics used there are so broad that they encompass both online sales websites and social media platforms.


The second issue is the protection of privacy. The draft document states that “users should be able to preserve a level of privacy and anonymity, e.g. by using a pseudonym”. The failure to understand the basic notion that anonymity and pseudonymisation are fundamentally different is worrying. It is, or should be, obvious that using one’s eID to authenticate oneself would allow the platform to link the pseudonym to the real identity and personal information. Furthermore, while it might be useful for online sale platforms to make sure transactions are taking place between real people, it defeats the purpose of using a pseudonym on social media, which is precisely to prevent online activities from being linked to one’s real identity.
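To illustrate why a pseudonym offers no anonymity towards the platform once the login itself is an eID authentication, here is a minimal Python sketch. It is purely hypothetical: the field names ("subject_id", "name") and data shapes are illustrative assumptions, not any real eIDAS implementation.

```python
# Hypothetical sketch: a platform that authenticates users via eID.
# Field names and values are illustrative assumptions.

accounts = {}  # pseudonym -> record held by the platform

def register_with_eid(pseudonym: str, eid_assertion: dict) -> None:
    # To authenticate the user at all, the platform receives a verified
    # eID assertion carrying a stable unique identifier and the legal
    # name, and stores them alongside the chosen pseudonym.
    accounts[pseudonym] = {
        "eid_subject": eid_assertion["subject_id"],
        "legal_name": eid_assertion["name"],
    }

register_with_eid("catlover42", {"subject_id": "AT-123456", "name": "Jane Doe"})

# The pseudonym hides the user from other users, but not from the platform
# (or from anyone who obtains the platform's database):
print(accounts["catlover42"]["legal_name"])  # -> Jane Doe
```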

Finally, while the Commission sets the direction of making authentication easier for both platforms and users through the use of the eID, it does not provide guidelines on the implementation of privacy by default. Such guidelines would make sure that online platforms only have access to authentication information and do not use it for other purposes. One of the safeguards for the use of eIDs to access public services is the ability to monitor which public servant accessed the data and when. However, regarding the use of eIDs for authentication on online platforms, there is no provision in the draft guidelines that would make sure the data are properly secured.

Bearing in mind the huge and varied damage caused to Facebook users by its “real names” policy, the risks of this project being misused by certain online platforms are real and significant.

All interested stakeholders can communicate their opinion on this draft to the Commission before 10 November 2017 through the eIDAS observatory post or by email.

Draft principles and guidance on eID interoperability for online platforms – share your views! (10.10.2017)
https://ec.europa.eu/futurium/en/blog/draft-principles-and-guidance-eid-interoperability-online-platforms-share-your-views

Workshop: Towards principles and guidance on eID interoperability for online platforms (24.04.2017)
https://ec.europa.eu/digital-single-market/en/news/workshop-towards-principles-and-guidance-eid-interoperability-online-platforms

Communication from the Commission – Online Platforms and the Digital Single Market: Opportunities and Challenges for Europe (25.05.2016)
https://ec.europa.eu/transparency/regdoc/rep/1/2016/EN/1-2016-288-EN-F1-1.PDF

(Contribution by Anne-Morgane Devriendt, EDRi intern)

17 Oct 2017

EU Council legal services unclear about censorship filters

By Diego Naranjo

On 16 October 2017, Politico leaked the response from the Legal Services of the Council of the European Union (CLS) to the questions raised by six member states about the legality of the upload filter proposed in Article 13 of the Copyright Directive proposal. As the censorship filter is about restricting fundamental rights, it is regrettable that the questions did not mention the rules under which restrictions can be imposed, namely Article 52.1 of the Charter of Fundamental Rights of the European Union. This means that key issues were not addressed in the CLS response.

In anticipation of being leaked, the document is carefully written to say nothing clearly, speculating instead about future interpretations, and ending with a “to be continued” clause, because, as the CLS correctly points out, the “confusing terms in which that [explanatory] recital is drafted raise various legitimate questions to which, regrettably, no answers are given…”

Censorship filters: Not in my name… or maybe yes

Paragraph 12 of the document repeats the views of the Court of Justice of the European Union (CJEU) regarding the upload filters proposed in the Sabam/Netlog and Telekabel cases (in both cases the Court ruled against filtering) by saying that the measures to be put in place to prevent copyright infringements should be “the most affordable and effective ones among those available, taking into account elements such as ‘the resources and abilities available’ to them, the specificities and the needs of their services, as well as their size”.

The document correctly states that content recognition technology (i.e. upload filtering) is not explicitly imposed by Article 13. This is because the article is designed to coerce companies into imposing the measure, rather than explicitly mandating that they do so. As a result, they can choose “through cooperation with the right holders, the most appropriate and proportionate measure to achieve the objective pursued”. Is that “provided for by law”, “necessary”, or does it “genuinely achieve objectives of general interest”, as required by Article 52.1 of the Charter of Fundamental Rights? The answer isn’t given because the question wasn’t asked. Are the rules for restricting fundamental rights “clear and precise”, as the Court has repeatedly insisted they must be? It is impossible to read previous rulings of the CJEU and come to the conclusion that Article 13 achieves minimum standards of predictability.

If you’re censored, please contact the company customer services

Moving away from the basic principle of law that fundamental rights are the rule and exceptions must be clearly provided for, the CLS builds on the notion of corporate rights. The rights of the internet companies are not restricted, because they will be wise enough to buy, and able to afford, the level of filter that the courts will deem appropriate. Can we realistically assume that such appropriate, affordable, necessary, proportionate, non-intrusive, accurate, up-to-date, multiple (text, image, video, audio…) filters exist?

The CLS states that there is no breach of freedom of expression if a company takes down legal content due to failures of the system, since you can always contact the relevant online platform. As long as the provider does not take the cheap and easy option of calling the upload a terms-of-service violation, as long as it does not answer too slowly, as long as the file identifiers it was given were correct, and as long as the usage was not covered by a copyright exception in one country but not in another, the CLS may have a point.

How this is going to work in practice, for thousands of uploads taking place every minute of every day, and with many examples already known of these technologies failing, is a mystery. Will users get used to the idea of having their blog posts, memes, videos or audio files censored by a company? Will this need to be taken to court once the redress mechanism fails? The legal services have no response to that.

Data protection risks? No… but maybe yes!

Later on, when analysing the concerns raised regarding the right to data protection being affected by upload filters, the CLS replies “no”. The logic is simple: restrictions of freedom of expression are not a breach of your freedom of expression, because you can complain (if the company does not take the easy option of claiming a terms-of-service violation). Similarly, this activity is not a breach of your data protection rights, if no personal data is processed. What is not explained is how you can complain to a company about it filtering out your uploads, if the company you are complaining to has processed no personal data and therefore does not know that it has filtered out your uploads.

Unsurprisingly, the document concludes that “in view of any technological developments, the envisaged proposed measures may lead to an interference with the right to personal data protection, further guarantees will need to be foreseen”.

Legal confusion regarding the e-Commerce Directive

When analysing whether the Copyright Directive impacts the e-Commerce Directive (contrary to the assertions of the European Commission), the CLS admits that the Commission’s proposal text gives no clear answer. The CLS says that “the confusing terms in which that recital is drafted raise various legitimate questions to which, regrettably, no clear answers are given”. The CLS seems comfortable with the notion that it is proportionate and legally possible to impose one (harsher) interpretation for copyright, through the Copyright Directive, while leaving a less harsh regime in place for terrorism and child abuse. It may well be legally possible for one set of words (the e-Commerce Directive) to end up with two different meanings. It may well be legally possible to interpret those words in a way that treats the upload of a meme to YouTube as worse than the upload of a beheading video. It may well be.

The document ends with two conclusions. First, the CLS replies to the Member States that asked about the upload filter proposals that the measures in Article 13 are not necessarily disproportionate (but they might be, and nobody actually knows what they will be). Secondly, it states that there is no legal certainty regarding recital 38 of the Copyright Directive and its impact on the e-Commerce Directive.

Leak: Council legal service supports more clarity on proposed copyright reform (16.10.2017)
http://www.politico.eu/wp-content/uploads/2017/10/copyright-document.pdf

Six states raise concerns about legality of Copyright Directive (05.09.2017)
https://edri.org/six-states-raise-concerns-about-legality-of-copyright-directive/

Leaked document: Questions from Member States to the Council legal services on the Censorship Machine (05.09.2017)
http://statewatch.org/news/2017/sep/eu-copyright-ms-questions.htm

EU countries question legality & attack on fundamental rights
http://copybuzz.com/analysis/eu-countries-question-legality-attack-fundamental-rights/

Proposed Copyright Directive – Commissioner confirms it is illegal (28.06.2017)
https://edri.org/proposed-copyright-directive-commissioner-confirms-it-is-illegal/

(Contribution by Diego Naranjo and Joe McNamee, EDRi)

17 Oct 2017

Privacy Camp 2018: Speech, settings and [in]security by design

By Kirsten Fiedler

Join us for the 6th annual Privacy Camp! Privacy Camp will take place on 23 January 2018 in Brussels, Belgium, just before the start of the CPDP conference. Privacy Camp brings together civil society, policy-makers and academia to discuss existing and looming problems for human rights in the digital environment. In the face of a “shrinking civic space” for collective action, the event aims to provide a platform for actors from across these domains to discuss and develop shared principles to address key challenges for digital rights and freedoms of individuals.

Our theme this year is “speech, settings and [in]security by design”. The two main tracks of the event will therefore focus on the security of devices and infrastructure on the one hand, and on areas and policies where legitimate speech is endangered on the other. Participate!

The first track will include sessions on state hacking and malware, law enforcement access to user data (so-called “e-evidence”), and device security, with hands-on tutorials on how to better protect your communications.

The second track will include sessions on algorithmic decision-making and discrimination via big data, privacy-invasive measures that censor legitimate speech online, and the hacking of democracies via the spread of misinformation and propaganda.

The event is co-organised by European Digital Rights (EDRi), Privacy Salon, USL-B Institute for European Studies and VUB-LSTS. Participation is free. Registrations will open in early December.

Participate!

We invite you to propose a panel for one of these two tracks:

Track 01 [in]security of devices
Topics: #statehacking #encryption #surveillance #statemalware #Eevidence #security #technopolitics

Track 02 [in]security of speech
Topics: #uploadfilters #censorship #algorithms #discrimination #accountability #hackingelections #misinformation #propaganda

When submitting your proposal:

  • Indicate a clear objective for your session: What would be a good outcome for you?
  • Indicate other speakers that could participate in your panel (and let us know which speaker has already confirmed, at least in principle, to participate).
  • Make it as participatory as possible: think about how to include the audience and diverse actors.
  • Send us a description of no more than 500 words.
  • Deadline for submissions is 20 November.

After the deadline, we will review your submission and let you know by 6 December whether your panel can be included in the programme. We may suggest merging proposals that are very similar.

Please send your proposal via email to Maren <edri.intern3(at)edri(dot)org>!

If you have questions, please contact Kirsten <kirsten.fiedler(at)edri(dot)org> or Imge <imge.ozcan(at)vub(dot)be>.

16 Oct 2017

EU’s plans on encryption: What is needed?

By Maryant Fernández Pérez

On 18 October 2017, the European Commission is expected to publish a Communication on counter-terrorism, which will include some lines on encryption.

What is encryption? Why is it important?

When we send an encrypted message (or store an encrypted document), no one but the intended recipient, who holds a unique key, can read it. So even if someone manages to intercept the message on its way to the recipient, they will not be able to read its contents without the key – they will only see something that looks like a random set of characters. Encryption ensures the confidentiality of our personal data and company secrets. This is not only essential for our democratic rights and freedoms, but it also promotes trust in our digital infrastructure and communications, which is vital for innovation and economic growth. For example, encryption is essential for securing online banking transactions and for protecting the confidentiality of journalists’ sources.
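As a minimal sketch of this property, the snippet below uses the Fernet recipe from the widely used Python cryptography library (the message contents and key handling are illustrative only): without the key, the intercepted ciphertext is just an opaque string.

```python
# Minimal sketch of symmetric encryption with the Python "cryptography"
# library (pip install cryptography). Message contents are illustrative.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()        # the unique key shared with the recipient
token = Fernet(key).encrypt(b"the documents are ready")

print(token)                       # looks like a random set of characters

# The intended recipient, holding the key, recovers the message:
print(Fernet(key).decrypt(token))  # b'the documents are ready'

# An interceptor without the key gets nothing usable:
try:
    Fernet(Fernet.generate_key()).decrypt(token)
except InvalidToken:
    print("unreadable without the right key")
```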

Encryption workarounds needed?

The European Commission has come under pressure from some EU Member States to take action to address the perceived problem of data not being available to law enforcement authorities due to encryption. This issue is frequently hyped as a major problem, and certain politicians have suggested simplistic and counter-productive policies to weaken encryption as a “solution” to it.

There are several techniques that law enforcement authorities use to access encrypted data. One approach consists of obtaining the key to decrypt the data, for instance through a physical search when the key is saved on a USB drive. The key can also be obtained from the user directly, for example via social engineering or a legal obligation. Another approach is to bypass the key and access the decrypted data by exploiting a flaw or weakness in the system, or by installing software or spyware. However, the existence of workarounds does not mean that law enforcement should resort to them, nor that doing so would be necessary or proportionate, or even compatible with human rights law.


From a technical point of view, encryption cannot be weakened “just a little”, without potentially introducing additional vulnerabilities, even if unintentionally. When there is a vulnerability, anyone can take advantage of it, not just the police or intelligence services. Sooner or later, a secret vulnerability will be exploited by a malicious user, perhaps the same one it was meant to be safeguarding us from. Law enforcement aims are legitimate. However, as pointed out by the European Union Agency for Network and Information Security (ENISA), limiting the use of encryption will create vulnerabilities, lower trust in the economy and damage civil society and industry alike.

What should the European Union do?

A more balanced approach is needed, which avoids much of the rhetoric that is often heard in relation to encryption. Such an approach would recognise a variety of options for addressing this issue without compromising everybody’s security or violating human rights.

Saying “no” to backdoors is a step in the right direction, but not the end of the debate, as there are still many ways to weaken encryption. The answer to security problems like those created by terrorism cannot be the creation of new security risks. On the contrary, the EU should focus on stimulating the development and use of high-grade standards for encryption, and not in any way undermine the development, production or use of high-grade encryption.

We are concerned about the potential inclusion of certain aspects in the forthcoming Communication, such as an increase in Europol’s capabilities and what this may entail, and references to the removal of allegedly “terrorist” content without accountability, in line with the Commission’s recent Communication on tackling illegal content online. We remain vigilant regarding developments in the field of counter-terrorism.

Read more:

Encryption – debunking the myths (03.05.2017)
https://edri.org/encryption-debunking-myths/

EDRi delivers paper on encryption workarounds and human rights (20.09.2017)
https://edri.org/edri-paper-encryption-workarounds/

EDRi position paper on encryption (25.01.2016)
https://www.edri.org/files/20160125-edri-crypto-position-paper.pdf

How the internet works (23.01.2012, available in six languages)
https://edri.org/files/2012EDRiPapers/how_the_internet_works.pdf

16 Oct 2017

Civil society calls for the deletion of the #censorshipmachine

By EDRi

Today, 16 October, European Digital Rights (EDRi), together with 56 other civil society organisations, sent an open letter to EU decision-makers. The letter calls for the deletion of Article 13 of the Copyright Directive proposal, pointing out that the monitoring and filtering of internet content that it proposes would breach citizens’ fundamental rights.

The proposals in the Copyright Directive would relegate the European Union from a digital rights defender in global internet policy discussions to the leader in dismantling fundamental rights, to the detriment of internet users around the world,

said Joe McNamee, Executive Director of EDRi.

The censorship filter proposal would apply to all online platforms hosting any type of user-uploaded content, such as YouTube, WordPress, Twitter, Facebook, Dropbox, Pinterest or Wikipedia. It would coerce platforms into installing filters that prevent users from uploading copyrighted materials. Such a filter would require the monitoring of all uploads and would be unable to differentiate between copyright infringements and legitimate uses of content authorised by law. It undermines legal certainty for European businesses, as it creates legal chaos and offers censorship filters as a solution.
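As a toy illustration of why a filter cannot make that differentiation, consider the deliberately simplified Python sketch below. It is not how any real content recognition product works (real systems use fuzzy perceptual matching rather than exact hashes), but the structural problem it shows is the same: the filter sees only the uploaded bytes, never the legal context of the use.

```python
# Toy upload filter: matches uploads against fingerprints of protected works.
# Deliberately simplified; the placeholder bytes are illustrative.
import hashlib

protected = {hashlib.sha256(b"<bytes of a protected work>").hexdigest()}

def upload_allowed(file_bytes: bytes) -> bool:
    # The only input is the content itself; whether the uploader is a
    # licensee, a quoter, a parodist or an infringer is unknowable here.
    return hashlib.sha256(file_bytes).hexdigest() not in protected

# Identical bytes are blocked for everyone: infringing copies, licensed
# copies and uses covered by a copyright exception alike.
print(upload_allowed(b"<bytes of a protected work>"))  # False in all cases
```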

The letter points out that the censorship filter proposal of Article 13:

  1. would violate the right to freedom of expression set out in the Charter of Fundamental Rights;
  2. would provoke such legal uncertainty that online services would have no other option than to monitor, filter and block EU citizens’ communications; and
  3. would include obligations on internet companies that would be impossible to respect without imposing excessive restrictions on citizens’ fundamental rights.

Read the letter below.

In September 2016, the European Commission published its proposal for a new Copyright Directive that aims at modernising EU copyright rules. The proposal has received mixed responses so far in the European Parliament and is awaiting a vote in the Parliament’s Legal Affairs Committee.

 


Article 13 – Monitoring and Filtering of Internet Content is Unacceptable
Open letter

Dear President Juncker,
Dear President Tajani,
Dear President Tusk,
Dear Prime Minister Ratas,
Dear Prime Minister Borissov,
Dear Ministers,
Dear MEP Voss,
Dear MEP Boni,

The undersigned stakeholders represent fundamental rights organisations.

Fundamental rights, justice and the rule of law are intrinsically linked and constitute core values on which the EU is founded. Any attempt to disregard these values undermines the mutual trust between member states required for the EU to function. Any such attempt would also undermine the commitments made by the European Union and national governments to their citizens.

Article 13 of the proposal on Copyright in the Digital Single Market includes obligations on internet companies that would be impossible to respect without the imposition of excessive restrictions on citizens’ fundamental rights.

Article 13 introduces new obligations on internet service providers that share and store user-generated content, such as video or photo-sharing platforms or even creative writing websites, including obligations to filter uploads to their services. Article 13 appears to provoke such legal uncertainty that online services will have no other option than to monitor, filter and block EU citizens’ communications if they are to have any chance of staying in business.

Article 13 contradicts existing rules and the case law of the Court of Justice. The Directive on electronic commerce (2000/31/EC) regulates the liability of those internet companies that host content on behalf of their users. According to the existing rules, there is an obligation to remove any content that breaches copyright rules, once this has been notified to the provider.

Article 13 would force these companies to actively monitor their users’ content, which contradicts the ‘no general obligation to monitor’ rules in the Electronic Commerce Directive. The requirement to install a system for filtering electronic communications has twice been rejected by the Court of Justice, in the cases Scarlet Extended (C 70/10) and Netlog/Sabam (C 360/10). Therefore, a legislative provision that requires internet companies to install a filtering system would almost certainly be rejected by the Court of Justice because it would contravene the requirement that a fair balance be struck between the right to intellectual property on the one hand, and the freedom to conduct business and the right to freedom of expression, such as to receive or impart information, on the other.

In particular, the requirement to filter content in this way would violate the freedom of expression set out in Article 11 of the Charter of Fundamental Rights. If internet companies are required to apply filtering mechanisms in order to avoid possible liability, they will. This will lead to excessive filtering and deletion of content and limit the freedom to impart information on the one hand, and the freedom to receive information on the other.

If EU legislation conflicts with the Charter of Fundamental Rights, national constitutional courts are likely to be tempted to disapply it and we can expect such a rule to be annulled by the Court of Justice. This is what happened with the Data Retention Directive (2006/24/EC), when EU legislators ignored compatibility problems with the Charter of Fundamental Rights. In 2014, the Court of Justice declared the Data Retention Directive invalid because it violated the Charter.

Taking into consideration these arguments, we ask the relevant policy-makers to delete Article 13.

Signatories:

Civil Liberties Union for Europe (Liberties)
European Digital Rights (EDRi)

Access Info
ActiveWatch
Article 19
Associação D3 – Defesa dos Direitos Digitais
Associação Nacional para o Software Livre (ANSOL)
Association for Progressive Communications (APC)
Association for Technology and Internet (ApTI)
Association of the Defence of Human Rights in Romania (APADOR)
Associazione Antigone
Bangladesh NGOs Network for Radio and Communication (BNNRC)
Bits of Freedom (BoF)
BlueLink Foundation
Bulgarian Helsinki Committee
Center for Democracy & Technology (CDT)
Centre for Peace Studies
Centrum Cyfrowe
Coalizione Italiana Libertà e Diritti Civili (CILD)
Code for Croatia
COMMUNIA
Culture Action Europe
Electronic Frontier Foundation (EFF)
epicenter.works
Estonian Human Rights Centre
Freedom of the Press Foundation
Frënn vun der Ënn
Helsinki Foundation for Human Rights
Hermes Center for Transparency and Digital Human Rights
Human Rights Monitoring Institute
Human Rights Watch
Human Rights Without Frontiers
Hungarian Civil Liberties Union
Index on Censorship
International Partnership for Human Rights (IPHR)
International Service for Human Rights (ISHR)
Internautas
JUMEN – Human Rights Work in Germany
Justice & Peace
La Quadrature du Net
Media Development Centre
Miklos Haraszti (Former OSCE Media Representative)
Modern Poland Foundation
Netherlands Helsinki Committee
One World Platform
Open Observatory of Network Interference (OONI)
Open Rights Group (ORG)
OpenMedia
Panoptykon
Plataforma en Defensa de la Libertad de Información (PDLI)
Reporters without Borders (RSF)
Rights International Spain
South East Europe Media Organisation (SEEMO)
South East European Network for Professionalization of Media (SEENPM)
Statewatch
The Right to Know Coalition of Nova Scotia (RTKNS)
Xnet

CC: Permanent and Deputy Permanent Representatives of the Members States to the EU
CC: Chairs of the JURI and LIBE Committees in the European Parliament
CC: Shadow Rapporteurs and MEPs in the JURI and LIBE Committees in the European Parliament
CC: Secretariats of the JURI and LIBE Committees in the European Parliament
CC: Secretariat of the Council Working Party on Intellectual Property (Copyright)
CC: Secretariat of the Council Working on Competition
CC: Secretariat of the Council Research Working Party


Read more:

Article 13 Open letter – Monitoring and Filtering of Internet Content is Unacceptable (16.10.2017)
https://www.liberties.eu/en/news/delete-article-thirteen-open-letter/13194

Over 50 Human Rights & Media Freedom NGOs ask EU to Delete Censorship Filter & to Stop Copyright Madness (16.10.2017)
http://copybuzz.com/copyright/50-human-rights-media-freedom-ngos-ask-eu-delete-censorship-filter-stop-madness/

Deconstructing the Article 13 of the Copyright proposal of the European Commission, revision 2
https://edri.org/files/copyright/copyright_proposal_article13.pdf

The Copyright Reform: a guide for the perplexed
https://edri.org/files/copyright/Copyright_guide_for_the_perplexed.pdf

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

Six states raise concerns about legality of Copyright Directive (05.09.2017)
https://edri.org/six-states-raise-concerns-about-legality-of-copyright-directive/

Proposed Copyright Directive – Commissioner confirms it is illegal (28.06.2017)
https://edri.org/proposed-copyright-directive-commissioner-confirms-it-is-illegal/

13 Oct 2017

Europe’s governments win the Big Brother Awards 2017 for opening Pandora’s box of surveillance

By EDRi

On Friday 13 October, the annual Belgian Big Brother Awards – a negative prize for the worst privacy abuser of the year – took place in Brussels. The jury awarded the European trend of state hacking, European Digital Rights’ (EDRi) nomination, the title of the ultimate privacy villain. The public voted for automatic number-plate recognition (ANPR) cameras as their “favourite”.

State hacking has rapidly become a very powerful tool for intelligence services in recent years. Europe’s governments have been expanding the possibilities for states to spy on their own citizens, pushing the fashion of “insecurity by design”. In 2017, the Belgian government followed this trend and adopted legislation that gives law enforcement authorities permission to access computers and other devices remotely. It adopted a text modifying the law on “security and intelligence services”, granting the authorities broad new surveillance powers. To put it simply, it legalised the most intrusive form of government hacking.

“Government hacking affects people’s privacy rights and freedom of expression in new and deeply invasive ways – it also means an undermining of the security of the internet. Governments engaged in such activities have systematically failed to implement minimum safeguards for human rights”, said Kirsten Fiedler, Managing Director of EDRi.


“The WannaCry attack has highlighted that there are serious repercussions when known vulnerabilities are not immediately reported and fixed. Current practices of the intelligence services are damaging not only for the security of European citizens, but also for businesses, public administrations and critical infrastructures – like hospitals, schools and public transport”, she added.

What is state or government hacking?

Hacking means the manipulation of software, data, computer systems, networks, or other electronic devices without the permission of the person or organisation responsible. One example is malicious software developed by a government, often relying on software flaws that are not publicly known. This means that those flaws remain open and available for criminals to exploit. Governments hack devices with the aim of monitoring computer activities and gaining access to sensitive data.

In 2014, it was revealed that the British intelligence service, the Government Communications Headquarters (GCHQ), had hacking capabilities to activate a device’s microphone or webcam, to identify the location of a device, to collect login details and passwords for websites, and to record internet browsing histories on a device. The German intelligence service developed similar software, which was discovered in 2011 by EDRi member Chaos Computer Club (CCC). In March 2017, the Belgian government gave its services the power to remotely access citizens’ devices and install malware (see Art. 38, Law modifying the law from 30 November 1998 governing the intelligence and security services).

Why is government hacking a problem?

Giving intelligence services such powers makes it difficult for individuals to protect their personal data, and for companies to protect their trade secrets, from these kinds of attacks. Moreover, it allows foreign intelligence services to spy on state secrets more effortlessly, and it opens a Pandora’s box for third parties to access and control critical infrastructures – this could, for example, plunge hospitals into chaos. It also gives governments an incentive not to report software vulnerabilities they are aware of, facilitating crime in the name of fighting crime.

EDRi’s paper “Encryption Workarounds – A digital rights perspective” (pages 9-11) includes proposals for safeguards that need to be met to provide adequate protection of fundamental rights in cases of government hacking.

The Big Brother Awards are based on a concept created by EDRi member Privacy International. The goal is to draw attention to violations of privacy. The Belgian Big Brother Awards is organised by EDRi member Liga Voor Mensenrechten in collaboration with PROGRESS Lawyers Network (PLN), Datapanik.org, La Ligue des droits de l’Homme (LDH) and European Digital Rights (EDRi).

Belgian Big Brother Awards 2017
https://bigbrotherawards.be/en/

Encryption Workarounds – A digital rights perspective
https://edri.org/files/encryption/workarounds_edriposition_20170912.pdf

Big Brother Awards Belgium: Facebook is the privacy villain of the year (06.10.2016)
https://edri.org/bba-belgium-2016/

05 Oct 2017

Dear MEPs: We need you to protect our privacy online!

By EDRi

They’re hip, they’re slick and they follow you everywhere. They know you like new shoes, playing tennis and tweeting at odd hours of the morning. Do you know what that says about your health, your relationships and your spending power? No? Well, the online companies do. They follow you everywhere you go online, they have a perfect memory, they know the sites you visited last year even if you’ve forgotten… Look who’s stalking.

European legislation protecting your personal data was updated in 2016, but the battle to keep it safe is not over yet. The European Union is revising its e-Privacy rules. We welcomed the European Commission (EC) proposal as a good starting point, albeit with room for improvement. The online tracking industry is lobbying fiercely against it. Online tracking and profiling gave us filter bubbles and echo chambers. Yet the industry lobbies for them under the pretext of “saving the internet”, “protecting quality journalism” – even “saving democracy”.

The European Parliament is currently debating its position on the EC proposal. Some Members of the European Parliament (MEPs) support “tracking business, as usual” while others support a strong future-proof norm to protect the privacy, innovation and security of future generations of EU citizens and businesses.

Priorities for defending privacy and security:

1) Protect confidentiality of our communications – both in transit and at rest!
Confidentiality of communications needs to be protected both in transit and at rest. Lobbyists have been campaigning for a technicality that would allow them to read and exploit your emails stored in the cloud. (Art. 5)

2) Protect our privacy: Do not add loopholes to security measures!
A “legitimate interest” exception was not included in any version of the previous e-Privacy Directives. Adding one now would be a major weakening of the legislation compared with the existing rules. Our member Bits of Freedom wrote about the problems with “legitimate interest” here. (several Articles and Recitals)

3) Do not let anyone use our data without asking for our consent!
It is crucial to keep consent as the legal ground for processing communications data. Neither “legitimate interest” nor “further processing” should be allowed to weaken the security and privacy of European citizens and businesses. (Art. 6)

4) Privacy should not be an option – what we need is privacy by default!
Provisions about default privacy settings need to be strengthened and improved, certainly not watered down or deleted. e-Privacy must ensure “privacy by design and by default” and not, as in the EC proposal, “privacy by option”. You can find our specific proposals here. The European Parliament previously adopted a Directive that criminalises unauthorised access to computer systems. It would be completely incoherent if it were to adopt legislation that foresees default settings that do not protect against unauthorised access to devices. (Art. 10)

5) No new exceptions to undermine our privacy!
Exceptions for Member States cannot become a carte blanche rendering e-Privacy useless. Therefore, the safeguards established by the Court of Justice of the European Union on cases regarding the exceptions in the relevant sections of the e-Privacy Regulation should be diligently respected – the scope of the exception should not be expanded. (Art. 11)

6) Do not undermine encryption!
Imposing a ban on undermining or attacking encryption should be a priority.

7) Protect our devices (hardware+software) by design and by default!
Hardware and software security need to be protected by design and by default.

MEPs, protect our #ePrivacy – Support amendments that follow the principles listed above!

e-Privacy revision: Document pool
https://edri.org/eprivacy-directive-document-pool/

e-Privacy: Consent (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_consent.pdf

e-Privacy: Legitimate interest (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_legitimate-interest.pdf

e-Privacy: Privacy by design and by default (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_privacy-by-default.pdf

e-Privacy: Offline tracking (pdf)
https://edri.org/files/eprivacy/e-privacy-onepager_offline-tracking.pdf

Your privacy, security and freedom online are in danger (14.09.2016)
https://edri.org/privacy-security-freedom/

Five things the online tracking industry gets wrong (13.09.2017)
https://edri.org/five-things-the-online-tracking-industry-gets-wrong/

ePrivacy Regulation: Call a representative and make your voice heard!
https://eprivacy.laquadrature.net/-piphone/

Who’s afraid of… e-Privacy? (04.10.2017)
https://medium.com/@privacyint/whos-afraid-of-e-privacy-7969a1cfe776

04 Oct 2017

ENDitorial: Tinder and me: My life, my business

By Maryant Fernández Pérez

Tinder is one of the many online dating companies of the Match Group. Launched in 2012, Tinder became profitable as of 2015, largely thanks to people’s personal data. On 3 March 2017, journalist Judith Duportail asked Tinder to send her all the personal data it had collected about her, including her “desirability score”, which is composed of the “swipe-left-swipe-right” ratio and many other pieces of data and mathematical formulae that Tinder does not disclose. Thanks to her determination and the support of lawyer Ravi Naik, privacy expert Paul-Olivier Dehaye and the work of Norwegian consumer advocates, Judith reported on 27 September 2017 that she had received 800 pages about her online dating-related behaviour.


Tinder did not, however, disclose how desirable the company considered Duportail to be, even though it had disclosed such a score to another journalist. The 800 pages contained information such as her Facebook “likes”, her Instagram pictures (even though she had deleted her account), her education, how many times she had connected to Tinder, and when and where she entered into online conversations, among many other things. “I was amazed by how much information I was voluntarily disclosing”, Duportail stated.

800 pages of personal data – surprising?

As a Tinder user, you should know that you “agree” to Tinder’s terms of use, privacy policy and safety tips, as well as other terms disclosed if you purchase “additional features, products or services”. These include the following:

  • “You understand and agree that we may monitor or review any Content you post as part of a Service.”
  • “If you chat with other Tinder users, you provide us the content of your chats.”
  • “We do not promise, and you should not expect, that your personal information, chats, or other communications will always remain secure.”
  • “By creating an account, you grant to Tinder a worldwide, transferable, sub-licensable, royalty-free, right and license to host, store, use, copy, display, reproduce, adapt, edit, publish, modify and distribute information you authorize us to access from Facebook, as well as any information you post, upload, display or otherwise make available (collectively, ‘post’) on the Service or transmit to other users (collectively, ‘Content’).”
  • “You agree that we, our affiliates, and our third-party partners may place advertising on the Services.”
  • “If you’re using our app, we use mobile device IDs (the unique identifier assigned to a device by the manufacturer), or Advertising IDs (for iOS 6 and later), instead of cookies, to recognize you. We do this to store your preferences and track your use of our app. Unlike cookies, device IDs cannot be deleted, but Advertising IDs can be reset in “Settings” on your iPhone.”
  • “We do not recognize or respond to any [Do Not Track] signals, as the Internet industry works toward defining exactly what DNT means, what it means to comply with DNT, and a common approach to responding to DNT.”
  • “You can choose not to provide us with certain information, but that may result in you being unable to use certain features of our Service.”

Tinder explains in its Privacy Policy – but not in the summarised version of the terms – that you have a right to access and correct your personal data. What is clear to the company is that you “voluntarily” provided your information (and that of others). Duportail received part of the information Tinder and its business partners hold, no doubt partly because she is a journalist. Her non-journalist friends have not experienced the same benevolence. Your personal data affects not only your online dates, “but also what job offers you have access to on LinkedIn, how much you will pay for insuring your car, which ad you will see in the tube and if you can subscribe to a loan”, Paul-Olivier Dehaye highlights.

Worse still, even if you close your account or delete information, Tinder and its business partners do not necessarily delete it. And worst of all, you’ve “agreed” to it: “If you close your account, we will retain certain data for analytical purposes and recordkeeping integrity, as well as to prevent fraud, enforce our Terms of Use, take actions we deem necessary to protect the integrity of our Service or our users, or take other actions otherwise permitted by law. In addition, if certain information has already been provided to third parties as described in this Privacy Policy, retention of that information will be subject to those third parties’ policies.”

You should be in control

Civil society organisations fight these kinds of practices to defend your rights and freedoms. For instance, the Norwegian Consumer Council successfully campaigned for Tinder to change its terms of service. On 9 May 2017, EDRi and its member Access Now raised awareness about period trackers, dating apps like Tinder or Grindr, sex extortion via webcams and the “internet of (sex) things” at the re:publica 17 conference. Ultimately, examples like Duportail’s show the importance of having strong EU data protection and privacy rules. Under the General Data Protection Regulation, you have a right to access your personal data, and companies should provide privacy by design and by default in their services. Now, we are working on the e-Privacy Regulation to ensure that you give real consent instead of ticking a box for something you never read, and to prevent companies from tracking you unless you provide express and specific consent, among many other things.

Now that you know about this or have been reminded of this, spread the word! It does not matter whether you are on Tinder or not. This is about your online future.

I asked Tinder for my data. It sent me 800 pages of my deepest, darkest secrets (26.09.2017)
https://www.theguardian.com/technology/2017/sep/26/tinder-personal-data-dating-app-messages-hacked-sold

Getting your data out of Tinder is really hard – but it shouldn’t be (27.09.2017)
https://www.theguardian.com/technology/2017/sep/27/tinder-data-privacy-tech-eu-general-data-protection-regulation

Safer (digital) sex: pleasure is just a click away (09.05.2017)
https://re-publica.com/en/17/session/safer-digital-sex-pleasure-just-click-away

Tinder bends for consumer pressure (30.03.2017)
https://www.forbrukerradet.no/siste-nytt/tinder-bends-for-consumer-pressure

(Contribution by Maryant Fernández Pérez, EDRi)
