24 Aug 2016

France and Germany: Fighting terrorism by weakening encryption

By Heini Järvinen

On 23 August, the French and German Ministers of Interior met in Paris to discuss an initiative that would extend surveillance in Europe and weaken encryption, in the name of the fight against terrorism.

Speaking at a joint press conference, French Minister of Interior Bernard Cazeneuve and his German counterpart Thomas de Maizière called for legislation that would force intermediaries to weaken encryption standards. This would, according to Cazeneuve, allow us to “truly arm our democracies on the issue of encryption”. Cazeneuve also explained that they want to oblige internet companies to censor illegal content, apparently failing to notice that the European e-Commerce Directive already contains this obligation.

The French Ministry of Interior explained its intentions in a tweet: the country plans to ask the EU Commission to put forward an EU-wide measure that would oblige online companies, such as WhatsApp or Telegram, to decrypt communications within the context of police investigations – even if the company’s seat is not in Europe. The upcoming review of the ePrivacy Directive is very likely to become the next encryption battlefield.

These plans do not meet the approval of the French data protection authority CNIL which stated in an Op-Ed in the French newspaper Le Monde that the call for encryption backdoors “is not taking into account the importance of encryption for our security online”.

Cazeneuve first announced his intention to “launch a European initiative, leading to a more international plan that will permit us to face this new challenge” after a French government meeting on security on 11 August. Now Cazeneuve and de Maizière hope to have the issue on the agenda for the next meeting of European leaders in Bratislava on 16 September.

French intelligence services claim to be struggling to intercept messages from Islamist extremists. However, many of the suspects of recent terrorist attacks were using unencrypted SMS, and were already known to the authorities. The investigation into the Brussels attacks of March 2016 revealed that inefficient intelligence and police work was one of the key reasons the attacks were not prevented.

Support our work with a one-off donation! https://edri.org/donate/

Encrypted messaging services, such as Telegram, WhatsApp or Signal, can be used for sending text messages, videos and voice messages with a very high level of security. It is extremely difficult for anyone but the authorised recipient to read or view messages sent using end-to-end encryption. Today, encryption is used widely across the web to secure e-commerce, banking and many other online services, as well as by journalists, whistleblowers, civil rights defenders and others who need to maintain the confidentiality of their communications. As the Report of the United Nations (UN) Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, put it,

encryption and anonymity enable individuals to exercise their rights to freedom of opinion and expression in the digital age and, as such, deserve strong protection.
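The principle described above can be illustrated with a toy sketch in Python: only the holder of the secret key can recover the message, while anyone intercepting the ciphertext sees only noise. This one-time-pad XOR is a teaching illustration only, not the authenticated public-key protocol that apps like Signal or WhatsApp actually use; all names in it are illustrative.

```python
# Toy sketch of the "only the key holder can read it" property of
# encryption. A one-time pad XORs the message with a random key of the
# same length. NOT the real protocol used by messaging apps.
import secrets


def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-time pad: key must be as long as the data it protects.
    assert len(key) == len(data)
    return bytes(k ^ d for k, d in zip(key, data))


message = b"meet at noon"
# The key is shared only between sender and recipient.
key = secrets.token_bytes(len(message))

ciphertext = xor_cipher(key, message)          # what an intermediary sees
recovered = xor_cipher(key, ciphertext)        # XOR is its own inverse

assert recovered == message
```

Without the key, the ciphertext is statistically indistinguishable from random bytes, which is why a legal obligation to "decrypt" properly end-to-end-encrypted traffic amounts to an obligation to weaken or backdoor the system itself.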

France says fight against messaging encryption needs worldwide initiative (11.08.2016)

Paris wants a global action on encrypted communications (only in French, 11.08.2016)

Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye (22.05.2015)

Franco-German initiative on European interior security (only in French, 23.08.2016)

Tweet by the French Ministry of Interior (only in French, 23.08.2016): https://twitter.com/Place_Beauvau/status/767998554153648129

Fight against terrorism: Cazeneuve and Maiziere meet in Paris (only in French, 23.08.2016)

French minister: Apps like Telegram must be decrypted for legal probes (23.08.2016)

Attacking encryption to fight terrorism is a wrong target (only in French, 23.08.2016)



05 Jul 2016

PROCEED WITH CAUTION: Flexibilities in the General Data Protection Regulation

By Diego Naranjo

We regret that much of the ambition of the original data protection package was lost, due to one of the biggest lobbying campaigns in European history. However, we congratulate the European Parliament for saving the essence of European data protection legislation.[1]

On 14 April 2016, the European Parliament adopted two legal instruments that will regulate the fundamental right to data protection of individuals: the General Data Protection Regulation (GDPR) and the Law Enforcement Data Protection Directive (LEDP).

Despite the overall positive outcome of the GDPR, we regret that many of the initial high expectations for the Regulation were not realised. Once the final text was passed, and ahead of the preparation of guidelines for its implementation, we published two documents in which we analyse the numerous national flexibilities contained in the text of the Regulation. The results can be found here (the full analysis of all the flexibilities) and here (a short document with the most dangerous flexibilities).


The analysis looks at the key pitfalls to be avoided in transposing these national flexibilities into Member State law. The task is huge, bearing in mind that there are almost as many provisions in which Member States can implement the Regulation differently as there were articles in the preceding Data Protection Directive. Some of the flexibilities are harmless, but many others could be perceived by governments as opportunities to ignore essential elements of the Regulation.

We hope that this analysis can help national governments and data protection authorities to implement the GDPR in a way which protects the essence of the right to data protection by implementing the most privacy friendly interpretation of these flexibilities.

Although this analysis is a shared effort of several EDRi members and EDRi staff, we would like to give our heartfelt thanks to Chris Pounder for the initial analysis of flexibilities in the Regulation and Douwe Korff for his extensive assessment of the options available.

[1] Press Release: Vote on Data Protection and Passenger Name Record package (13.04.2016)


01 Jun 2016

The lobby-tomy 7: Not all roads lead to privacy

By Guest author

Within the privacy world, different schools of thought exist. Connecting different viewpoints to a seemingly positive ideology is also a sales technique.

The new European data protection regulation is the most lobbied piece of legislation thus far. This is because the subject is very important and touches upon almost every aspect of our daily lives. Therefore, EDRi member Bits of Freedom used the Dutch freedom of information act to ask the government to publish all the lobby documents it received on this new law. Bits of Freedom published these documents on its website, with analysis, in a series of blogs. Which parties lobby? What do they want? What does that mean for you? These nine articles have now been translated into English for the EDRi-gram. This is part 7.

If one school of thought has successfully been put in the limelight, it is the “risk-based approach”. It means that when policy makers formulate obligations for industry, they should take the identifiable risks of data processing into account. Strict obligations should only accompany identified large risks. But that can’t be an excuse to create a lower level of protection for people.


If we read the lobby letters correctly, one of the most important offices behind this approach is the Centre for Information Policy Leadership of the law firm Hunton & Williams LLP. Although the term is older, the Centre launched a “risk-based approach framework” in January 2014, after which the subject has resurfaced repeatedly.

The data protection regulation creates new obligations for organisations that plan to process a certain quantity of data. An organisation is, for example, required to carry out a “privacy impact assessment” before processing data, in which it has to evaluate the consequences of the processing for people’s privacy. In some cases, the processing must be notified to the data protection authority. Apart from that, organisations should have a data protection officer, who handles supervision of all privacy-related issues internally. Furthermore, organisations are required to notify data breaches to the people whose data are affected.

Companies are not happy about this. We already mentioned in a previous blog that these are the themes that have been lobbied on the most. They say, briefly: allow us to fulfil those obligations only if it is to mitigate large and already identified risks.


It isn’t surprising that many of the “usual suspects” support this risk-based approach. TechAmerica Europe, an organisation that represented the interests of European technology companies “with American parentage”, strongly supported it. Banks also welcome such an approach, as shown in their email to the Dutch embassy to the EU (the so-called “permanent representation”). Thuiswinkel.org, a Dutch e-commerce trade association, says in an email to the Dutch Ministry of Justice: “The current reforms are not adequate enough in the eyes of Thuiswinkel.org, in particular because the proposals lack a ‘risk-based’ approach.” Even the Royal Academy for Sciences seems to be a proponent of this approach.


To strengthen their arguments, different parties use “commitment and consistency”. The trick here is that people like to present one unambiguous image of themselves, so they will want to act in ways that are congruent with their earlier statements. Therefore, the Centre for Information Policy Leadership uses statements by influential politicians, from the very group it is trying to influence, who have been positive about the risk-based approach.

In a letter by the Centre for Information Policy Leadership to the Ministry of Justice, European Commissioner Viviane Reding is quoted as a proponent of the risk-based approach, just like the Council of Ministers that the letter aims to convince. You were in favour of a risk-based approach, right? Then you should also agree to our demands. The former European Data Protection Supervisor Peter Hustinx once made positive statements about this approach, and these are quoted quite happily in a letter by the Industry Coalition for Data Protection (ICDP) to the Ministry of Justice:
“ICDP strongly agrees with the European Data Protection Supervisor Peter Hustinx that data protection legislation is most effective when it follows a risk-based approach.”


A risk-based approach can’t be an excuse to evade important obligations, as the committee of European privacy watchdogs has stated. A well-described liability based on agreed criteria can ensure that companies keep privacy protection in mind at an early stage of data processing or planning. Those criteria should obviously be proportionate: a sole trader who serves only fifty customers per year shouldn’t be required to send a privacy impact assessment to the data protection authority every week or to hire a data protection officer (not that anyone ever suggested that, it has to be said). But we should also be wary of abuse. For example, Digital Europe, a lobby organisation for digital businesses, wants to make sure that companies can decide for themselves what constitutes risk. That would make evading supervision very easy.

Privacy schools of thought

Connecting your viewpoints to clear schools of thought can help your cause. That’s why more schools of thought than just the “risk-based approach” are mentioned in the lobby documents. Vodafone wants a more “principle-based” approach, which means it wants more flexibility. Yet other companies mention the “harm-based approach”, the “use-based approach”, the “precautionary-based approach” and others.

Whatever school of thought one prefers, no one can currently predict the risks well, particularly in a world of “big data”. What we do know is that more data will be collected and will be increasingly used. This makes every choice we make now only more important for privacy protection in the future.

To be continued

Want to continue reading about this? On the Bits of Freedom website, you can find all the lobby documents and the analysis. The next part will be about the anti-fraud argument.

Lobby-tomy series (only in Dutch)

(Contribution by Floris Kreiken, EDRi member Bits of Freedom, The Netherlands)



30 May 2016

EU Commission under investigation for EU Internet Forum documents

By Kirsten Fiedler

In the past year, EDRi made numerous formal requests to get more information about the EU Internet Forum. This Forum was set up by the EU Commission to persuade companies to do “more” to fight terrorism. After months of obstruction from the European Commission, EDRi made a maladministration complaint to the European Ombudsman. As a result, a formal inquiry has been launched.

Privatised censorship

The problem: The action points agreed with online companies in secret meetings of the Forum may have a direct negative impact on our freedom of expression. Why? Because one of the topics that is being discussed is the censoring of online content by private companies – without any judicial process.

Many case studies highlighted by onlinecensorship.org have shown that private companies regularly violate fundamental rights in the online space, flouting the principle that restrictions on civil and human rights must be based on law. This practice is now being encouraged and pushed by the EU Commission.

Additionally, the EU Commission repeatedly denied us access to the documents that are being discussed by the Forum. The reason for our requests is simple: the EU Commission has a very bad record of keeping such projects in line with fundamental rights.

Exclusion of civil society

Moreover, despite the fact that the Commission announced in its “Communication on the European Security Agenda” the need for an inclusion of civil society in such projects, no civil society organisation has been allowed to participate in the Forum’s meetings on terrorism.

We have raised this criticism on multiple occasions – in meetings with the department for Migration and Home Affairs (DG HOME) and in our position paper (pdf). This exclusion fails to respect the institutions’ responsibility to give citizens the opportunity to “publicly exchange their views in all areas of Union action” (Art. 11 of the Treaty on European Union).

EDRi’s complaint and Ombuds(wo)man investigation

The maladministration investigation has been launched following a complaint submitted by EDRi to Emily O’Reilly, the EU Ombudsman, on 17 February (Letter by the Ombudsman, pdf). The complaint points out that:

  1. The Commission systematically failed to respect the legal deadlines to respond to our requests.
  2. The Commission’s decision to merge (a process called “joining” in EU jargon) two of our access requests (GestDem 2015/6363 and GestDem 2016/0095) lacks any legal basis. By default, the Commission should make non-confidential documents directly available.
  3. The Commission wrongly refused full access to the note of 10 June 2015 (pdf) and to the concept note (pdf).

The Ombudsman responded that she has decided to open an inquiry into the third and last claim. The letter states that she will be

carrying out an inspection of the relevant documents. I have therefore asked the Commission to facilitate, in accordance with Article 3(2) of the Statute of the European Ombudsman, my inspection of the Commission’s note of 10 June 2015 and the related concept note (to which only partial access was granted in the context of access request GestDem 2015/3658).

As regards the first claim, the letter states that the Ombudsman is not opening an inquiry, as this widespread practice is already the object of an own-initiative inquiry. Regarding the second claim, the Ombudsman suggested we raise our request with the Commission again.

[Update 30 June 2016] We have received a letter by the EU Ombuds(wo)man informing us that she has carried out an inspection (pdf). The inspection report (pdf) finds that

The Commission transmitted copies of the unredacted documents to be inspected to the Ombudsman’s inquiry team by electronic means. It further classified them as confidential.

The Ombudsman invited the Commission to submit an opinion on the complaint by 30 September 2016.

We will continue to report on the EU Internet Forum and the inquiry on our website.


26 May 2016

European Parliament confirms that “Privacy Shield” is inadequate


The European Parliament has adopted a Resolution on the “Privacy Shield”. This is the new agreement to permit data to be transferred from the EU to the USA. The previous agreement – “Safe Harbour” – was overturned by the European Court of Justice in October 2015.

The Parliament’s resolution confirms that the new agreement has no chance of being upheld, if challenged at the Court of Justice of the European Union,

said Joe McNamee, Executive Director of European Digital Rights.

It questions the legal meaning of the assurances received from the USA. It points out that indiscriminate (“bulk”) surveillance is still possible and that the new Ombudsman role is inadequate.

The Parliament adopted a similar resolution in 2000, when the illegal Safe Harbour agreement was adopted, but its recommendations were ignored for 15 years. The Parliament failed to demand meaningful improvements before adoption.

Incomprehensibly, the Parliament voted against a sunset clause that could have been a means to inspire a meaningful renegotiation. This means that the USA will have no incentive to make any concessions, that the fundamental rights of European citizens will be undermined until Privacy Shield is overturned, and that European businesses can have no legal certainty if they rely on this agreement, which is broken by design.

Under EU data protection rules, personal data can only be transferred outside the EU under certain circumstances. The EU negotiated “Safe Harbour” as a special arrangement for the USA in 2000. After the Snowden revelations in 2013, the European Commission recognised that the arrangement was inadequate. It spent two years trying and failing to bring the deal into line with the EU’s legal framework. Then, in October 2015, the framework was overturned. The “Privacy Shield” is meant to replace the “Safe Harbour”.


Read more:

Transatlantic coalition of civil society groups: Privacy Shield is not enough – renegotiation is needed (16.03.2016)

What’s behind the shield? Unspinning the “privacy shield” spin (02.03.2016)

Opinion 01/2016 on the EU–U.S. Privacy Shield draft adequacy decision (13.04.2016)

Fifteen years late, Safe Harbor hits the rocks (06.10.2015)


25 May 2016

EU Council & Commission plan to give law enforcement authorities access to data of foreign IT companies


EU Commissioner Věra Jourová revealed plans to increase the competences of criminal law enforcement authorities in a speech at the European Criminal Law Academic Network. She announced that the Council of the European Union is currently drafting Conclusions. This draft document calls for law enforcement agencies to have direct cross-border access to personal data held by foreign service providers, without a mutual legal assistance procedure. This is more than worrisome for the privacy of European citizens.

Initially, Jourová pointed out that it is important to “accelerate and streamline” mutual legal assistance (MLA) requests between national authorities, in order to collect digital evidence more easily. However, “where mutual legal assistance is not suitable or available” (what this means is not explained; the Data Retention Directive was proposed in part because MLA treaties are time-consuming, yet in the last eight years their reform was not seen as a priority), it is considered important that certain types of data, including personal data, can be requested from IT companies directly.

Non-transparent Council discussions

Jourová says that the new data protection rules allow for smoother cooperation and exchange of information between police and justice authorities, as they provide common standards of data protection. Those rules “will ensure that personal data, for instance of victims or witnesses of crime, is properly protected”. The Council draft goes by the name “Conclusions on improving criminal justice in cyberspace”; however, no document number has been assigned yet.

Commissioner Jourová did not release further details about the draft. However, EDRi received a copy of a document by the German government (only in German, pdf) which provides more information regarding the content and goals of the draft Conclusions. In order to improve “criminal justice in cyberspace”, the draft wants to introduce measures that help secure electronic evidence which would otherwise have to be deleted. This is described as especially problematic in cross-border cases (echoing the analysis from ten years ago in relation to the Data Retention Directive). The aim is to set up rules that allow short-term access to certain categories of data, in particular personal data. One way to achieve that is to improve the direct cooperation of law enforcement agencies with foreign service providers.

The Council Conclusions were also the subject of controversial discussions in the Committee on Internal Security (COSI), with Member States expressing their concerns over the proposal. It was considered especially problematic that a mere “business link” between a provider and a state would constitute a sufficient legal basis for data requests by foreign law enforcement authorities. According to the proposal, not even a business establishment in the Member State concerned would be required to enable direct information claims. Some EU countries pointed out that their sovereignty must not be undermined by such measures. France reportedly expressed concern that it is not legal under its national law for providers to hand over data to foreign law enforcement agencies.

In the end, there was wide-ranging agreement that future discussions must include other stakeholders, such as law enforcement authorities and IT businesses.

The Conclusions will be presented at the meeting of the Justice and Home Affairs Council on 9 June. The European Commission was asked to present analysis and, where appropriate, proposals before summer 2017.

Support our work: make a recurrent donation! https://edri.org/supporters/

Mutual Legal Assistance Agreement between the US and the EU

In parallel to the draft Conclusions, the Council is also working on an Agreement with the United States regarding mutual legal assistance (pdf). This Agreement aims to modify and refine the old MLA Agreement, which entered into force in 2010, and to ensure effective cooperation between participating Member States of the EU and the US in the field of criminal justice and combating organised crime and terrorism. At the moment, the Council is waiting for the Article 36 Committee (CATS) to approve the draft text, to allow the text to be formally agreed in June 2016.

The Agreement plans to make available “the ability to obtain information on bank accounts, form Joint Investigation Teams (JITS), transmit requests using faster means of communications, obtain witness evidence by video conferencing, and secure evidence for use by administrative bodies where subsequent criminal proceedings are envisaged.”

In contrast to the Conclusions discussed above, this US-EU Agreement does not plan to bypass the current MLA system. Although the Agreement mentions the possibility of “direct access of EU Member States to data held by Internet Service Providers”, it wants to ensure that this still requires so-called “probable cause”. To meet this criterion, the government must present specific, detailed and reliable facts to a court to demonstrate that a criminal offence has been committed. Without probable cause it would not be possible to request data.

(Article written by Claudius Determann, EDRi intern)


18 May 2016

Europol: Non-transparent cooperation with IT companies


Will the European Police Office’s (Europol’s) database soon include innocent people reported by Facebook or Twitter? The Europol Regulation, which was approved on 11 May 2016, not only provides a comprehensive new framework for the police agency, but also allows Europol to share data with private companies like Facebook and Twitter.

The history of Europol legislation

Europol supports Member States in more than 18 000 cross-border investigations a year. It started in 1993 as an intergovernmental Europol Drugs Unit (EDU) and became an international organisation with its own legal apparatus in 1995. A Council Decision from 2009 (371/JHA) established Europol as an EU agency. In March 2013, the European Commission proposed a reform of Europol via a draft regulation. After three years of work, the European Parliament approved the political compromise reached with the Council and the Commission on 11 May. The new Regulation will be applicable in May 2017.

What is Europol doing?

Europol serves as an information hub for law enforcement agencies of Member States and supports their actions in the fight against thirty types of “serious crimes”, including terrorism, organised crime, “immigrant smuggling”, “racism and xenophobia” and “computer crime”. Rather than being focused on specific areas of crime, Europol’s list of targets has now been extended even further. For example, “sexual abuse and sexual exploitation, including child abuse material”, as well as “genocide and war crimes”, are also included in the list.


In all these areas, Europol’s main task is to “collect, store, process, analyse and exchange information” that it gathers from Member States, EU bodies and “from publicly available sources, including the internet and public data”.

Unlike its American equivalent, the FBI, Europol has no executive powers. It can only notify Member States of possible criminal offences in their jurisdiction; it cannot start an investigation on its own or arrest anybody. Thus, after notification, it is up to the Member State whether to investigate or not. If a Member State decides to take action, Europol shall provide support, which can also include participation in joint investigation teams.

The Internet Referral Unit

One of the controversial parts of the new Regulation is related to Europol’s EU Internet Referral Unit (IRU). The IRU has been operative since July 2015, and is part of the newly-founded European Counter Terrorism Centre (ECTC) at Europol. Among other tasks, the IRU monitors the internet looking for content that is “incompatible” with the terms of service of “online service providers” like Facebook, so that they can “voluntarily consider” what to do with it. In other words, the IRU does not assess whether the content is illegal or not, and companies are not obliged to remove illegal content. They are “encouraged” to “voluntarily” do something with the referrals. The IRU has the power to create pressure to have apparently legal content deleted by companies, but bears no responsibility or accountability for doing so. How this complies with the EU Charter’s obligation that restrictions of fundamental rights (in this case, on freedom of communication) must be “provided for by law” (rather than, for example, this kind of ad hoc, informal arrangement) is not obvious. Equally non-obvious is compliance with the requirement that such restrictions are “necessary and genuinely meet objectives of general interest”. If it were necessary to delete such content, then such content would (obviously?) be illegal, rather than just a possible breach of terms of service.

A document from the European Commission from April 2016 shows that the IRU has so far assessed over 4700 posts across 45 platforms and sent over 3200 referrals for internet companies to remove content, with an effective removal rate of 91%. This means that companies are receiving pressure to remove content that a public authority has assessed, not on the basis of the law, but on the basis of a private contract between a company and an internet user.

The whole procedure is non-transparent and is not subject to judicial oversight. In this sense, the European Data Protection Supervisor (EDPS) had recommended that Europol make at least the list of its cooperation agreements with companies publicly available. However, this recommendation was not taken into account and did not make it into the final text of the new Regulation.

Europol may receive and transfer personal data

Europol can also “receive” personal data which is publicly available from private parties like Facebook and Twitter directly. Before, this was only possible via a national police unit, assuming compliance with national law. Now a company has to declare that it is legally allowed to transfer that data and that the transfer “concerns an individual and specific case”, while “no fundamental rights […] of the data subjects concerned override the public interest necessitating the transfer”. This has not been part of Europol’s rules before, and the Commission has to evaluate this practice after two years of implementation. At least until then, Europol can also feed this information into its databases.

While “Europol shall not contact private parties to retrieve personal data”, it will now be able to transfer information to private entities. The Regulation puts in place several measures to safeguard personal data protection. However, there are no transparency requirements to inform the public about any type of information exchange between Europol and companies. The Regulation only empowers a “Joint Parliamentary Scrutiny Group” to request documents. However, that remains rather unclear in the text, and will be governed by “working arrangements” between Europol and the Parliament. The accountability and respect for the rule of law in this arrangement are not clear.

Official Journal of the European Union: Text of the Europol Regulation adopted by the Council (11.03.2016) and the European Parliament (11.05.2016)

Police cooperation: MEPs approve new powers for Europol to fight terrorism (11.05.2016)

European Data Protection Supervisor, Executive Summary of the Opinion of the EDPS on the proposal for a Regulation on Europol (31.05.2013)

Communication from the Commission, delivering on the European Agenda on Security to fight against terrorism and pave the way towards an effective and genuine Security Union (20.04.2016)

(Contribution by Fabian Warislohner, EDRi intern)



18 May 2016

Looking back through the French anti-terror arsenal

By Guest author

Following the French Government’s publication of the Action Plan Against Terrorism and Radicalisation, which summarises the country’s entire anti-terror strategy as built up, law by law, over the past years, it is important to look back on the main measures presented in this plan, especially those affecting civil rights and liberties on the internet.

In the last two years, France adopted several laws aimed at reinforcing anti-terror measures. These include access to connection data without judicial intervention (Defence Law of 2013), administrative blocking of websites, stricter sentences in some cases if an offence is committed online, remote computer searches (anti-terror law of 2014), and mass surveillance (France’s surveillance law of 2015), extended to international communications as well (International Surveillance Law of 2015), plus a reinforcement of some of those measures after the declaration of the State of Emergency following the Paris attacks on 13 November 2015.

................................................................. Support our work - make a recurrent donation! https://edri.org/supporters/ .................................................................

New measures affecting our rights are planned, such as those included in the currently-debated “Digital Republic Bill”. The bill includes plans to involve the “actors of the Internet” (such as intermediaries and platforms) in the fight against online terrorist propaganda, including by promoting “counter-speech”.

In addition, the bill reforming the French criminal justice system extends the scope of the State of Emergency, broadly anchoring it in law. This bill criminalises the “frequent consultation of terrorist websites”. It also reinforces measures on the administrative blocking of websites, creating a new offence of obstructing the blocking of terrorist websites. The bill jeopardises the balance of power, mostly between the judiciary and the executive. Rather than drawing any constructive conclusions from the tragic events of November 2015, it is proof of a dramatic headlong rush by the French Government.

On 17 May, the right-wing Member of Parliament (MP) Éric Ciotti tabled an amendment to the new bill for a “21st Century Justice” that proposes linking video surveillance with facial recognition in public spaces.

The measures regarding civil rights and liberties on the Internet seriously infringe our online privacy and right to information, while undermining ex ante judicial oversight. Most of these measures are taken by administrative authorities and therefore do not require a judge to make a legal assessment of the actions to be taken. What is more, these laws undermine the role of the so-called “investigating magistrate”, the independent French judge who leads the investigation phase, in favour of the prosecutor of the French Republic and the judge of liberties and detention, who are less independent and have less technical knowledge of the cases.

At the same time, the question of encryption is being addressed with an aggressive stance that infringes citizens’ online privacy and security. It seems clearer every day that the aim of the French Government is to pursue an offensive political agenda on civil rights online.

French civil rights organisation La Quadrature du Net is working hard to raise awareness of the dangers of these laws, and will keep challenging them by any legal means available before the French courts (the Council of State and the Constitutional Council) as well as the European courts (the European Court of Justice and the European Court of Human Rights).

To challenge these laws, it is of utmost importance to understand their whole repressive architecture, which goes far beyond the “fight against terrorism”. France is currently pushing the same measures in the Directive on combating terrorism, which is being discussed by the European Parliament. There is an urgent need to stop these dangerous provisions before they become binding for all Member States, and not to give in to fear, but to protect our values, our rights and our liberties. The promotion of free software, end-to-end encryption of communications and decentralised solutions for citizens has never been as important as it is today, to counter mass surveillance by public and private actors.

Action Plan Against Terrorism and Radicalisation (only in French, 09.05.2016)

EDRi: EDRi’s recommendations for the European Parliament’s Draft Report on the Directive on Combating Terrorism (29.03.2016)

EDRi: Countering terrorism, a.k.a. the biggest human rights threat of 2016 (20.04.2016)

(Contribution by Christopher Talib, La Quadrature du Net, France)



18 May 2016

Danish ticketing system a threat to privacy

By Guest author

Like many countries, Denmark is replacing paper tickets for public transportation with electronic tickets. The Danish system, called Rejsekort (“travel card”), is a contactless chip card similar to the Oyster card in the United Kingdom and the OV-chipkaart in the Netherlands.

At the start of the journey, the passenger holds the card in front of a check-in card reader, and this procedure is repeated when changing to another transport vehicle (train, metro or bus). At the end of the journey, the passenger holds the card in front of a check-out card reader, and the fare for the completed journey is calculated and subtracted from the balance of the card. Check-in/out card readers are placed at all train and metro stations and in buses.
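The check-in/check-out mechanism described above can be sketched in a few lines. This is a minimal illustration only; the class and method names, and the simple zone-based fare formula, are hypothetical and do not reflect the actual Rejsekort back-end:

```python
# Minimal sketch of a check-in/check-out travel card (hypothetical names,
# not the real Rejsekort system). The fare is only known, and deducted
# from the balance, once the journey is completed at check-out.

class TravelCard:
    def __init__(self, balance: float):
        self.balance = balance
        self.checked_in_at = None  # zone where the current journey started

    def check_in(self, zone: int) -> None:
        """Register the start of a journey; repeated check-ins when
        changing vehicle do not start a new journey."""
        if self.checked_in_at is None:
            self.checked_in_at = zone

    def check_out(self, zone: int, fare_per_zone: float = 2.0) -> float:
        """End the journey, compute the fare and deduct it from the balance."""
        if self.checked_in_at is None:
            raise ValueError("check-out without a preceding check-in")
        zones_travelled = abs(zone - self.checked_in_at) + 1
        fare = zones_travelled * fare_per_zone
        self.balance -= fare
        self.checked_in_at = None
        return fare

card = TravelCard(balance=50.0)
card.check_in(zone=1)          # start of journey
card.check_in(zone=2)          # change to another vehicle, no extra charge
fare = card.check_out(zone=4)  # journey spans 4 zones
print(fare, card.balance)      # 8.0 42.0
```

The design point this illustrates is that the system cannot price a journey until check-out, which is why (as discussed below) incomplete or premature check-outs matter so much to the operator.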


For passengers, the chip card offers convenience. It can be used for public transport in most parts of Denmark, and passengers do not have to be familiar with the complicated fare structure. For example, in the Greater Copenhagen area, there are eight different price levels for a ticket depending on the number of zones in the journey and, in some cases, the number of zones can differ between the outbound and inbound journey.

The Rejsekort card exists in personalised and non-personalised versions, the latter being called Rejsekort Anonymous. The personalised card, which requires proof of identity similar to opening a bank account, offers a number of incentives to citizens: greater fare discounts, automatic transfer of money from a credit card to the Rejsekort, and the possibility of transferring the balance to a replacement card if the Rejsekort is lost or stolen. Despite its name, the non-personalised Rejsekort is not really anonymous since all chip cards have a unique number, and all journeys along with the unique card number are registered in the back-end systems of the Rejsekort company. Passengers can, of course, get a new non-personalised card regularly to protect their privacy, but the price of the card itself is about 10 euro, and the remaining balance on the old card is lost.
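Why the unique card number defeats the “anonymous” label can be shown with a toy model of the central journey log. The field names and card numbers here are invented for illustration; the point is simply that every record carries the same unique identifier, so a complete travel profile is one query away even without a name:

```python
# Toy model of a central journey log (hypothetical field names and data).
# Even for the "anonymous" card, every journey record carries the card's
# unique number, making all journeys with one card trivially linkable.

journey_log = [
    {"card_id": "RK-1029384756", "date": "2016-05-02", "from": "Copenhagen H", "to": "Lyngby"},
    {"card_id": "RK-5566778899", "date": "2016-05-02", "from": "Odense", "to": "Aarhus H"},
    {"card_id": "RK-1029384756", "date": "2016-05-03", "from": "Lyngby", "to": "Copenhagen H"},
]

def travel_profile(log, card_id):
    """Link all journeys made with one card; no name or ID number needed."""
    return [rec for rec in log if rec["card_id"] == card_id]

profile = travel_profile(journey_log, "RK-1029384756")
print(len(profile))  # 2 journeys linked to a single "anonymous" card
```

This is pseudonymity rather than anonymity: the identifier is stable across journeys, and a single external observation (say, CCTV at one check-in) suffices to attach a real identity to the whole profile.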

From a privacy perspective, the Danish Rejsekort is a disaster, because the unique card number is connected to all journeys. The journeys of all cardholders are registered in a central database, and this information is currently retained for five years, together with the citizen ID number (for the personalised card). Whereas mass public transport by train and bus previously offered a relatively high degree of anonymity (save for the ever more pervasive CCTV surveillance cameras), it has now become similar to air travel, where so-called Passenger Name Records (PNR) are created and stored for every journey. Unlike with air travel, however, an anonymous travel option still exists in the form of the more expensive paper tickets.

There has been some public debate and criticism of the data retention practices in the Rejsekort system. The response from the publicly-owned travel card company has been that since the Rejsekort is a payment card (albeit one whose applicability is limited to paying for public transport), Danish legislation on bookkeeping and measures against money laundering (based on EU law) makes it mandatory to keep information about every transaction, that is every journey, for five years. Furthermore, the travel patterns of every passenger are analysed for various fraud detection purposes. The Rejsekort is based on the Mifare Classic chip design, which has known security weaknesses. However, card hacking is not viewed as a problem by the Rejsekort company, because the company believes that any attempted fraud can be detected in the back-end systems. In a sense, surveillance of passengers’ travel transactions is used to compensate for the inadequate security of the chip card.

The fare structure for the Rejsekort gives passengers an incentive not to check out on long journeys, or to check out before their final destination, especially when travelling by bus, where the check-out card reader is placed inside the bus itself. According to the terms and conditions for the Rejsekort, a personalised card can be blocked after three journeys where the check-out is not done properly, and in that case the cardholder will be put on a blacklist so that she/he is unable to get a new personalised card for a year. The fraud detection system probably looks for uncompleted journeys and travel patterns that may otherwise indicate partial fare evasion, like premature check-out. The latter profiling involves cross-referencing with general customer information, which could include the address of the passenger, but the precise details of the profiling for fraud detection are not known.

Because of the public criticism, the Danish government asked the law firm Poul Schmith (Kammeradvokaten) to investigate the data processing practices of the Rejsekort company. The report from the law firm was published on 29 March 2016. In an earlier assessment of the Rejsekort system, the independent Danish Data Protection Agency had no remarks about the five-year retention period for all journeys, but the report from the law firm concludes that there is no legal requirement to keep information about every journey for five years. It is only necessary to keep the information until the customer can no longer dispute the transaction, that is, the payment for the journey. The law firm indicates that this period could be three years, as this is the statutory limitation period for simple financial claims in Denmark. A privacy-friendly argument for a period even shorter than three years could also be made, since a customer generally loses the right to dispute a charge through inactivity.

The official guidelines for the Danish bookkeeping administrative order contain an example with a telephone company, where it is stated that only documentation of invoiced/paid amounts must be stored for five years, not details of the individual calls. When the telephone calls can no longer be disputed by the customer, the aggregate invoice is sufficient bookkeeping documentation. Clearly, the same principle must apply to a ticketing system like the Rejsekort, but the Rejsekort company had apparently missed this detail in the official bookkeeping guidelines.

A second recommendation from the law firm Poul Schmith is that customers should give consent to the processing of personal data for fraud detection. Currently, no information at all is provided about this processing to the customers. This recommendation is a bit odd. The Rejsekort company argues that the processing can be done without consent because the legitimate interest exception applies to the fraud detection. Moreover, consent as a legal basis for processing hardly makes sense here since customers cannot really refuse (if they want a Rejsekort), and it seems rather unlikely that the Rejsekort company will provide sufficient information so that the consent actually becomes meaningful. Quite interestingly, there is a discussion in the report as to whether the consent to data processing for fraud detection will be coerced or not. The law firm argues that the consent is voluntary, but only because alternatives to the Rejsekort exist, especially single-journey paper tickets. These alternatives are however more expensive and more cumbersome to use.

The Rejsekort company has announced that it will follow the recommendations made by the law firm. This also applies to some of the minor points, such as reducing the number of employees with access to the central database of journeys, and ensuring written documentation of agreements with data processors.

What is rejsekort? (homepage of Rejsekort A/S)

Investigation of the processing of personal data in rejsekort by the law firm Poul Schmith (only in Danish, 29.03.2016)

(Contribution by Jesper Lund, EDRi member IT-pol, Denmark)



12 May 2016

European Digital Rights at re:publica 2016


Last week, the re:publica, “Europe’s most exciting conference on Internet and society”, took place in Berlin. EDRi’s members and observers were out in force and participated in the 10th anniversary of the re:publica. We’ve collected all talks by our network for you (in chronological order):

Fight for your digital rights (in German)
Link to re:publica website
Markus Beckedahl, Digitale Gesellschaft/netzpolitik.org

Inside Hacking Team (in English)
Link to re:publica website
Vladan Joler, Share Foundation

Lessons on fighting Internet shutdowns (in English)
Link to re:publica website
Estelle Massé, Access Now

#SaveTheInternet – a new hope for net neutrality in Europe (in English)
Link to re:publica website
Thomas Lohninger, AK Vorrat Österreich, and Barbara van Schewick, Professor of Law, Stanford Law School

How to defend civil liberties with lawsuits – and kick governments’ butts in the process (in English)
Link to re:publica website
Matthias Spielkamp, Reporters without borders Germany, and Ulf Buermeyer, Gesellschaft für Freiheitsrechte/netzpolitik.org

CTRL – living in an era of ubiquitous surveillance (in English)
Link to re:publica website
Kirsten Fiedler and Maryant Fernández, EDRi, and Estelle Massé, Access Now

Ad-wars – Ausflug in die Realität der Online-Werbung (in German)
Link to re:publica website
Frank Rieger, Chaos Computer Club (CCC), and Thorsten Schroeder

#freeLy: Bloggers and restrictions on the twin freedoms of media and religion in Vietnam (in German)
Link to re:publica website
Christian Mihr, Reporter ohne Grenzen

Online Freedom of Expression in the Arab World: Obstacles and Solutions (in English)
Link to re:publica website
Dalia Othman, Tactical Tech Collective/VecBox, and Jillian York, Electronic Frontier Foundation

How the EU works (in English)
Link to re:publica website
Kirsten Fiedler, EDRi, and Fukami

Who will be in a smart city? Upcoming challenges for privacy and open societies (in English)
Link to re:publica website
Julia Manske, stiftung neue verantwortung, and Eva Blum-Dumontet, Privacy International

Representation of Secret Services in PopCulture (in English)
Link to re:publica website
Theresia Reinhold, filmmaker & historian, and Laura Ede

Online platforms as human rights arbiters (in English)
Link to re:publica website
Rikke Frank Joergensen, Danish Institute for Human Rights

Hacking with Care!
Link to re:publica website
Emily King, Hacking With Care, and Jérémie Zimmermann, La Quadrature du Net

Netzpolitischer Abend des Digitale Gesellschaft e.V. (in German)
Link to re:publica website
Alexander Sander and Volker Tripp, Digitale Gesellschaft e.V.