26 May 2016

European Parliament confirms that “Privacy Shield” is inadequate


The European Parliament has adopted a Resolution on the “Privacy Shield”. This is the new agreement to permit data to be transferred from the EU to the USA. The previous agreement – “Safe Harbour” – was overturned by the European Court of Justice in October 2015.

The Parliament’s resolution confirms that the new agreement has no chance of being upheld, if challenged at the Court of Justice of the European Union,

said Joe McNamee, Executive Director of European Digital Rights.

It questions the legal meaning of the assurances received from the USA. It points out that indiscriminate (“bulk”) surveillance is still possible and the new Ombudsman role is inadequate.

The Parliament adopted a similar resolution in 2000, when the illegal Safe Harbour agreement was adopted, but its recommendations were ignored for 15 years. The Parliament failed to demand meaningful improvements before adoption.

Incomprehensibly, the Parliament voted against a sunset clause that could have been a means to inspire a meaningful renegotiation. As a result, the USA will have no incentive to make any concessions, the fundamental rights of European citizens will be undermined until the Privacy Shield is overturned, and European businesses can have no legal certainty if they rely on this agreement, which is broken by design.

Under EU data protection rules, personal data can only be transferred outside the EU under certain circumstances. The EU negotiated “Safe Harbour” as a special arrangement for the USA in 2000. After the Snowden revelations in 2013, the European Commission recognised that the arrangement was inadequate. It spent two years trying and failing to bring the deal into line with the EU’s legal framework. Then, in October 2015, the Court of Justice overturned the arrangement. The “Privacy Shield” is meant to replace the “Safe Harbour”.


Read more:

Transatlantic coalition of civil society groups: Privacy Shield is not enough – renegotiation is needed (16.03.2016)

What’s behind the shield? Unspinning the “privacy shield” spin (02.03.2016)

Opinion 01/2016 on the EU–U.S. Privacy Shield draft adequacy decision (13.04.2016)

Fifteen years late, Safe Harbor hits the rocks (06.10.2015)


25 May 2016

EU Council & Commission plan to give law enforcement authorities access to data of foreign IT companies


EU Commissioner Věra Jourová revealed plans to increase the competences of criminal law enforcement authorities in a speech at the European Criminal Law Academic Network. She announced that the Council of the European Union is currently drafting Conclusions. This draft document calls for law enforcement agencies to have direct cross-border access to personal data held by foreign service providers, without a mutual legal assistance procedure. This is more than worrisome for the privacy of European citizens.

Initially, Jourová pointed out that it is important to “accelerate and streamline” mutual legal assistance (MLA) requests between national authorities, in order to make it easier to collect digital evidence. However, “where mutual legal assistance is not suitable or available” (what this means is not explained; the Data Retention Directive was proposed in part because MLA procedures are time-consuming, yet in the last eight years their reform has not been treated as a priority), it is important that certain types of data, including personal data, can be requested from IT companies directly.

Opaque Council discussions

Jourová says that the new data protection rules allow for smoother cooperation and exchange of information between police and justice authorities, as they provide common standards of data protection. Those rules “will ensure that personal data, for instance of victims or witnesses of crime, is properly protected”. The Council draft goes by the name “Conclusions on improving criminal justice in cyberspace”; however, no document number has been assigned yet.

Commissioner Jourová did not release further details about the draft. However, EDRi received a copy of a document by the German government (only in German, pdf) which provides more information on the content and goals of the draft Conclusions. In order to improve “criminal justice in cyberspace”, the draft seeks to introduce measures to help secure electronic evidence that would otherwise have to be deleted. This is described as especially problematic in cross-border cases (echoing the analysis from ten years ago in relation to the Data Retention Directive). The aim is to set up rules that allow short-term access to certain categories of data, in particular personal data. One way to achieve that is to improve the direct cooperation of law enforcement agencies with foreign service providers.

The Council Conclusions were also the subject of controversy in the Committee on Internal Security (COSI), with Member States expressing their concerns over the proposal. It was considered especially problematic that a mere “business link” between a provider and a state would constitute a sufficient legal basis for data requests by foreign law enforcement authorities. According to the proposal, not even a business establishment in the Member State concerned would be required to enable direct information requests. Some EU countries pointed out that their sovereignty must not be undermined by such measures. France allegedly expressed concern that, under its national law, it is not legal for providers to hand over data to foreign law enforcement agencies.

In the end, there was wide-ranging agreement that future discussions must include other stakeholders, such as law enforcement authorities and IT businesses.

The Conclusions will be presented at the meeting of the Justice and Home Affairs Council on 9 June. The European Commission was asked to present its analysis and, where appropriate, proposals before the summer of 2017.

................................................................. Support our work - make a recurrent donation! https://edri.org/supporters/ .................................................................

Mutual Legal Assistance Agreement between the US and the EU

In parallel to the draft Conclusions, the Council is also working on an Agreement with the United States regarding mutual legal assistance (pdf). This Agreement aims to modify and refine the old MLA Agreement, which entered into force in 2010. It aims to ensure effective cooperation between the participating Member States of the EU and the US in the field of criminal justice and the fight against organised crime and terrorism. At the moment, the Council is waiting for the Article 36 Committee (CATS) to approve the draft text, to allow the text to be formally agreed in June 2016.

The Agreement plans to make available “the ability to obtain information on bank accounts, form Joint Investigation Teams (JITs), transmit requests using faster means of communications, obtain witness evidence by video conferencing, and secure evidence for use by administrative bodies where subsequent criminal proceedings are envisaged.”

In contrast to the Conclusions discussed above, this US-EU Agreement does not plan to bypass the current MLA system. Although the Agreement mentions the possibility of “direct access of EU Member States to data held by Internet Service Providers”, it seeks to ensure that this still requires so-called “probable cause”. To meet this criterion, the government must present specific, detailed and reliable facts to a court to demonstrate that a criminal offence has been committed. Without “probable cause” it would not be possible to request data.

(Article written by Claudius Determann, EDRi intern)


18 May 2016

Europol: Non-transparent cooperation with IT companies


Will the European Police Office’s (Europol’s) database soon include innocent people reported by Facebook or Twitter? The Europol Regulation, which was approved on 11 May 2016, not only provides a comprehensive new framework for the police agency, but also allows Europol to share data with private companies like Facebook and Twitter.

The history of Europol legislation

Europol supports Member States in more than 18 000 cross-border investigations a year. It started in 1993 as the intergovernmental Europol Drugs Unit (EDU) and became an international organisation with its own legal apparatus in 1995. A Council Decision from 2009 (2009/371/JHA) established Europol as an EU agency. In March 2013, the European Commission proposed a reform of Europol via a draft Regulation. After three years of work, the European Parliament approved the political compromise reached with the Council and the Commission on 11 May. The new Regulation will become applicable in May 2017.

What is Europol doing?

Europol serves as an information hub for the law enforcement agencies of Member States and supports their actions against thirty types of “serious crime”, including terrorism, organised crime, “immigrant smuggling”, “racism and xenophobia” and “computer crime”. Rather than being narrowed to specific areas of crime, Europol’s list of targets has now been expanded even further: for example, “sexual abuse and sexual exploitation, including child abuse material”, as well as “genocide and war crimes”, are also included in the list.


In all these areas, Europol’s main task is to “collect, store, process, analyse and exchange information” that it gathers from Member States, EU bodies and “from publicly available sources, including the internet and public data”.

Unlike its American equivalent, the FBI, Europol has no executive powers. It can only notify Member States of possible criminal offences in their jurisdiction; it cannot start an investigation on its own or arrest anybody. Thus, after notification, it is up to the Member State whether to investigate or not. If a Member State decides to take action, Europol shall provide support, which can also include participation in joint investigation teams.

The Internet Referral Unit

One of the controversial parts of the new Regulation relates to Europol’s EU Internet Referral Unit (IRU). The IRU has been operational since July 2015 and is part of the newly-founded European Counter Terrorism Centre (ECTC) at Europol. Among other tasks, the IRU monitors the internet looking for content that is “incompatible” with the terms of service of “online service providers” like Facebook, so that they can “voluntarily consider” what to do with it. In other words, the IRU does not assess whether the content is illegal or not, and companies are not obliged to remove illegal content. They are “encouraged” to “voluntarily” do something with the referrals. The IRU has the power to create pressure to have apparently legal content deleted by companies, but bears no responsibility or accountability for doing so. How this complies with the EU Charter’s obligation that restrictions of fundamental rights (on freedom of communication in this case) must be “provided for by law” (rather than, for example, this kind of ad hoc, informal arrangement) is not obvious. Equally non-obvious is compliance with the requirement that such restrictions be “necessary and genuinely meet objectives of general interest”. If it were necessary to delete such content, then such content would (obviously?) be illegal, rather than just a possible breach of terms of service.

A document from the European Commission from April 2016 shows that the IRU has so far assessed over 4700 posts across 45 platforms and sent over 3200 referrals for internet companies to remove content, with an effective removal rate of 91%. This means that companies are receiving pressure to remove content that a public authority has assessed, not on the basis of the law, but on the basis of a private contract between a company and an internet user.

The whole procedure is non-transparent and not subject to judicial oversight. For this reason, the European Data Protection Supervisor (EDPS) had recommended that Europol at least make the list of its cooperation agreements with companies publicly available. However, this recommendation was not taken into account and did not make it into the final text of the new Regulation.

Europol may receive and transfer personal data

Europol can also “receive” publicly available personal data directly from private parties like Facebook and Twitter. Previously, this was only possible via a national police unit, assuming compliance with national law. Now a company has to declare that it is legally allowed to transfer that data and that the transfer “concerns an individual and specific case”, while “no fundamental rights […] of the data subjects concerned override the public interest necessitating the transfer”. This has not been part of Europol’s rules before, and the Commission has to evaluate this practice after two years of implementation. At least until then, Europol can also feed this information from the internet into its databases.

While “Europol shall not contact private parties to retrieve personal data”, it will now be able to transfer information to private entities. The Regulation puts in place several measures to safeguard personal data protection. However, there are no transparency requirements to inform the public about any type of information exchange between Europol and companies. The Regulation only empowers a “Joint Parliamentary Scrutiny Group” to request documents; however, this remains rather unclear in the text and will be governed by “working arrangements” between Europol and the Parliament. How accountability and respect for the rule of law will be ensured in this arrangement is not clear.

Text of the new Europol Regulation adopted by the Council (11.03.2016) and the European Parliament (11.05.2016)

Police cooperation: MEPs approve new powers for Europol to fight terrorism (11.05.2016)

European Data Protection Supervisor, Executive Summary of the Opinion of the EDPS on the proposal for a Regulation on Europol (31.05.2013)

Communication from the Commission, delivering on the European Agenda on Security to fight against terrorism and pave the way towards an effective and genuine Security Union (20.04.2016)

(Contribution by Fabian Warislohner, EDRi intern)



18 May 2016

Looking back through the French anti-terror arsenal

By Guest author

Following the publication of the Action Plan Against Terrorism and Radicalisation by the French Government, which summarises the whole anti-terror strategy of France as built up, law by law, over the past years, it is important to look back at the main measures presented in this report, especially those affecting civil rights and liberties on the internet.

In the last two years, France adopted several laws aimed at reinforcing anti-terror measures. These include access to connection data without judicial intervention (Defence Law of 2013); administrative blocking of websites, stricter sentences in some cases if an offence is committed online, and remote computer searches (anti-terror law of 2014); mass surveillance (see the French surveillance law of 2015), including of international communications (International Surveillance Law of 2015); and a reinforcement of some of those measures after the declaration of the State of Emergency following the Paris attacks of 13 November 2015.


New measures affecting our rights are planned, such as those included in the currently-debated “Digital Republic Bill”. It includes plans to involve the “actors of the Internet” (such as intermediaries and platforms) in the fight against online terrorist propaganda and to promote “counter-speech”.

In addition, the bill reforming the French criminal justice system extends the scope of the State of Emergency, broadly anchoring it in law. This bill criminalises the “frequent consultation of terrorist websites”. It also reinforces measures on the administrative blocking of websites, inserting a new obstruction offence to block terrorist websites. The bill jeopardises the balance of power, mostly between the judiciary and the executive. Without drawing any constructive conclusions from the tragic events of November 2015, this bill is proof of a dramatic headlong rush by the French Government.

On 17 May, the right-wing Member of Parliament (MP) Éric Ciotti tabled an amendment to the new bill for a “21st Century Justice”, suggesting that video surveillance be linked with facial recognition in public spaces.

The measures regarding civil rights and liberties on the internet seriously infringe our online privacy and the right to information, while undermining ex ante judicial oversight. Most of these measures are taken by administrative authorities; therefore, they do not require a judge to conduct a legal assessment of the actions to be taken. What is more, these laws undermine the role of the so-called “investigating magistrate” (the independent French judge who leads the investigation phase) in favour of the prosecutor of the French Republic and the judge of liberties and detention, who are less independent and have less technical knowledge of the cases.

On the other hand, the question of encryption is being addressed with an aggressive stance that infringes citizens’ online privacy and security. It seems clearer every day that the aim of the French Government is to carry out an offensive political agenda regarding civil rights online.

French civil rights organisation La Quadrature du Net is working hard to raise awareness of the dangers of these laws, and will keep challenging them using any legal means available before the French courts (the Council of State and the Constitutional Council) and also the European courts (the European Court of Justice and the European Court of Human Rights).

It is of utmost importance to understand the whole repressive architecture of those laws, which go far beyond the “fight against terrorism”, in order to challenge them. France is currently pushing those same measures in the Directive on combating terrorism, currently being discussed by the European Parliament. There is an urgent need to stop these dangerous provisions before they become binding for all Member States, and not to give in to fear but to protect our values, our rights and our liberties. The promotion of free software, end-to-end encryption of communications and decentralised solutions for citizens has never been as important as it is today, to counter mass surveillance by public and private actors.

Action Plan Against Terrorism and Radicalisation (only in French, 09.05.2016)

EDRi: EDRi’s recommendations for the European Parliament’s Draft Report on the Directive on Combating Terrorism (29.03.2016)

EDRi: Countering terrorism, a.k.a. the biggest human rights threat of 2016 (20.04.2016)

(Contribution by Christopher Talib, La Quadrature du Net, France)



18 May 2016

Danish ticketing system a threat to privacy

By Guest author

Like many countries, Denmark is replacing paper tickets for public transportation with electronic tickets. The Danish system, called Rejsekort (“travel card”), is a contactless chip card similar to the Oyster card in the United Kingdom and the OV-chipkaart in the Netherlands.

At the start of the journey, the passenger holds the card in front of a check-in card reader, and this procedure is repeated when changing to another transport vehicle (train, metro or bus). At the end of the journey, the passenger holds the card in front of a check-out card reader, and the fare for the completed journey is calculated and subtracted from the balance of the card. Check-in/out card readers are placed at all train and metro stations and in buses.
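The check-in/check-out model described above can be sketched in a few lines of Python. Everything in this sketch is a simplified illustration under stated assumptions: the zone fares, the fare rule (the number of zones travelled determines the price) and the `TravelCard` class are hypothetical, not the actual Rejsekort tariff or implementation.

```python
# Simplified, hypothetical model of a check-in/check-out ticketing card.
# Fares below are illustrative only, not the real Rejsekort price levels.
ZONE_FARES = {1: 1.60, 2: 1.60, 3: 2.40, 4: 3.20}  # EUR by zones travelled

class TravelCard:
    def __init__(self, balance):
        self.balance = balance          # prepaid balance on the card
        self.checked_in_at = None       # zone where the journey started

    def check_in(self, zone):
        # Passenger holds the card in front of a check-in reader.
        self.checked_in_at = zone

    def check_out(self, zone):
        # Passenger checks out; the fare for the completed journey is
        # calculated and subtracted from the card balance.
        if self.checked_in_at is None:
            raise ValueError("check-out without check-in")
        zones_travelled = abs(zone - self.checked_in_at) + 1
        fare = ZONE_FARES.get(zones_travelled, ZONE_FARES[max(ZONE_FARES)])
        self.balance -= fare
        self.checked_in_at = None
        return fare
```

For example, a journey from zone 1 to zone 3 spans three zones, so the (hypothetical) three-zone fare is deducted at check-out.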


For passengers, the chip card offers convenience. It can be used for public transport in most parts of Denmark, and passengers do not have to be familiar with the complicated fare structure. For example, in the Greater Copenhagen area, there are eight different price levels for a ticket depending on the number of zones in the journey and, in some cases, the number of zones can differ between the outbound and inbound journey.

The Rejsekort card exists in personalised and non-personalised versions, the latter being called Rejsekort Anonymous. The personalised card, which requires proof of identity similar to opening a bank account, offers a number of incentives to citizens: greater fare discounts, automatic transfer of money from a credit card to the Rejsekort, and the possibility of transferring the balance to a replacement card if the Rejsekort is lost or stolen. Despite its name, the non-personalised Rejsekort is not really anonymous since all chip cards have a unique number, and all journeys along with the unique card number are registered in the back-end systems of the Rejsekort company. Passengers can, of course, get a new non-personalised card regularly to protect their privacy, but the price of the card itself is about 10 euro, and the remaining balance on the old card is lost.

From a privacy perspective, the Danish Rejsekort is a disaster, because the unique card number is connected to all journeys. The journeys of all card holders are registered in a central database, and this information is currently retained for five years, together with the citizen ID number (for the personalised card). Whereas mass public transport in trains and buses previously offered a relatively high degree of anonymity (save for the ever more pervasive CCTV surveillance cameras), it has now become similar to air travel, where so-called Passenger Name Records (PNR) are created and stored for every journey. Unlike with air travel, an anonymous travel option does still exist: the more expensive paper tickets.

There has been some public debate and criticism of the data retention practices in the Rejsekort system. The response from the publicly-owned travel card company has been that, since the Rejsekort is a payment card (albeit one that can only be used to pay for public transport), the Danish legislation on bookkeeping and measures against money laundering (based on EU law) makes it mandatory to keep information about every transaction, that is every journey, for five years. Furthermore, the travel patterns of every passenger are analysed for various fraud detection purposes. The Rejsekort is based on the Mifare Classic design, which has well-known security weaknesses. However, card hacking is not viewed as a problem by the Rejsekort company, because the company believes that any attempted fraud can be detected in the back-end systems. In some sense, surveillance of passengers’ travel transactions is used to compensate for the inadequate security of the chip card.

The fare structure for the Rejsekort gives passengers an incentive not to check out on long journeys, or to check out before their final destination, especially when travelling by bus, where the check-out card reader is placed inside the bus itself. According to the terms and conditions for the Rejsekort, a personalised card can be blocked after three journeys where the check-out is not done properly, and in that case the cardholder is put on a blacklist so that she/he is unable to get a new personalised card for a year. The fraud detection system probably looks for uncompleted journeys and travel patterns that may otherwise indicate partial fare evasion, such as premature check-out. The latter profiling involves cross-referencing with general customer information, which could include the address of the passenger, but the precise details of the profiling for fraud detection are not known.
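The simplest kind of back-end check described above can be sketched as follows. The three-strikes threshold comes from the terms and conditions quoted above; the journey-log layout and the function name are assumptions made for illustration, not the actual Rejsekort fraud detection system (whose precise details are not known).

```python
# Hypothetical sketch: flag cards whose journey log shows repeated
# journeys without a proper check-out. Data layout is an assumption.
from collections import Counter

MISSING_CHECKOUT_LIMIT = 3  # terms and conditions: blocked after three

def cards_to_block(journeys):
    """journeys: iterable of (card_id, checked_out) tuples from the log.

    Returns the set of card ids that reached the missing-check-out limit.
    """
    missing = Counter(card for card, checked_out in journeys if not checked_out)
    return {card for card, count in missing.items()
            if count >= MISSING_CHECKOUT_LIMIT}
```

Under this sketch, a card with three logged journeys lacking a check-out would be flagged, while a card with a single missed check-out would not.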

Because of the public criticism, the Danish government asked the law firm Poul Schmith (Kammeradvokaten) to investigate the data processing practices of the Rejsekort company. The report from the law firm was published on 29 March 2016. In an earlier assessment of the Rejsekort system, the independent Danish Data Protection Agency did not have any remarks about the five-year retention period for all journeys, but the report from the law firm concludes that there is no legal requirement to keep information about every journey for five years. It is only necessary to keep the information until the customer can no longer dispute the transaction, that is, the payment for the journey. The law firm indicates that this period could be three years, as this is the statutory limitation period for simple financial claims in Denmark. A privacy-friendly argument for a period shorter than three years could also be made here, since a customer generally loses the right to dispute through inactivity.

The official guidelines for the Danish bookkeeping administrative order contain an example with a telephone company, where it is stated that only documentation about invoiced/paid amounts must be stored for five years, not details of the individual calls. Once the telephone calls can no longer be disputed by the customer, the aggregate invoice is sufficient bookkeeping documentation. Clearly, the same principle must apply to a ticketing system like Rejsekort, but apparently the Rejsekort company had missed this detail in the official bookkeeping guidelines.

A second recommendation from the law firm Poul Schmith is that customers should give consent to the processing of personal data for fraud detection. Currently, no information at all is provided about this processing to the customers. This recommendation is a bit odd. The Rejsekort company argues that the processing can be done without consent because the legitimate interest exception applies to the fraud detection. Moreover, consent as a legal basis for processing hardly makes sense here since customers cannot really refuse (if they want a Rejsekort), and it seems rather unlikely that the Rejsekort company will provide sufficient information so that the consent actually becomes meaningful. Quite interestingly, there is a discussion in the report as to whether the consent to data processing for fraud detection will be coerced or not. The law firm argues that the consent is voluntary, but only because alternatives to the Rejsekort exist, especially single-journey paper tickets. These alternatives are however more expensive and more cumbersome to use.

The Rejsekort company has announced that it will follow the recommendations made by the law firm. This also applies to some of the minor points about reducing the number of employees with access to the central database with journeys, and ensuring written documentation for agreements with data processors.

What is rejsekort? (homepage of Rejsekort A/S)

Investigation of the processing of personal data in rejsekort by the law firm Poul Schmith (only in Danish, 29.03.2016)

(Contribution by Jesper Lund, EDRi member IT-pol, Denmark)



12 May 2016

European Digital Rights at re:publica 2016


Last week, the re:publica, “Europe’s most exciting conference on Internet and society”, took place in Berlin. EDRi’s members and observers were out in force and participated in the 10th anniversary of the re:publica. We’ve collected all talks by our network for you (in chronological order):

Fight for your digital rights (in German)
Link to re:publica website
Markus Beckedahl, Digitale Gesellschaft/netzpolitik.org

Inside Hacking Team (in English)
Link to re:publica website
Vladan Joler, Share Foundation

Lessons on fighting Internet shutdowns (in English)
Link to re:publica website
Estelle Massé, Access Now

#SaveTheInternet – a new hope for net neutrality in Europe (in English)
Link to re:publica website
Thomas Lohninger, AK Vorrat Österreich and Barbara van Schewick, Professor of Law Stanford Law School

How to defend civil liberties with lawsuits – and kick governments’ butts in the process (in English)
Link to re:publica website
Matthias Spielkamp, Reporters without borders Germany, and Ulf Buermeyer, Gesellschaft für Freiheitsrechte/netzpolitik.org

CTRL – living in an era of ubiquitous surveillance (in English)
Link to re:publica website
Kirsten Fiedler and Maryant Fernández, EDRi, and Estelle Masse, Access Now

Ad-wars – Ausflug in die Realitaet der Online-Werbung (in German)
Link to re:publica website
Frank Rieger, Chaos Computer Club (CCC), and Thorsten Schroeder

#freeLy: Bloggers and restrictions on the twin freedoms of media and religion in Vietnam (in German)
Link to re:publica website
Christian Mihr, Reporter ohne Grenzen

Online Freedom of Expression in the Arab World: Obstacles and Solutions (in English)
Link to re:publica website
Dalia Othman, Tactical Tech Collective/VecBox, and Jillian York, Electronic Frontier Foundation

How the EU works (in English)
Link to re:publica website
Kirsten Fiedler, EDRi, and Fukami

Who will be in a smart city? Upcoming challenges for privacy and open societies (in English)
Link to re:publica website
Julia Manske, stiftung neue verantwortung, and Eva Blum-Dumontet, Privacy International

Representation of Secret Services in PopCulture (in English)
Link to re:publica website
Theresia Reinhold, filmmaker & historian, and Laura Ede

Online platforms as human rights arbiters (in English)
Link to re:publica website
Rikke Frank Joergensen, Danish Institute for Human Rights

Hacking with Care!
Link to re:publica website
Emily King, Hacking With Care, and Jérémie Zimmermann, La Quadrature du Net

Netzpolitischer Abend des Digitale Gesellschaft e.V. (in German)
Link to re:publica website
Alexander Sander and Volker Tripp, Digitale Gesellschaft e.V.



20 Apr 2016

New data protection law in Turkey

By Guest author

The Turkish Parliament enacted the Data Protection Law on 24 March 2016, and it entered into force on 7 April. There had been several attempts to enact the law over the course of more than ten years, but all of the bills were later withdrawn by governments of the AKP (Justice and Development Party, the ruling party since 2002).


The AKP was quite motivated to enact the law because:

  1. Due to the nonexistence of a data protection law, Turkey was regarded as an “unsafe country” in terms of data protection, which caused certain difficulties for the collaboration of Turkish security agencies with their counterparts abroad.
  2. Many Turkish companies could not operate in Europe and elsewhere for the same reason.
  3. The government plans to establish a “World Finance Center” in Istanbul, but that would be impossible without a proper data protection law.
  4. Turkey is officially a candidate country for membership of the European Union, which requires member and candidate countries to adopt a data protection law.

The government’s hesitation was due to the concern that the previous versions of the law would not be in conformity with the contemporary approach and would thus be inadequate for achieving the objectives outlined above. In the earlier versions of the bill, the Data Protection Board was to be appointed by the government or the President, which cast doubt on its autonomy. In the last version, four members of the Board were to be appointed by the government and three by the President. This was changed at the last minute, and the enacted law envisages that two members are to be appointed by the President, two by the government, and five to be elected by the Parliament.

Another hesitation regarding the previous versions stemmed from the fact that several security agencies, such as the police and the secret service, were exempted from the rules on collecting and processing personal data. Although the new law does not mention these organisations explicitly, it contains several exemptions that will pave the way for them to collect and process data legally.

CHP – the People’s Republican Party, the largest opposition party – declared that it would apply to the Constitutional Court for the annulment of the law.

Turkey’s data protection draft law open to abuse: Expert

Turkey passes long-awaited data protection law (07.04.2016)

Turkey’s New Data Protection Law (15.04.2016)

Turkey Completes Final Step in Approving Data Protection Legislation (07.04.2016)



20 Apr 2016

The lobby-tomy 5: legal help or political choices?

By Guest author

Is legal help always objective? Writing laws is a complicated process. A frequently used lobby strategy involves offering “legal help” and arguments that promise legal certainty. Parties claim to make no substantive choices for policy makers, but is that really the case?


The new European data protection regulation is the most lobbied piece of legislation thus far, because the subject is very important and touches upon almost every aspect of our daily lives. Bits of Freedom therefore used the Dutch freedom of information act to ask the government to publish all the lobby documents it received on this new law. We published these documents on the Bits of Freedom website, with our analysis, in a series of blog posts. Which parties lobby? What do they want? What does that mean for you? We have now translated these nine blog posts into English for the EDRi-gram. This is part 5.

Drafting legislation is a complicated process, in particular where it concerns laws of this magnitude. An additional issue is that the subject matter is often technical in nature. This means that policy makers actively seek the help of experts. It also means that any offered help is very welcome.

Technical amendments

Parties are happy to offer that help. The Dutch employers’ federation VNO-NCW offered the Dutch Permanent Representation (perm rep) its expertise in a 76-page letter. The letter contains “technical amendments” – in other words, matters that according to them are not political. These concern the correct legal articulation of an article, but also other choices: how access requests should be answered (“that they should be answered is without question”).

The letter contains a lot of legal fine-tuning. For example, the employers’ organisation corrects the point regarding the obligation to provide information to people, explaining that this should happen in “a notice” and not in “a policy”, as written in the regulation. That is a justifiable correction: after all, you do not send people policies, but a notification that contains the policy.

However, some of the edits appear to go a step further than mere legal fine-tuning. In one article, for example, they edit the text to say that an organisation may process information for a legitimate interest or for “that of a third party”. That makes the article much broader in scope. Although they state that this would be a return to the previous privacy directive, these are controversial choices. They also write that organisations should be free to decide for themselves how they answer information requests (electronically or not?), but that too exceeds mere legal fine-tuning. In yet other articles they talk about diminishing the “burdens” on companies – which frames the issue in very negative terms. Even though reducing burdens can sometimes be a good thing, it is not a neutral framing.


TechAmerica Europe (an organisation which acted on behalf of tech companies with American roots) also offered some clarifications in an email to the perm rep and the ministry of justice. They mention a misunderstanding about profiling, arguing that the intention behind the article has not been expressed properly. The text at the time said that people only have to be informed about profiling if it has a “significant effect” on them, and that only then should they be offered an opt-out. Because of this relatively low threshold, the protection the article grants applies in a wide range of cases. TechAmerica Europe, however, wants to change the wording “significant effect” into “severely affects”. This would mean you would only have to offer an opt-out from profiling if it has really severe consequences, making the protection offered by this article much more difficult to invoke. About the original text they say:
“We reject this idea, and believe that the intention of the Article is to focus on clearly unfair or discriminatory practices such as the denial of insurance cover.”

Oh really? Many different organisations, including us, would disagree with that. To us, this article is about allowing people to know that personality profiles are being developed about them and allowing them protection from this. Furthermore, it would be difficult to prove “severe consequences” in this context, which would drastically limit the protection the article offers.

Legal certainty

Closely tied to this legal help is the concept of legal certainty. It means you should be able to rely on a clear interpretation of the law, instead of encountering surprising interpretations that could cost you your freedom, your money, or something else. In other words: legal certainty is important for businesses and citizens alike.

This legal certainty is not always present in the regulation. The law aimed to harmonise all privacy legislation in Europe. However, the current text has many exceptions allowing the member states of the European Union to regulate areas themselves, or allowing the Commission to adopt further clarifying measures (called delegated and implementing acts).

IBM rightly addresses this in a letter to the ministry of economic affairs:
“The final text must, then, provide for a high degree of legal certainty and predictability. With its [49] delegated and implementing acts, the draft does anything but.”

But IBM extends this argument about legal certainty to the obligations placed on businesses:
“Newly proposed obligations are too vague or too complex to be properly understood – or complied with. New constraints on implementation would remove the flexibility European businesses need to innovate and thrive. Nor are IBM’s concerns limited to the information technology sector in which we participate.”
They draw a connection between legal certainty and obligations: IBM wants more flexibility. But that would make things less predictable for people. How would people be able to tell which obligations apply to companies, and whether companies comply with them?

In the end, the final text of the General Data Protection Regulation contains a large number of clauses allowing national exceptions, undermining the harmonisation the regulation set out to achieve.
It shows that although offering legal help can be necessary, it can also be abused.

To be continued.

Want to continue reading about this? On the Bits of Freedom website, you can find all the lobby documents and the analysis. The next part is about the “not in my backyard” argument.

For the series of blogs and documents, see the Bits of Freedom website

Email from VNO-NCW and MKB Nederland to the Dutch perm rep (06.03.2013)

Email by TechAmerica Europe to Dutch perm rep (15.01.2014)

Letter by TechAmerica Europe to ministry of justice (14.01.2014)

Letter by IBM to ministry of economic affairs

(Contribution by Floris Kreiken, Bits of Freedom)



20 Apr 2016

DFRI thrown out of conference on surveillance cameras

By Guest author

Every year about 200 representatives from the Swedish security industry meet to discuss security cameras. This year’s conference was particularly interesting. The Swedish government has appointed a commission to investigate possible changes in existing laws to make it easier to get permission to use surveillance cameras in public spaces, schools and workplaces.


These cameras are usually called surveillance or security cameras. However, the security industry does not like the connotations of these words. Instead, it has begun to speak of “trygghetskameror”, which is difficult to translate accurately but might be rendered as “comforting cameras”. The organisers want to give the impression that surveillance cameras make us safer.

If you believe that, shouldn’t you welcome the added security of a few extra cameras filming the participants at the conference? DFRI was therefore there with its own cameras to help out. To our (not so great) surprise, the participants did not feel at all comforted by being filmed. These camera friends, as they call themselves, did not want to be the target of surveillance any more than anybody else. We were asked to leave the site, so we continued our filming outside as the participants arrived at the conference.

Various studies have been conducted on the effectiveness of surveillance cameras in Stockholm, Oslo, London and many other cities. The results are consistent and show that the cameras have no more than a marginal effect. We have recently seen several examples of terrorists who were not deterred by security cameras. The same applies to violent offenders: almost all violent crimes in public places are committed by drunk or emotional people, who do not give surveillance a thought when emotions take over. Limiting alcohol use has proven far more effective at curbing violence than mass surveillance of all honest citizens peacefully moving around the city.

Protests against increased surveillance (in Swedish) (14.04.2016)

DFRI thrown out of conference on “Comforting Cameras” (in Swedish) (13.03.2016)

Surveillance cameras provide no security (in Swedish) (11.04.2016)

(Contribution by Peter Michanek, DFRI)



08 Apr 2016

Special report: Poland’s secret services are still using and abusing telecom and Internet data


With almost two million requests for telecommunications data and more than two thousand requests for Internet data concerning Polish citizens in 2015, it is clear that access to metadata by the country’s secret services is still out of control.

Compared to 2014, the Polish Panoptykon Foundation found that the number of requests for telecommunications data slightly decreased, while the number of requests for Internet data quadrupled. Secret services are most interested in billing and user information. In 2014, the police asked for telecommunications data less frequently than in previous years, but they were much more interested in Internet data than before. As to why, we can only guess. Polish law does not impose control over secret service activities. However, research has confirmed that data is requested not only for prosecuting serious crime or preventing terrorism, but also for minor crimes. The secret services are not obliged to inform the public in what types of cases and for what reasons – statistically speaking – they gain access to such data. As a result, we also do not know why the numbers are sometimes higher and sometimes lower.

Despite the ruling of the Court of Justice of the European Union (CJEU) in April 2014 and the ruling of the Polish Constitutional Tribunal, regulations still give secret services open access to telecommunications and Internet data, without an adequate control mechanism with judicial oversight. Instead of limiting the secret services’ powers to access data, the new Surveillance Act (which entered into force in February 2016) made the procedure for accessing Internet data even easier. Until now, requesting Internet data required a written request to the Internet operator. The company could reject it – which was often the case – and in any event retained control over what data the police obtained. When the new Police Act is fully operational, no active cooperation from the ISP will be necessary: once the planned, so-called “fast and secure connections” between ISPs and police or secret services are established, even bulk transfers of data will be possible without sending a single request.

According to the new law, secret services are required to submit a report to the court every six months – which means ex post control instead of the prior judicial consent envisaged by the CJEU. The report will not include queries for users’ data, which account for approximately 40% of all queries. This solution provides neither transparency nor adequate control over the secret services’ activities. Citizens will have to rely on the secret service chiefs’ promise that innocent people have nothing to worry about – which is far from the standards set by the CJEU and the Polish Constitution.




(Contribution by Anna Obem, Panoptykon)