25 May 2016

European fundamental rights to be regulated by companies


Today, on 25 May, the European Commission published two new proposals under its Digital Single Market strategy: an update of the Directive on audiovisual media services (AVMS) and a Communication on online platforms, together with the evidence document for the platforms Communication.

The Communication on Platforms worries us the most. For instance, the proposals with regard to the regulation of “illegal or harmful” content are hugely disturbing. The Commission seems willing to completely give up on the notion of law. Instead, regulation of free speech is pushed into the hands of private corporations.

Demanding that multiple companies in multiple jurisdictions arbitrarily implement whatever national law or other preferences they choose is a sure-fire way of building new barriers in the “Digital Single Market”,

said Joe McNamee, Executive Director of European Digital Rights (EDRi).

The Communication repeats “voluntary measures” almost like an ideological mantra – whatever the question is, the answer is always “platforms can fix it”. What about public authorities’ responsibilities to enforce the law?

The Commission refers to the EU Internet Forum as an “important example”. Indeed it is, although not in the way the Commission meant. The EU Internet Forum is a highly untransparent, Commission-driven project, undertaken in cooperation with exclusively US online companies (and not even all of the relevant ones!) to tackle terrorist content and hate speech online. The Commission describes it as a “multi-stakeholder” model, even though only the Commission and three US companies were involved in drafting the outcome.

The Communication also talks about limitations of liability for platforms when they take “good faith” law enforcement measures. Sound familiar? The ill-fated Stop Online Piracy Act (SOPA), abandoned after huge protests in the USA, proposed exactly the same thing. The good news is that, for now, no new liability measures are being proposed to coerce platforms into taking these measures. However, there are threats of “formal notice and action” procedures that make it clear that law will be used if the companies do not police European citizens extensively enough.


The “evidence” offered in today’s Communication is either misrepresented or moulded to suit pre-existing policies. The Commission appears eager to ensure that the online monopolies monitor online activity, take action to remove any content that creates legal risks for them, and arbitrarily police content to “protect” unspecified and undefined “societal values”. Not a word is wasted on review processes, effectiveness, proportionality, or possible counter-productive or anti-competitive effects.


Background information:

EDRi response to AVMS Consultation (30.09.2015)

EDRi response to Platforms Consultation (06.01.2016)

The EU Platforms Consultation – Just How Biased is it (14.12.2014)

Leaked EU Communication – Part 1: Privatised censorship and surveillance (26.04.2016)

Leaked EU Communication – Part 2: Protecting Google at all costs (28.04.2016)

EU Internet Forum against terrorist content and hate speech online: Document pool (10.03.2016)

Our overview of the Digital Single Market Communication (17.06.2015)


23 May 2016

Copyfails: Time to #fixcopyright!

By Diego Naranjo

We believe that new technologies bring new ways to access culture – they are not a threat to creators. We believe that the legitimacy crisis of the current EU copyright regime is created by the system itself. We believe there is a need for a modernised copyright regime which takes into consideration the needs of all parts of society, including creators.

Europe needs a more profound reform of the EU copyright regime than the one the European Commission has announced. To illustrate this, we have identified nine copyfails – crucial failures of the current EU system. You can read the first blogpost of our “copyfails” series, presenting copyfail #1, here.

The European Commission has put copyright reform on its agenda as one of the foundations of the Digital Single Market. However, the Communication published at the end of 2015 did not meet the expectations raised by the announced “more modern, more European” copyright. On the contrary, the Commission apparently only wants to paper over the serious cracks in the wobbling structure of EU copyright legislation rather than address the real problems.


Are you ready to #fixcopyright in the EU? Follow #fixcopyright on Twitter!


Read more:

Copyright reform: Restoring the facade of a decrepit building (16.12.2015)



19 May 2016

Copyfail #1: Chaotic system of freedoms to use copyrighted works in the EU

By Diego Naranjo

This article is the first one in the series presenting Copyfails. The EU is reforming its copyright rules. We want to introduce to you the main failures of the current copyright system, with suggestions on how to fix them. You can find the nine key failures here.


How has it failed?

The current EU Copyright Directive outlines 21 different optional freedoms to use copyrighted works. These freedoms, called “exceptions and limitations”, specify how strict copyright rules can be avoided in certain useful circumstances, as long as this does not interfere with the exploitation of the work by the creator. This would include, for example, using copyrighted material for educational purposes, adapting it for people with disabilities, making copies of music or films for personal use, or using it for academic quotations.

Each EU country putting the Directive into practice can choose to include or exclude any of these optional exceptions. As a result, there are literally over two million ways to implement the Directive! In a borderless, open internet, it is absurd that a simpler way to implement flexibilities that do not interfere with the normal exploitation of copyrighted material has not been adopted.
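The “over two million” figure follows directly from the combinatorics: with 21 optional exceptions that each Member State can independently adopt or reject, every subset of the list is a possible national implementation. A minimal sketch of the arithmetic:

```python
# Each of the 21 optional "exceptions and limitations" in the Copyright
# Directive can be independently included or excluded by a Member State.
OPTIONAL_EXCEPTIONS = 21

# Every subset of the 21 exceptions is a possible national implementation,
# so the number of distinct configurations is 2 to the power of 21.
configurations = 2 ** OPTIONAL_EXCEPTIONS
print(configurations)  # 2097152 -- literally "over two million ways"
```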

However, copyright lobbyists are vehemently opposed to any flexibility. Indeed, in 2001, when the Directive was adopted, lobbyists argued that the one mandatory exception (for incidental copies in networks) was absolutely unworkable and would create “a gaping hole in rightsholders’ protection under the reproduction right”. Fifteen years later, it is very obvious that no such “gaping hole” was created. Now, they warn again against a more flexible regime. Now, as then, they are wrong.


Why is this important?

People across the EU should be able to enjoy the same rights. Harmonisation of copyright rules is needed to create a Digital Single Market – not the 28 EU markets we currently have.

The implications of copyfail #1 are huge, for example:

  • In the UK, people are not allowed to make copies of music that they legally buy.
  • In Austria and Lithuania, it is illegal to send quotations by e-mail.
  • In some countries, like France, the uses of copyrighted works in schools are considerably more restricted than in others, like Estonia. The latter allows teachers within an educational context to quote works to any justified extent, compile works of any nature, and translate and adapt entire works, while France doesn’t.

How to fix it?


Read more:

Copyright combinatronics (16.11.2011)

Copyright exceptions and limitations – back to the future (25.03.2015)

Copyright reform: Restoring the facade of a decrepit building (16.12.2015)



18 May 2016

ENDitorial: Next year, you’ll complain about the Terrorism Directive

By Joe McNamee

The European Union is currently in the process of adopting a Directive on terrorism. The Directive is expected to be finalised later this year and then each Member State government will decide what it means, and will adopt national laws to put it into practice.

The European Commission wrote the draft Directive in two weeks, after the Paris attacks in November 2015, basing it mainly on existing EU and international instruments. It was prepared without an impact assessment – the step, generally necessary for every piece of legislation, in which the various options and the available evidence are assessed. However, it appears this step is not needed when human lives and civil liberties are at stake. Why? Because it is important to give people a feeling (even if the feeling is misguided) that the EU is doing something to “protect” them.

This is just the beginning of the political process, so the press is not interested.

Within three months, the EU Member States had adopted their informal position, or as they call it, their “general approach”. They did this in the absence of an impact assessment that would have provided the analysis to indicate what may or may not be lawful, appropriate, useful, necessary or proportionate. They added new text on blocking of websites “inciting to commit” terrorist offences, without any indication of whether the content would be legal or illegal, or why they might think that this might be a good idea. The Member States also envisage “the taking and the fixing of audio recordings in private or public vehicles and places, and of visual images of persons in public vehicles and places…”.

This is just the beginning of the political process, so the press is not interested.

The European Parliament is now preparing its position. Unlike legislation on other subjects, where relevant specialised parliamentary committees give their opinions to the Committee in charge, in this case only one Committee is assessing the Directive, because speed seems to be more important than evidence or effectiveness or even usefulness. Various Parliamentarians in the “Civil Liberties” Committee have proposed adding random additional (often incongruous or irrelevant) measures to the Directive – on blocking of websites, on banning terrorist malware (sic), on criminalising online platforms for failing to remove content on “counterfeiting trademarks” (sic), and a whole host of other measures that, similarly, are neither relevant nor based on any evidence of usefulness, effectiveness, proportionality, legality… anything.

This is just the beginning of the political process, so the press is not interested.

Now, instead of voting on the 438 amendments tabled, a small group of just eight representatives of the political groups is holding closed-door meetings to create “compromise amendments” on the controversial points. Once the political groups go through this process, a vote can happen with everyone hiding in the crowd – citizens will never know who made what proposal or supported what decision. The ensuing vote in the Civil Liberties Committee will be a mere formality, because the other politicians tend to follow the eight who negotiated the text.

This is just the beginning of the political process, so the press is not interested.

Once the vote has happened in Committee, it does not need to be endorsed by the full Parliament. Instead, the eight Parliamentarians and Member States will start closed-door, secret meetings, together with the European Commission (known as “trilogues”) to reach a compromise between the position taken in the Committee’s so-called “orientation vote” and the Member States’ “general approach” (neither of which is binding on either institution). This process is secret, the negotiation drafts are secret and the procedure has no formal end-date.

This process is too opaque, so the press is not interested.

Once the trilogue process finishes, both the Member States and the Parliament are committed to rubber-stamping the agreement, so it is almost impossible to make any changes at this stage. Some policy-makers won’t agree on specific points, but they will nonetheless vote for the adoption of the text. If you ask them, they will tell you that it’s because of the “necessary” political trade-offs. If some politicians vote differently, they will be reprimanded by their political group.

Only at this stage is the press given press releases, a nice spin about “fighting terrorism”, and a clear end-date for the Parliament vote.

The press is interested, the process is over. It’s too late to change anything.

Next year, when your Member State starts blocking websites, without quite knowing why, when it starts imposing restrictions on Tor and proxy servers, without quite knowing why, when unaccountable, unclear legislation leads to arbitrary and discriminatory enforcement, and your government says that it is “EU law that it is obliged to implement” and you wonder why the press never reported on it, when you search in vain for who is accountable for a weak and dangerous text, come back and read this again.

In this process, EDRi uses all of its political resources to obtain documents, provide input to decision-makers, and use all possible opportunities to shine a light on closed processes and defend citizens’ rights from the dangers created by opaque and unaccountable decision-making. Thankfully, the fact that the process is rotten does not mean that the individuals involved are not open to positive and constructive input – many work hard to achieve the best possible outcome for Europe’s citizens. Even if we cannot always – or often – claim credit for the successes we have in positively influencing such processes, it is important to engage as effectively as possible, while demanding that trilogues are reformed or, ideally, abandoned.

Support our work with a one-off donation! https://edri.org/donate/

EDRi: EDRi’s recommendations for the European Parliament’s Draft Report on the Directive on Combating Terrorism (29.03.2016)

EDRi: Countering terrorism, a.k.a. the biggest human rights threat of 2016 (20.04.2016)

EDRi: Trilogues: the system that undermines EU democracy and transparency (20.04.2016)

Proposal for a Directive on combating terrorism (02.12.2015)

Council General Approach on the proposed Directive on combating terrorism (11.03.2016)

European Parliament draft report on Directive on combating terrorism (09.03.2016)

European Parliament amendments tabled to Directive on combating terrorism (nos 56 to 246) (08.04.2016)

European Parliament amendments tabled to Directive on combating terrorism (nos 247 to 438) (08.04.2016)

(Contribution by Joe McNamee, EDRi)



18 May 2016

Advocate General: Dynamic IP address can be personal data


On 12 May, Manuel Campos Sánchez-Bordona, Advocate General (AG) of the Court of Justice of the European Union (CJEU), gave his opinion in the case Patrick Breyer v Federal Republic of Germany, C-582/14.

Patrick Breyer sued the German government for violating his right to data protection by storing data about his visits to German government websites for longer than necessary. The government’s websites use so-called “logs” that keep a record of which dynamic IP address accessed the service. Breyer claims that the storage of this data constitutes processing of personal data, which is protected under the Data Protection Directive 95/46/EC. According to the Directive, such processing of personal data is generally unlawful unless it is justified, for example by previously given consent. Germany, however, stated that the logs of its websites are essential for their functioning, as they are important for preventing abuse and prosecuting network attacks. In the ongoing procedure, the German Federal Court of Justice (BGH) eventually referred two questions to the CJEU for a preliminary ruling.

Support our work – make a recurrent donation! https://edri.org/supporters/

The first question addressed the issue of whether a dynamic IP address, together with the time of access, constitutes personal data according to Article 2(a) of the Data Protection Directive. The question referred specifically to whether a person is already “identifiable” if the information necessary to identify them has to be provided by a third party, in this case a telecommunications company. The Advocate General followed Breyer’s legal submissions in his opinion, stating that the definition of personal data also encompasses dynamic IP addresses, as long as an access provider holds additional information enabling the identification of an individual.

The Advocate General’s position directly contradicts the position of the Irish High Court in EMI Records & Ors -v-Eircom Ltd from 2010. In that case, Mr Justice Charleton (now elevated to the Irish Supreme Court), having explained that the internet is not “an amorphous extraterrestrial body” (paragraph 5), ruled that IP address data collected for the explicit purpose of identifying individuals in the context of copyright enforcement is not personal data.

The Advocate General’s view is also in line with a 2007 opinion by the Article 29 Working Party (WP29), an independent body with advisory status regarding data protection, which was set up under the Data Protection Directive. Back then, the WP29 considered that IP addresses can be personal data, even though they do not generally identify an individual by name.
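The AG’s reasoning can be illustrated with a toy example (the IP address, timestamps and subscriber name below are all invented): the website operator’s log alone does not name the visitor, but joined with the access provider’s subscriber records it does, which is why the dynamic IP address counts as personal data even in the operator’s hands.

```python
from datetime import datetime

# A hypothetical government-website access log, as described in the case:
# the operator stores only the dynamic IP address and the time of access.
website_log = [
    {"ip": "203.0.113.7", "time": datetime(2016, 5, 12, 10, 15)},
]

# Separately, the access provider knows which subscriber held which dynamic
# IP address during which period (all details here are invented).
isp_records = [
    {"ip": "203.0.113.7",
     "start": datetime(2016, 5, 12, 9, 0),
     "end": datetime(2016, 5, 12, 23, 0),
     "subscriber": "Jane Doe"},
]

def identify(log_entry, records):
    """Join a log entry with the ISP's records to name the visitor."""
    for r in records:
        if r["ip"] == log_entry["ip"] and r["start"] <= log_entry["time"] <= r["end"]:
            return r["subscriber"]
    return None  # without the third party's data, the IP stays anonymous

print(identify(website_log[0], isp_records))  # Jane Doe
```

Only the combination of the two data sets identifies the visitor; the AG’s point is that the mere possibility of this join, via a third party, already makes the person “identifiable”.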

The second question assessed whether restrictions on data processing under the German Telemedia Act (§15 TMG), which lead to a prohibition of the processing, may violate Article 7(f) of the Data Protection Directive. That provision states that personal data may be processed if it is necessary for the purposes of the legitimate interests of the processor. The German Telemedia Act (TMG), however, allows the collection and use of users’ data only in limited circumstances, and not for the purpose of ensuring the general operability of the telemedia service. The AG argues that ensuring the functioning of “telemedia” (as defined in §1 TMG) constitutes a legitimate interest under the Directive, and should therefore be permitted when it prevails over the interests or the fundamental rights of the person concerned.

Patrick Breyer stated in reaction to the opinion that

nobody has a right to record everything we do and say online. Generation Internet has a right to access information online just as unmonitored and without inhibition as our parents read the paper, listened to the radio or browsed books.

The Court has not yet set a date for the final decision.

Request for a preliminary ruling of the German Bundesgerichtshof (17.12.2016) http://curia.europa.eu/juris/document/document.jsf?text=&docid=162555&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=801346

The 2007 opinion by the Article 29 Working Party (20.06.2007) http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2007/wp136_en.pdf

Opinion of the Advocate General (only in German, 12.05.2016) http://curia.europa.eu/juris/document/document.jsf?text=&docid=178241&pageIndex=0&doclang=DE&mode=req&dir=&occ=first&part=1&cid=810242

EMI Records & Ors -v-Eircom Ltd (16.04.2010)

(Contribution by Claudius Determann, EDRi intern)



18 May 2016

Hungary: New government proposals raise concerns

By Guest author

The Hungarian government is ramping up its anti-terrorism measures; a constitutional amendment that establishes a new state of exception is one of the measures it deems necessary to keep the population safe.

The threat of terrorism in Hungary is considered to be low by the UK Foreign Office, the CIA, and Hungary’s Strategic Defense Research Centre. Even the Prime Minister’s chief national security advisor says they have no knowledge of terrorist plots. Despite that, the country has maintained a medium-level terror alert since the Paris attacks in November 2015. Most observers agree that any necessary reforms could be addressed by modifying the measures in the current legal framework. However, in early June 2016, the Hungarian National Assembly will vote on an anti-terrorism bill that will require a sixth amendment to a constitution that is already a patchwork of amendments.


Encryption users put on their guard

Plans to ban encryption software were abandoned after being castigated in the media. However, under the conditions specified in changes to the Act on Electronic Commerce, Internet Service Providers (ISPs) that provide encryption services would be under an obligation to store the metadata of encryption users for a year (along with the contents of their communications in the case of weaker forms of encryption); and if they refuse to share this data with an intelligence agency, they could face a fine of up to 30 000 euro. Troublingly, without the inclusion of proper safeguards, encryption users could be more susceptible to privacy abuse. While the government portal declares that the stored data would only be accessible to the secret services after judicial approval, the Hungarian Civil Liberties Union (TASZ) asserts otherwise: authorisation for such requests under current legislation is conferred not by a court, but by a government ministry.

Maximum privacy for state enterprises

Tucked away in the recesses of the 2017 Central Budget Governance Act are also some provisions that would potentially shield state-owned companies from public scrutiny. According to the proposal up for debate next week, data relating to the assets, functioning and contracts of state companies involved in activities such as central data acquisition and telecommunications management could be exempted from Freedom of Information (FOI) laws for up to 30 years. The grounds for these protections are mainly economic; the rationale is that the availability of certain kinds of information could threaten the national economy by harming the interests of state companies. However, even the authority responsible for freedom of information, not famous for taking a strong stand against government excesses, questioned the exceptions last week, reminding the Parliament’s Budget Committee that a statutory and constitutional basis for restricting access in court on grounds of legitimate economic interests already exists.

“Terror alert”: A new state of exception

The constitutional grounds for invoking a state of emergency also already exist. Articles 48-54 of the constitution stipulate how the authorities may respond to armed attacks, industrial disasters and other kinds of imminent danger to national security. The absence of a clear justification for the amendment is one of the reasons why the establishment of a new form of state of emergency (literally a “status of terror alert”, terrorveszély-helyzet) is controversial. According to the draft, it would enable measures that would not be subject to ex post judicial control. What constitutes grounds for invoking this status is not entirely clear either, which is worrying, since it would empower the government, if it felt endangered, to summon the National Defence Forces or limit civil rights.

In early 2016, when the proposed text of the constitutional amendment was only accessible to members of parliament under a pledge of secrecy, the Eötvös Károly Institute commented on a leaked copy, arguing that the legal basis for the declaration of a state of “terror alert” lacked the precision that such extensive power would require. It called on the government to conduct an evidence-based assessment of the threats that the country faced and to examine whether its needs could be addressed through parliamentary acts instead. Unlike many of the other dramatic changes to Hungary’s legal framework that have occurred in recent years, multiparty support will now be needed for most of the key measures to become law. And the latest version of the bill has been significantly tempered with this in mind; it now includes measures that would enable the National Assembly to have a say in the matter.

The Hungarian Minister of Defence maintains that the primary aim of the reforms is to protect the population more effectively. Even if he’s right, his reasons for thinking that Hungary would never cross the line into a military dictatorship are not entirely convincing: “We live in an open world”; and with a “multiparty democracy, parliamentary control and the internet” that “could never happen”. In the absence of adequate safeguards designed to discourage a head of state persuaded that this, or any other form of overreach, was necessary, the amendments that are on the table would offer an accommodating legal environment. The fact that “nobody [in the government] would want this” shouldn’t be enough to reassure a population living in a constitutional democracy. That’s what constitutional guarantees are for.

The Hungarian Parliament is About to Enact New Anti-Terror Laws (05.05.2016)

Communication of the National Authority for Data Protection and Freedom of Information to the Parliamentary Budget Committee (only in Hungarian, 09.05.2016)

(Contribution by Christiana Maria Mauro, EDRi Observer)



18 May 2016

EC wants to add facial recognition to transnational databases


On 4 May 2016, the European Commission (EC) published a proposal to recast the EURODAC Regulation. The European Automated Fingerprint Identification System (EURODAC) was initially introduced in 2003 to establish an EU asylum fingerprint database, and to share this information with national law enforcement authorities and Europol.


According to this proposal, if a person applies for asylum anywhere within the EU, their fingerprints will be required and transmitted to the central EURODAC system. The purpose is ostensibly to help identify irregularly staying third-country nationals or stateless persons. The Regulation was amended in 2013, expanding its purpose from border control to general law enforcement, especially regarding terrorism and serious crime.

In addition to the fingerprints collected so far, the new recast proposed by the European Commission introduces a new feature: facial recognition. It also includes plans for longer storage periods, an expansion of data categories and comparison capabilities, and mandatory photographing. The Commission claims that the use of biometrics would facilitate the identification of asylum seekers, and therefore improve the effectiveness of the EU return policy. As Matthias Monroy, advisor to German Member of Parliament (MP) Andrej Hunko, argues, the biometric data can be taken even against the will of the people concerned. This would also apply in the case of minors.

As EDRi member Statewatch states, there will be two different search options within the proposed system. One option is to compare the picture of a person with the available personal data of the individual when checks are conducted. This option is called “1:1 matching”. The second search option is to look for a face in the database, a process known as “1:n matching”.
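The difference between the two search modes can be sketched as follows. The vectors, threshold and record IDs below are invented stand-ins for the biometric templates a real system would extract from facial images: 1:1 matching verifies a probe against one known record, while 1:n matching scans the whole database for candidates.

```python
import math

# Toy face "templates": short numeric vectors standing in for real
# biometric feature vectors (all values are invented for illustration).
database = {
    "record-001": [0.1, 0.9, 0.3],
    "record-002": [0.8, 0.2, 0.5],
}

def distance(a, b):
    """Euclidean distance between two templates."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

THRESHOLD = 0.25  # arbitrary illustrative similarity cut-off

def verify(probe, record_id):
    """1:1 matching: does the probe match one specific stored record?"""
    return distance(probe, database[record_id]) < THRESHOLD

def search(probe):
    """1:n matching: which records anywhere in the database match?"""
    return [rid for rid, template in database.items()
            if distance(probe, template) < THRESHOLD]

probe = [0.12, 0.88, 0.31]          # template extracted from a new image
print(verify(probe, "record-001"))  # True
print(search(probe))                # ['record-001']
```

The privacy implications differ sharply: 1:1 matching presupposes that the person is already identified, whereas 1:n matching turns the database into a general identification tool for any face presented to it.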

The Commission envisages that Frontex, the European border management agency, and Europol, the European police cooperation agency, will have access to all the stored information in the EURODAC. Both agencies could conduct searches based on a facial image in the future. This will bring EURODAC in line with the other systems such as the Entry/Exit System. The proposal also allows for the information to be shared with third countries where it is necessary for return purposes.

In sum, the Commission is proposing surveillance measures in an attempt to gain full control over the movement of migrants, through facial recognition software and a database filled with the faces of millions of women, men and children. Is the 30 million euro cost of this new system justified? Is the huge invasion of people’s privacy and freedom of movement justified? When such invasive tools are implemented and “normalised”, what are the chances that they won’t be rolled out ever more extensively?

European Commission’s proposal for a new Regulation on EURODAC (04.05.2016)

Facial recognition to accompany fingerprints in transnational databases (12.05.2016)

Report on the proposal on the website of Andrej Hunko, Die Linke, Germany (11.05.2016)

(Contribution by Claudius Determann, EDRi intern)



18 May 2016

Europol: Non-transparent cooperation with IT companies


Will the European Police Office’s (Europol’s) database soon include innocent people reported by Facebook or Twitter? The Europol Regulation, which was approved on 11 May 2016, not only provides a comprehensive new framework for the police agency, but also allows Europol to share data with private companies like Facebook and Twitter.

The history of Europol legislation

Europol supports Member States in more than 18 000 cross-border investigations a year. It started in 1993 as an intergovernmental Europol Drugs Unit (EDU) and became an international organisation with its own legal apparatus in 1995. A Council Decision from 2009 (371/JHA) established Europol as an EU agency. In March 2013, the European Commission proposed a reform of Europol via a draft regulation. After three years of work, the European Parliament approved the political compromise reached with the Council and the Commission on 11 May. The new Regulation will be applicable in May 2017.

What is Europol doing?

Europol serves as an information hub for the law enforcement agencies of the Member States and supports their actions in fighting thirty types of “serious crime”, including terrorism, organised crime, “immigrant smuggling”, “racism and xenophobia” and “computer crime”. Rather than being focused on specific areas of crime, Europol’s list of targets has now been expanded even further: for example, “sexual abuse and sexual exploitation, including child abuse material”, as well as “genocide and war crimes”, are also included.


In all these areas, Europol’s main task is to “collect, store, process, analyse and exchange information” that it gathers from Member States, EU bodies and “from publicly available sources, including the internet and public data”.

Unlike its American counterpart, the FBI, Europol has no executive powers. It can only notify Member States of possible criminal offences in their jurisdiction; it cannot start an investigation on its own or arrest anybody. Thus, after notification, it is up to the Member State whether to investigate or not. If a Member State decides to take action, Europol shall provide support, which can also include participation in joint investigation teams.

The Internet Referral Unit

One of the controversial parts of the new Regulation relates to Europol’s EU Internet Referral Unit (IRU). The IRU has been operative since July 2015, and is part of the newly founded European Counter Terrorism Centre (ECTC) at Europol. Among other tasks, the IRU monitors the internet looking for content that is “incompatible” with the terms of service of “online service providers” like Facebook, so that they can “voluntarily consider” what to do with it. In other words, the IRU does not assess whether the content is illegal or not, and companies are not obliged to remove illegal content. They are “encouraged” to “voluntarily” do something with the referrals. The IRU has the power to create pressure to have apparently legal content deleted by companies, but bears no responsibility or accountability for doing so. How this complies with the EU Charter’s requirement that restrictions of fundamental rights (on freedom of communication, in this case) must be “provided for by law” (rather than, for example, this kind of ad hoc, informal arrangement) is not obvious. Equally non-obvious is compliance with the requirement that such restrictions be “necessary and genuinely meet objectives of general interest”. If it were necessary to delete such content, then such content would (obviously?) be illegal, rather than just a possible breach of terms of service.

A European Commission document from April 2016 shows that the IRU has so far assessed over 4700 posts across 45 platforms and sent over 3200 referrals asking internet companies to remove content, with an effective removal rate of 91%. Companies are thus being pressured to remove content that a public authority has assessed not on the basis of the law, but on the basis of a private contract between a company and an internet user.

The whole procedure is opaque and not subject to judicial oversight. For this reason, the European Data Protection Supervisor (EDPS) had recommended that Europol at least make the list of its cooperation agreements with companies publicly available. This recommendation, however, did not make it into the final text of the new Regulation.

Europol may receive and transfer personal data

Europol can now also “receive” publicly available personal data directly from private parties like Facebook and Twitter. Previously, this was only possible via a national police unit, subject to compliance with national law. Now, a company merely has to declare that it is legally allowed to transfer the data, that the transfer “concerns an individual and specific case”, and that “no fundamental rights […] of the data subjects concerned override the public interest necessitating the transfer”. This has not been part of Europol’s rules before, and the Commission has to evaluate the practice after two years of implementation. At least until then, Europol can also feed this information, gathered on the internet, into its databases.

While “Europol shall not contact private parties to retrieve personal data”, it will now be able to transfer information to private entities. The Regulation puts in place several measures to safeguard personal data protection, but there are no transparency requirements to inform the public about any information exchange between Europol and companies. The Regulation merely empowers a “Joint Parliamentary Scrutiny Group” to request documents; how this will work remains unclear in the text, and will be governed by “working arrangements” between Europol and the Parliament. How accountability and respect for the rule of law will be ensured in this arrangement is not clear.

Text of the new Europol Regulation adopted by the Council (11.03.2016) and the European Parliament (11.05.2016)

Police cooperation: MEPs approve new powers for Europol to fight terrorism (11.05.2016)

European Data Protection Supervisor, Executive Summary of the Opinion of the EDPS on the proposal for a Regulation on Europol (31.05.2013)

Communication from the Commission, delivering on the European Agenda on Security to fight against terrorism and pave the way towards an effective and genuine Security Union (20.04.2016)

(Contribution by Fabian Warislohner, EDRi intern)



18 May 2016

Looking back through the French anti-terror arsenal

By Guest author

Following the French Government’s publication of the Action Plan Against Terrorism and Radicalisation, which summarises the country’s entire anti-terror strategy as built up law by law over the past years, it is worth looking back at the main measures presented in this report, especially those affecting civil rights and liberties on the Internet.

In the last two years, France has adopted several laws reinforcing anti-terror measures. These include access to connection data without judicial intervention (Defence Law of 2013), administrative blocking of websites, stricter sentences in some cases where an offence is committed online, remote computer searches (anti-terror law of 2014), mass surveillance (France’s Surveillance Law of 2015), including of international communications (International Surveillance Law of 2015), and a reinforcement of some of these measures after the declaration of the State of Emergency following the Paris attacks of 13 November 2015.


New measures affecting our rights are planned, such as those in the currently debated “Digital Republic Bill”, which includes plans to involve the “actors of the Internet” (such as intermediaries and platforms) in the fight against online terrorist propaganda and in the promotion of “counter-speech”.

In addition, the bill reforming the French criminal justice system extends the scope of the State of Emergency, broadly anchoring it in law. The bill criminalises the “frequent consultation of terrorist websites”. It also reinforces the administrative blocking of websites by introducing a new obstruction offence related to the blocking of terrorist websites. This bill jeopardises the balance of power between the judiciary and the executive. Drawing no constructive conclusions from the tragic events of November 2015, it is proof of a dramatic headlong rush by the French Government.

On 17 May, an amendment to the new bill for a “21st Century Justice”, tabled by the right-wing Member of Parliament (MP) Éric Ciotti, proposed linking video surveillance with facial recognition in public spaces.

The measures concerning civil rights and liberties on the Internet seriously infringe our online privacy and right to information, while undermining ex ante judicial oversight. Most of these measures are taken by administrative authorities, and therefore do not require a judge to make a legal assessment of the actions to be taken. What is more, these laws undermine the role of the so-called “investigating magistrate”, the independent French judge who leads the investigation phase, in favour of the prosecutor of the French Republic and the judge of liberties and detention, who are less independent and have less technical knowledge of the cases.

At the same time, the question of encryption is being addressed with an aggressive stance that infringes citizens’ online privacy and security. It seems clearer every day that the French Government is pursuing an offensive political agenda with regard to civil rights online.

French civil rights organisation La Quadrature du Net is working hard to raise awareness of the dangers of the laws that have been approved, and will keep challenging them by every legal means available before the French courts (the Council of State and the Constitutional Council) and the European courts (the European Court of Justice and the European Court of Human Rights).

To challenge these laws, it is of the utmost importance to understand their whole repressive architecture, which goes far beyond the “fight against terrorism”. France is currently pushing the same measures in the Directive on combating terrorism, now under discussion in the European Parliament. There is an urgent need to stop these dangerous provisions before they become binding for all Member States, and not to give in to fear, but to protect our values, our rights and liberties. The promotion of free software, end-to-end encrypted communications and decentralised solutions for citizens has never been as important as it is today to counter mass surveillance by public and private actors.

Action Plan Against Terrorism and Radicalisation (only in French, 09.05.2016)

EDRi: EDRi’s recommendations for the European Parliament’s Draft Report on the Directive on Combating Terrorism (29.03.2016)

EDRi: Countering terrorism, a.k.a. the biggest human rights threat of 2016 (20.04.2016)

(Contribution by Christopher Talib, La Quadrature du Net, France)



18 May 2016

Big Brother Awards Germany 2016

By Guest author

The annual gala for the German Big Brother Awards (BBA), organised by EDRi member Digitalcourage, was held on 22 April 2016 in Bielefeld, Germany.

English-language coverage of the event was stepped up this year. While English translations of most of the laudations have been available on the website in previous years, this year Digitalcourage’s interpreter also read out the translated speeches and live-interpreted the rest of the event. This service was made available to the audience via radio headphones and broadcast as an audio stream. A video using the English audio track has also been published.


The gala was opened with a guest speech by the former German Federal Minister for Justice, Sabine Leutheusser-Schnarrenberger. She famously resigned during her first term in office in 1996 in opposition to the conservative–liberal coalition’s decision to introduce eavesdropping on private homes. This decision, the only “conscientious resignation” of a German government minister to date, was later vindicated by the Constitutional Court, which declared the disputed law partly unconstitutional in 2004. In her second term as Justice Minister, in the conservative–liberal coalition of 2009–2013, she had a key role in preventing the re-introduction of telecommunications data retention under that government (the subsequent conservative–social-democratic coalition revived the measure in 2014/2015).

The Lifetime award went to Germany’s interior secret service, whose name literally translates as “Protection of the Constitution”, for 65 years of violating civil and privacy rights. The “Protection of the Constitution” was criticised in particular for monitoring and stigmatising groups and individuals that are critical of the state and society, for its uncontrollable system of informers, and for covering up allegedly illegal practices. Despite its history of scandals, the “Protection of the Constitution” is not being reined in, but instead upgraded and entrusted with further tasks and intelligence capabilities. One of these new plans is to take up active investigation of social networks. This plan was famously leaked last summer by the blog netzpolitik.org, which led to its founder and one other journalist briefly facing a criminal investigation for treason, until the investigation was halted and the Federal Prosecutor General was made to resign. The Lifetime award category was also chosen as the winner of the Audience Award by the guests present at the gala, with an unusually strong majority.

In the Consumer Protection category, the award went to the insurance company Generali, which promises advantages to the insured if they use an app to transfer their fitness data and shopping behaviour to the company, which in turn transfers the data to a credit-point system in South Africa. Participation in this scheme will not give customers lower insurance premiums; instead, they are incited to redirect their purchases of certain supposedly healthy products to a few branded shops. The laudation called this a “gamified” training for what would appear to be a healthy lifestyle; it also identified and strongly criticised the underlying trend of abandoning the solidarity-based model of insurance.

The award in the Economy category went to the US company and campaign platform Change.org for its business model of marketing personally identifiable information of signatories together with their political statements. As the laudation pointed out, Change.org appears to be a progressive, social project, but is really a for-profit US corporation with many deficiencies in respect of data protection law. For example, it continues to store user data in the US, relying on a reference in its privacy policy to the “Safe Harbor” framework, which the European Court of Justice ruled invalid in October 2015.

A somewhat controversial moment came when the German head of Change.org appeared at the gala, only the fourth awardee in the German BBA’s 16-year history willing to do so. He tried to circumvent the previously agreed procedure, which was that he would first receive the award and only then be invited to respond in an interview with the gala’s moderator. After a few minutes of on-stage negotiation, the award was handed to Change.org’s German press spokeswoman and time for a response was given as planned. The German head of Change.org, Gregor Hackmack, used the opportunity to point to recent successful petitions and to details of the site’s system that lets petition signatories receive newsletters, but did not address core issues such as the marketing of sensitive data or the fact that the site’s terms cite the invalid Safe Harbor framework, which they still did at the time of writing this article, more than three weeks after the BBA gala.

Next, Berlin’s Public Transport Company (BVG) was recognised in the Technology category for the “VBB Fahrcard”, a contactless chip card also known as the “(((eTicket”. The card replaces previous season tickets, but records individual journeys, a function that is unnecessary for season tickets and whose existence the transport company long concealed and allegedly lied about.

IBM Germany received the award in the “Workplace” category for its “Social Dashboard” software, an internal social network for work-related communication. It models employees’ relationships and calculates a “reputation” score, which the laudation portrayed as an attempt to control and evaluate employees’ social behaviour that will create wrong incentives and increase pressure at work.

The gala also featured a “Newspeak” section, which this year highlighted the German neologism “Datenreichtum” (“data wealth”), a term recently used by a German government minister and others in an attempt to paint the Big Data trend in more positive colours.

Reprimands went to Germany’s reform of its prostitution law (which will introduce compulsory registration and curtail the fundamental right of inviolability of the home on the mere suspicion that prostitution might be taking place), to the Google Impact Challenge (an attempt by Google to woo NGOs into handling their communication and administration with Google’s tools, and thus a scheme to tap into civil society as a new data source), and to cashless festivals (a trend in which festival-goers are compelled to wear RFID wristbands and charge them with cash before being able to buy provisions on site).

As the sole “positive” subject of the gala, Member of the European Parliament (MEP) Jan Philipp Albrecht and his team, including policy advisor Ralf Bendrath, were recognised in an honourable mention. Their work for the EU’s General Data Protection Regulation (GDPR) was praised as “seminal”, and the GDPR itself was hailed as a good start and a real chance for data protection in Europe. It was also pointed out that NGO work would have to continue to keep a close eye on how the regulation will be implemented and where it could be improved in the future.

Big Brother Awards Germany 2016

Recorded video of the gala
https://vimeo.com/163909275 (original audio)
https://vimeo.com/164733597 (English translation/interpretation)

Media coverage, reactions to the Change.org appearance (only in German)
https://digitalcourage.de/blog/changeorg-antwortet-auf-bigbrotheraward (transcript of the response given by Change.org on stage)
https://digitalcourage.de/blog/2016/es-wird-ihnen-nicht-gefallen-sie-bekommen-einen-bigbrotheraward (explanation on how Digitalcourage communicates with awardees ahead of the gala)
https://digitalcourage.de/blog/2016/besser-geht-immer (comment on the stand-off)

(Contribution by Sebastian Lisken, EDRi member Digitalcourage, Germany)