Freedom of expression

Freedom of expression is one of the key benefits of the digital era. The global information society permits interaction on a scale that was previously unheard of – promoting intercultural exchange and democracy. Consequently, the protection of this freedom is central to much of EDRi’s work.

26 May 2016

European Parliament confirms that “Privacy Shield” is inadequate

By EDRi

The European Parliament has adopted a Resolution on the “Privacy Shield”. This is the new agreement to permit data to be transferred from the EU to the USA. The previous agreement – “Safe Harbour” – was overturned by the European Court of Justice in October 2015.

The Parliament’s resolution confirms that the new agreement has no chance of being upheld, if challenged at the Court of Justice of the European Union,

said Joe McNamee, Executive Director of European Digital Rights.

It questions the legal meaning of the assurances received from the USA. It points out that indiscriminate (“bulk”) surveillance is still possible and that the new Ombudsman role is inadequate.

The Parliament adopted a similar resolution in 2000, when the illegal Safe Harbour agreement was adopted, but its recommendations were ignored for 15 years. The Parliament failed to demand meaningful improvements before adoption.

Incomprehensibly, the Parliament voted against a sunset clause that could have been a means to inspire a meaningful renegotiation. This means that the USA will have no incentive to make any concessions. This means that the fundamental rights of European citizens will be undermined until the Privacy Shield is overturned. This means that European businesses can have no legal certainty if they rely on this agreement, which is broken by design.

Under EU data protection rules, personal data can only be transferred outside the EU under certain circumstances. The EU negotiated “Safe Harbour” as a special arrangement for the USA in 2000. After the Snowden revelations in 2013, the European Commission recognised that the arrangement was inadequate. It spent two years trying and failing to bring the deal into line with the EU’s legal framework. Then, in October 2015, the Court of Justice overturned the arrangement. The “Privacy Shield” is meant to replace the “Safe Harbour”.

Read more:

Transatlantic coalition of civil society groups: Privacy Shield is not enough – renegotiation is needed (16.03.2016)
https://edri.org/transatlantic-coalition-of-civil-society-groups-privacy-shield-is-not-enough-renogitation-is-needed/

What’s behind the shield? Unspinning the “privacy shield” spin (02.03.2016)
https://edri.org/privacyshield-unspinning-the-spin/

Opinion 01/2016 on the EU–U.S. Privacy Shield draft adequacy decision (13.04.2016)
http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2016/wp238_en.pdf

Fifteen years late, Safe Harbor hits the rocks (06.10.2015)
https://edri.org/safeharbor-the-end/

25 May 2016

EU Council & Commission plan to give law enforcement authorities access to data of foreign IT companies

By EDRi

EU Commissioner Věra Jourová revealed plans to increase the competences of criminal law enforcement authorities in a speech at the European Criminal Law Academic Network. She announced that the Council of the European Union is currently drafting Conclusions. This draft document calls for law enforcement agencies to have direct cross-border access to personal data held by foreign service providers, without a mutual legal assistance procedure. This is more than worrisome for the privacy of European citizens.

Initially, Jourová pointed out that it is important to “accelerate and streamline” mutual legal assistance (MLA) requests between national authorities, in order to make it easier to collect digital evidence. However, “where mutual legal assistance is not suitable or available” (what this means is not explained – the Data Retention Directive was proposed in part because MLATs are time-consuming, yet in the last eight years their reform was not treated as a priority), it is important that certain types of data, including personal data, can be requested from IT companies directly.

Non-transparent Council discussions

Jourová says that the new data protection rules allow for smoother cooperation and exchange of information between police and justice authorities, as they provide common standards of data protection. Those rules “will ensure that personal data, for instance of victims or witnesses of crime, is properly protected”. The Council draft goes by the name “Conclusions on improving criminal justice in cyberspace”; however, no document number has been assigned yet.

Commissioner Jourová did not release further details about the draft. However, EDRi received a copy of a document by the German government (only in German, pdf) which provides more information about the content and goals of the draft Conclusions. In order to improve “criminal justice in cyberspace”, the draft proposes measures to help secure electronic evidence that would otherwise have to be deleted. This is described as especially problematic in cross-border cases (echoing the analysis from ten years ago in relation to the Data Retention Directive). The aim is to set up rules that allow short-term access to certain categories of data, in particular personal data. One way to achieve this is to improve the direct cooperation of law enforcement agencies with foreign service providers.

The Council Conclusions were also the subject of controversy in the Committee on Internal Security (COSI), with Member States expressing their concerns about the proposal. It was considered especially problematic that a mere “business link” between a provider and a state would constitute a sufficient legal basis for data requests by foreign law enforcement authorities. According to the proposal, not even a business establishment in the Member State concerned would be required to enable direct information requests. Some EU countries pointed out that their sovereignty must not be undermined by such measures. France reportedly expressed concern that it was not legal under national law for providers to give foreign law enforcement agencies data.

In the end, there was wide-ranging agreement that future discussions must include other stakeholders, such as law enforcement authorities and IT businesses.

The Conclusions will be presented at the meeting of the Justice and Home Affairs Council on 9 June. The European Commission was asked to present analysis and, where appropriate, proposals before summer 2017.

Mutual Legal Assistance Agreement between the US and the EU

In parallel to the draft Conclusions, the Council is also working on an Agreement with the United States regarding mutual legal assistance (pdf). This Agreement aims to modify and refine the old MLA Agreement, which entered into force in 2010. It aims to ensure effective cooperation between participating Member States of the EU and the US in the field of criminal justice and combating organised crime and terrorism. At the moment, the Council is waiting for the Article 36 Committee (CATS) to approve the draft text, to allow the text to be formally agreed in June 2016.

The Agreement plans to make available “the ability to obtain information on bank accounts, form Joint Investigation Teams (JITS), transmit requests using faster means of communications, obtain witness evidence by video conferencing, and secure evidence for use by administrative bodies where subsequent criminal proceedings are envisaged.”

In contrast to the Conclusions discussed above, this US-EU Agreement does not plan to bypass the current MLA system. Although the Agreement mentions the possibility of “direct access of EU-Member States to data held by Internet Service Providers”, it wants to ensure that this conduct still requires so-called “probable cause”. To meet this criterion, the government must present specific, detailed and reliable facts to a court to demonstrate that a criminal offence has been committed. Without “probable cause” it would not be possible to request data.

(Article written by Claudius Determann, EDRi intern)

25 May 2016

European fundamental rights to be regulated by companies

By EDRi

Today, on 25 May, the European Commission published two new proposals under its Digital Single Market strategy: an update of the Audiovisual Media Services Directive (AVMSD) and a Communication on online platforms, together with the evidence document accompanying the platforms Communication.

The Communication on Platforms worries us the most. For instance, the proposals with regard to the regulation of “illegal” or “harmful” content are hugely disturbing. The Commission seems willing to completely give up on the notion of law. Instead, regulation of free speech is pushed into the hands of private corporations.

Demanding that multiple companies in multiple jurisdictions arbitrarily implement whatever national law or other preferences they choose is a sure-fire way of building new barriers in the “Digital Single Market”

said Joe McNamee, Executive Director of European Digital Rights (EDRi).

The Communication repeats “voluntary measures” almost like an ideological mantra – whatever the question is, the answer is always “platforms can fix it”. What about public authorities’ responsibilities to enforce the law?

The Commission refers to the EU Internet Forum as an “important example”. Indeed it is, although not in the way the Commission meant. The EU Internet Forum is a highly non-transparent, Commission-driven project, undertaken in cooperation with exclusively US online companies (and not even all of the relevant ones!) to tackle terrorist content and hate speech online. The Commission describes it as a “multi-stakeholder” model, even though only the Commission and three US companies were involved in drafting the outcome.

The Communication also talks about limitations of liability for platforms when they take “good faith” law enforcement measures. Sound familiar? The ill-fated Stop Online Piracy Act (SOPA), which was abandoned after huge protests in the USA, proposed exactly the same thing. The good news is that, for now, no new liability measures are being proposed to coerce platforms into taking these measures. However, there are threats of “formal notice and action” procedures that make it clear that law will be used if the companies do not police European citizens extensively enough.

The “evidence” offered in today’s Communication is either misrepresented or moulded to suit pre-existing policies. The Commission appears eager to ensure that the online monopolies monitor online activity, take action to remove any content that creates legal risks for them, and arbitrarily police content to “protect” unspecified and undefined “societal values”. Not a word is wasted on review processes, effectiveness, proportionality, or possible counter-productive or anti-competitive effects.

Background information:

EDRi response to AVMS Consultation (30.09.2015)
https://edri.org/files/avmsd_response.pdf

EDRi response to Platforms Consultation (06.01.2016)
https://edri.org/files/platforms.html

The EU Platforms Consultation – Just How Biased is it (14.12.2014)
https://edri.org/eu-commission-platforms-consultation-how-biased-is-it/

Leaked EU Communication – Part 1: Privatised censorship and surveillance (26.04.2016)
https://edri.org/leaked-eu-communication-privatised-censorship-and-surveillance/

Leaked EU Communication – Part 2: Protecting Google at all costs (28.04.2016)
https://edri.org/leaked-eu-communication-part-2-protecting-google-at-all-costs/

EU Internet Forum against terrorist content and hate speech online: Document pool (10.03.2016)
https://edri.org/eu-internet-forum-document-pool/

Our overview of the Digital Single Market Communication (17.06.2015)
https://edri.org/overview-of-dsm-communication/

23 May 2016

Copyfails: Time to #fixcopyright!

By Diego Naranjo

We believe that new technologies bring new ways to access culture – they are not a threat for creators. We believe that the legitimacy crisis of the current EU copyright regime is created by the system itself. We believe there’s a need for a modernised copyright regime which takes into consideration the needs of all parts of society, including creators.

Europe needs a more profound reform of the EU copyright regime than the one that the European Commission has announced. To illustrate this, we have identified nine copyfails – crucial failures of the current EU system. You can read the first blogpost of our “copyfails” series, presenting copyfail #1, here.

The European Commission has made copyright reform part of its agenda as one of the foundations for building the Digital Single Market. However, the Communication published at the end of 2015 did not live up to the promise of a “more modern, more European” copyright framework. On the contrary, the Commission apparently only wants to paper over the serious cracks in the wobbling structure of EU copyright legislation rather than address the real problems.

Are you ready to #fixcopyright in the EU? Follow #fixcopyright on Twitter!

 

Read more:

Copyright reform: Restoring the facade of a decrepit building (16.12.2015)
https://edri.org/copyright-reform-restoring-the-facadeof-a-decrepit-building/

 

19 May 2016

Copyfail #1: Chaotic system of freedoms to use copyrighted works in the EU

By Diego Naranjo

This article is the first in a series presenting the Copyfails. The EU is reforming its copyright rules. We want to introduce the main failures of the current copyright system, with suggestions on how to fix them. You can find the nine key failures here.

COPYFAIL #1

How has it failed?

The current EU Copyright Directive outlines 21 different optional freedoms to use copyrighted works. These freedoms, called “exceptions and limitations”, specify how strict copyright rules can be avoided in certain useful circumstances, as long as this does not interfere with the exploitation of the work by the creator. This includes, for example, using copyrighted material for educational purposes, adapting it for people with disabilities, making copies of music or films for personal use, or using it for academic quotations.

Each EU country putting the Directive into practice can choose to either include or exclude any of these optional exceptions. As a result, there are literally over two million ways to implement the Directive! In a borderless, open internet, it makes no sense that a simpler approach to these flexibilities – which do not interfere with the normal exploitation of the copyrighted material – has not been adopted.
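
A quick back-of-the-envelope calculation shows where the figure comes from. If we treat each of the 21 optional exceptions as an independent yes/no choice for a Member State (a simplification, since countries can also implement exceptions partially), the number of possible national combinations is two to the power of 21:

```python
# Each of the 21 optional exceptions can be implemented or not, so a Member
# State's choice is one of 2**21 possible combinations (a simplification:
# partial implementations would make the real number even larger).
optional_exceptions = 21
combinations = 2 ** optional_exceptions
print(combinations)  # 2097152 -- "literally over two million ways"
```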

However, copyright lobbyists are vehemently opposed to any flexibility. Indeed, in 2001, when the Directive was adopted, lobbyists argued that the one mandatory exception (for incidental copies in networks) was absolutely unworkable and would create “a gaping hole in rightsholders’ protection under the reproduction right”. Fifteen years later, it is very obvious that no such “gaping hole” was created. Now, they warn again against a more flexible regime. Now, as then, they are wrong.

Why is this important?

People across the EU should be able to enjoy the same rights. Harmonisation of the copyright rules is needed to create a Digital Single Market – not the 28 markets we currently have.

The implications of copyfail #1 are huge, for example:

  • In the UK, people are not allowed to make copies of music that they legally buy.
  • In Austria and Lithuania, it is illegal to send quotations by e-mail.
  • In some countries, like France, the use of copyrighted works in schools is considerably more restricted than in others, like Estonia. The latter allows teachers, within an educational context, to quote works to any justified extent, compile works of any nature, and translate and adapt entire works, while France does not.

How to fix it?

[Infographic: tip on how to fix copyfail #1]

Read more:

Copyright combinatronics (16.11.2011)
https://edri.org/edrigramnumber9-22copyright-combinatronics/

Copyright exceptions and limitations – back to the future (25.03.2015)
https://edri.org/copyright-exceptions-and-limitations/

Copyright reform: Restoring the facade of a decrepit building (16.12.2015)
https://edri.org/copyright-reform-restoring-the-facadeof-a-decrepit-building/

 

18 May 2016

ENDitorial: Next year, you’ll complain about the Terrorism Directive

By Joe McNamee

The European Union is currently in the process of adopting a Directive on terrorism. The Directive is expected to be finalised later this year and then each Member State government will decide what it means, and will adopt national laws to put it into practice.

The European Commission wrote the draft Directive in two weeks, after the Paris attacks in November 2015, basing itself mainly on existing EU and international instruments. It was prepared without an impact assessment. An impact assessment assesses various options and the evidence available. This step is generally necessary for every piece of legislation. However, it appears it is not needed when human lives and civil liberties are at stake. Why? Because it is important to give people a feeling (even if the feeling is misguided) that the EU is doing something to “protect” them.

This is just the beginning of the political process, so the press is not interested.

Within three months, the EU Member States had adopted their informal position, or as they call it, their “general approach”. They did this in the absence of an impact assessment that would have provided the analysis to indicate what may or may not be lawful, appropriate, useful, necessary or proportionate. They added new text on blocking of websites “inciting to commit” terrorist offences, without any indication of whether the content would be legal or illegal, or why they might think that this might be a good idea. The Member States also envisage “the taking and the fixing of audio recordings in private or public vehicles and places, and of visual images of persons in public vehicles and places…”.

This is just the beginning of the political process, so the press is not interested.

The European Parliament is now preparing its position. Unlike legislation on other subjects, where relevant specialised parliamentary committees give their opinions to the Committee in charge, in this case only one Committee is assessing the Directive, because speed seems to be more important than evidence or effectiveness or even usefulness. Various Parliamentarians in the “Civil Liberties” Committee have proposed adding random additional (often incongruous or irrelevant) measures to the Directive – on blocking of websites, on banning terrorist malware (sic), on criminalising online platforms for failing to remove content on “counterfeiting trademarks” (sic), and a whole host of other measures that, similarly, are neither relevant nor based on any evidence of usefulness, effectiveness, proportionality, legality… anything.

This is just the beginning of the political process, so the press is not interested.

Now, instead of voting on the 438 amendments tabled, a small group of just eight representatives of the political groups is holding closed-door meetings to create “compromise amendments” on the controversial points. Once the political groups go through this process, a vote can happen with everyone hiding in the crowd – citizens will never know who made what proposal or supported what decision. The ensuing vote in the Civil Liberties Committee will be a mere formality, because the other politicians tend to follow the respective eight politicians who negotiated the text.

This is just the beginning of the political process, so the press is not interested.

Once the vote has happened in Committee, it does not need to be endorsed by the full Parliament. Instead, the eight Parliamentarians and Member States will start closed-door, secret meetings, together with the European Commission (known as “trilogues”) to reach a compromise between the position taken in the Committee’s so-called “orientation vote” and the Member States’ “general approach” (neither of which is binding on either institution). This process is secret, the negotiation drafts are secret and the procedure has no formal end-date.

This process is too opaque, so the press is not interested.

Once the trilogue process finishes, both the Member States and the Parliament are committed to rubber-stamping the agreement, so it is almost impossible to make any changes at this stage. Some policy-makers won’t agree on specific points, but they will nonetheless vote for the adoption of the text. If you ask them, they will tell you that it’s because of the “necessary” political trade-offs. If some politicians vote differently, they will be reprimanded by their political group.

Only at this stage is the press given press releases, a nice spin about “fighting terrorism”, and a clear end-date for the Parliament vote.

The press is interested, the process is over. It’s too late to change anything.

Next year, when your Member State starts blocking websites, without quite knowing why, when it starts imposing restrictions on Tor and proxy servers, without quite knowing why, when unaccountable, unclear legislation leads to arbitrary and discriminatory enforcement, and your government says that it is “EU law that it is obliged to implement” and you wonder why the press never reported on it, when you search in vain for who is accountable for a weak and dangerous text, come back and read this again.

In this process, EDRi uses all of its political resources to obtain documents, provide input to decision-makers, and use all possible opportunities to shine a light on closed processes and defend citizens’ rights from the dangers created by opaque and unaccountable decision-making. Thankfully, the fact that the process is rotten does not mean that the individuals involved are not open to positive and constructive input – many work hard to achieve the best possible outcome for Europe’s citizens. Even if we cannot always – or often – claim credit for the successes we have in positively influencing such processes, it is important to engage as effectively as possible, while demanding that trilogues are reformed or, ideally, abandoned.

EDRi: EDRi’s recommendations for the European Parliament’s Draft Report on the Directive on Combating Terrorism (29.03.2016)
https://edri.org/files/counterterrorism/CounterTerror_LIBEDraftReport_EDRi_position.pdf

EDRi: Countering terrorism, a.k.a. the biggest human rights threat of 2016 (20.04.2016)
https://edri.org/countering-terrorism-aka-the-biggest-human-rights-threat-of-2016/

EDRi: Trilogues: the system that undermines EU democracy and transparency (20.04.2016)
https://edri.org/trilogues-the-system-that-undermines-eu-democracy-and-transparency/

Proposal for a Directive on combating terrorism (02.12.2015)
http://ec.europa.eu/dgs/home-affairs/what-we-do/policies/european-agenda-security/legislative-documents/docs/20151202_directive_on_combatting_terrorism_en.pdf

Council General Approach on the proposed Directive on combating terrorism (11.03.2016)
http://data.consilium.europa.eu/doc/document/ST-6655-2016-INIT/en/pdf

European Parliament draft report on Directive on combating terrorism (09.03.2016)
http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-%2F%2FEP%2F%2FNONSGML%2BCOMPARL%2BPE-577.046%2B01%2BDOC%2BPDF%2BV0%2F%2FEN

European Parliament amendments tabled to Directive on combating terrorism (nos 56 to 246)(08.04.2016)
http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML+COMPARL+PE-580.621+01+DOC+PDF+V0//EN&language=EN

European Parliament amendments tabled to Directive on combating terrorism (nos 247 to 438)(08.04.2016)
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&reference=PE-580.626&format=PDF&language=EN&secondRef=01

(Contribution by Joe McNamee, EDRi)

18 May 2016

Advocate General: Dynamic IP address can be personal data

By EDRi

On 12 May, Manuel Campos Sánchez-Bordona, Advocate General (AG) at the Court of Justice of the European Union (CJEU), gave his opinion in the case Patrick Breyer v Federal Republic of Germany, C-582/14.

Patrick Breyer sued the German government for violating his right to data protection by storing data about his visits to German government websites for longer than necessary. The government’s websites keep so-called “logs” that record which dynamic IP address accessed the service, and when. Breyer claims that the storage of this data constitutes processing of personal data, which is protected under the Data Protection Directive 95/46/EC. According to the Directive, such processing of personal data is generally unlawful unless it is justified, for example by previously given consent. Germany, however, argued that the logs of its websites are essential for their functioning, as they are important for preventing abuse and prosecuting network attacks. In the ongoing procedure, the German Federal Court of Justice (BGH) eventually referred two questions to the CJEU for a preliminary ruling.

The first question addressed whether a dynamic IP address, together with the time of access, constitutes personal data under Article 2(a) of the Data Protection Directive. The question referred specifically to whether a person is already “identifiable” if the information necessary to identify them has to be provided by a third party, in this case a telecommunications company. In his opinion, the Advocate General followed Breyer’s legal submissions, stating that the definition of personal data also encompasses dynamic IP addresses, as long as an access provider holds additional information enabling the identification of the individual.
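
To illustrate the kind of linkage the Advocate General has in mind, here is a minimal, purely hypothetical sketch (the log format and the provider’s records are invented for the example): a website operator who can obtain the access provider’s dynamic IP allocation records can put a name to a log entry consisting only of an IP address and a timestamp.

```python
from datetime import datetime

# Hypothetical website access log: dynamic IP address plus time of access,
# the kind of data kept by the government websites in the Breyer case.
access_log = [
    {"ip": "203.0.113.7", "time": datetime(2016, 5, 12, 14, 3), "path": "/news"},
]

# Hypothetical allocation records held by the access provider (the third party):
# which subscriber used which dynamic IP address during which period.
isp_allocations = [
    {"ip": "203.0.113.7",
     "start": datetime(2016, 5, 12, 13, 0),
     "end": datetime(2016, 5, 12, 15, 0),
     "subscriber": "Subscriber X"},
]

def identify(log_entry):
    """Return the subscriber behind a log entry, if the ISP's data allows it."""
    for alloc in isp_allocations:
        if (alloc["ip"] == log_entry["ip"]
                and alloc["start"] <= log_entry["time"] <= alloc["end"]):
            return alloc["subscriber"]
    return None

print(identify(access_log[0]))  # "Subscriber X" -- the visitor becomes identifiable
```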

The Advocate General’s position directly contradicts the position of the Irish High Court in EMI Records & Ors -v-Eircom Ltd from 2010. In that case, Mr Justice Charleton (now elevated to the Irish Supreme Court), having explained that the internet is not “an amorphous extraterrestrial body” (paragraph 5), ruled that IP address data collected for the explicit purpose of identifying individuals in the context of copyright enforcement is not personal data.

The Advocate General’s view is also in line with an opinion from 2007 by the Article 29 Working Party (WP29), an independent body with advisory status on data protection, which was set up under the Data Protection Directive. Back then, the WP29 considered that IP addresses can be personal data, even though they do not generally identify an individual by name.

The second question concerned whether restrictions on data processing under the German Telemedia Act (§15 TMG), which amount to a prohibition of the processing, may violate Article 7(f) of the Data Protection Directive. That provision states that personal data may be processed if it is necessary for the purposes of the legitimate interests of the processor. The German Telemedia Act (TMG), however, allows the collection and use of users’ data only in limited circumstances, and not for the purpose of ensuring the general operability of the telemedia service. The AG argues that the functioning of “telemedia” (as defined in §1 TMG) constitutes a legitimate interest under the Directive, and that processing for this purpose should therefore be permitted where it prevails over the interests or fundamental rights of the person concerned.

Patrick Breyer stated in reaction to the opinion that

nobody has a right to record everything we do and say online. Generation Internet has a right to access information online just as unmonitored and without inhibition as our parents read the paper, listened to the radio or browsed books.

The European Court of Justice has not yet set a date for the final decision.

Request for a preliminary ruling from the German Bundesgerichtshof (17.12.2014)
http://curia.europa.eu/juris/document/document.jsf?text=&docid=162555&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=801346

The 2007 opinion by the Article 29 Working Party (20.06.2007)
http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2007/wp136_en.pdf

Opinion of the Advocate General (only in German, 12.05.2016)
http://curia.europa.eu/juris/document/document.jsf?text=&docid=178241&pageIndex=0&doclang=DE&mode=req&dir=&occ=first&part=1&cid=810242

EMI Records & Ors -v-Eircom Ltd (16.04.2010)
http://www.bailii.org/ie/cases/IEHC/2010/H108.html

(Contribution by Claudius Determann, EDRi intern)

18 May 2016

Hungary: New government proposals raise concerns

By Guest author

The Hungarian government is ramping up its “anti-terror” measures; a constitutional amendment establishing a new state of exception is one of the measures it deems necessary to keep the population safe.

The threat of terrorism in Hungary is considered to be low by the UK Foreign Office, the CIA, and Hungary’s Strategic Defense Research Centre. Even the Prime Minister’s chief national security advisor says they have no knowledge of terrorist plots. Despite that, the country has maintained a medium-level terror alert since the Paris attacks in November 2015. Most observers agree that any necessary reforms could be addressed by modifying the measures in the current legal framework. However, in early June 2016, the Hungarian National Assembly will vote on an anti-terrorism bill that will require a sixth amendment to a constitution that is already a patchwork of amendments.

Encryption users put on their guard

Plans to ban encryption software were abandoned after being castigated in the media. However, under the conditions specified in changes to the Act on Electronic Commerce, Internet Service Providers (ISPs) that provide encryption services would be under an obligation to store the metadata of encryption users for a year (along with the contents of their communications in the case of weaker forms of encryption); and if they refuse to share this data with an intelligence agency, they could face a fine of up to 30 000 euro. Troublingly, without the inclusion of proper safeguards, encryption users could be more susceptible to privacy abuse. While the government portal declares that the stored data would only be accessible to the secret services after judicial approval, the Hungarian Civil Liberties Union (TASZ) asserts otherwise: authorisation for such requests under current legislation is conferred not by a court, but by a government ministry.

Maximum privacy for state enterprises

Tucked away in the recesses of the 2017 Central Budget Governance Act are also some provisions that would potentially shield state-owned companies from public scrutiny. According to the proposal up for debate next week, non-disclosure protections of data relating to the assets, functioning and contracts of state companies involved in activities such as central data acquisition and telecommunications management could be exempted from Freedom of Information (FOI) laws for up to 30 years. The grounds for protections are mainly economic. The rationale: the availability of certain kinds of information could threaten the national economy by harming the interests of state companies. However, even the authority responsible for Freedom of Information, not famous for taking a strong stand against government excesses, questioned the exceptions last week, reminding the Parliament’s Budget Committee that the statutory and constitutional basis for restricting access in court, related to legitimate economic interests, does already exist.

“Terror alert”: A new state of exception

The constitutional grounds for invoking a state of emergency currently exist as well. Constitutional provisions stipulate in Articles 48-54 how authorities may respond to armed attacks, industrial disasters and other kinds of imminent dangers to national security. The absence of a clear justification for the amendment is one of the reasons why the establishment of a new form of state of emergency (literally “status of terror alert”, terrorveszély-helyzet) is controversial. According to the draft it would enable measures that would not be subject to ex post judicial control. What constitutes the grounds for invoking this status is not entirely clear either, and this is worrying since it would empower the government, if it felt it was endangered, to summon the National Defence Forces or limit civil rights.

In early 2016, when the proposed text of the constitutional amendment was only accessible to members of parliament under a pledge of secrecy, the Eötvös Károly Institute commented on a leaked copy, arguing that the legal basis for the declaration of a state of “terror alert” lacked the precision that such extensive power would require. It called on the government to conduct an evidence-based assessment of the threats that the country faced and to examine whether its needs could be addressed through parliamentary acts instead. Unlike many of the other dramatic changes to Hungary’s legal framework that have occurred in recent years, multiparty support will now be needed for most of the key measures to become law. And the latest version of the bill has been significantly tempered with this in mind; it now includes measures that would enable the National Assembly to have a say in the matter.

The Hungarian Minister of Defence maintains that the primary aim of the reforms is to protect the population more effectively. Even if he’s right, his reasons for thinking that Hungary would never cross the line into a military dictatorship are not entirely convincing: “We live in an open world”; and with a “multiparty democracy, parliamentary control and the internet” that “could never happen”. In the absence of adequate safeguards designed to discourage a head of state persuaded that this, or any other form of overreach, was necessary, the amendments that are on the table would offer an accommodating legal environment. The fact that “nobody [in the government] would want this” shouldn’t be enough to reassure a population living in a constitutional democracy. That’s what constitutional guarantees are for.

The Hungarian Parliament is About to Enact New Anti-Terror Laws (05.05.2016)
http://www.liberties.eu/en/news/hungary-new-anti-terror-laws

Communication of the National Authority for Data Protection and Freedom of Information to the Parliamentary Budget Committee (only in Hungarian, 09.05.2016)
http://bit.ly/1VXRG2H

(Contribution by Christiana Maria Mauro, EDRi Observer)

18 May 2016

EC wants to add facial recognition to transnational databases

By EDRi

On 4 May 2016, the European Commission (EC) published a proposal to recast the EURODAC Regulation. The European Automated Fingerprint Identification System (EURODAC) was initially introduced in 2003 to establish an EU asylum fingerprint database, and to share this information with national law enforcement authorities and Europol.

According to this proposal, if a person applies for asylum anywhere within the EU, their fingerprints will be taken and transmitted to the central EURODAC system. The purpose is ostensibly to help identify irregularly staying third-country nationals or stateless persons. The regulation was amended in 2013, expanding its purpose from border control to general law enforcement, especially regarding terrorism and serious crime.

In addition to the fingerprints collected so far, the new recast proposed by the European Commission introduces a new feature: facial recognition. Additionally, it includes plans for longer storage periods, an expansion of data categories and comparison capabilities, and mandatory photographing. The Commission claims that the use of biometrics would facilitate the identification of asylum seekers, and therefore improve the effectiveness of the EU return policy. As Matthias Monroy, advisor to German Member of Parliament (MP) Andrej Hunko, argues, the biometric data can be taken even against the will of the people concerned. This would also apply in the case of minors.

As EDRi member Statewatch explains, there will be two different search options within the proposed system. One option is to compare the picture of a person with that individual’s stored personal data when checks are conducted; this is called “1:1 matching”. The second option is to search the database for a face, a process known as “1:n matching”.
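
The difference between the two modes can be sketched as follows – this is an illustrative toy example with an invented similarity score and threshold, not a description of the actual EURODAC matching engine:

```python
def similarity(a, b):
    """Toy similarity score between two face templates (higher = more alike)."""
    return -sum((x - y) ** 2 for x, y in zip(a, b))

THRESHOLD = -0.05  # assumed decision threshold for declaring a "match"

def verify(probe, enrolled):
    """1:1 matching: does the probe face match this one enrolled record?"""
    return similarity(probe, enrolled) >= THRESHOLD

def identify(probe, database):
    """1:n matching: search the whole database for records matching the probe."""
    return [record_id for record_id, template in database.items()
            if similarity(probe, template) >= THRESHOLD]

# Checking one person against their own file vs. searching everyone in the database.
database = {"A123": [0.10, 0.90, 0.30], "B456": [0.80, 0.20, 0.70]}
probe_face = [0.11, 0.88, 0.31]
print(verify(probe_face, database["A123"]))  # True      -> "1:1 matching"
print(identify(probe_face, database))        # ['A123']  -> "1:n matching"
```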

The Commission envisages that Frontex, the European border management agency, and Europol, the European police cooperation agency, will have access to all the information stored in EURODAC. Both agencies could conduct searches based on a facial image in the future. This would bring EURODAC into line with other systems, such as the Entry/Exit System. The proposal also allows the information to be shared with third countries where this is necessary for return purposes.

In sum, the Commission is proposing surveillance measures in an attempt to gain full control over the movement of migrants, through facial recognition software and a database filled with the faces of millions of women, men and children. Is the 30 million euro cost of this new system justified? Is the huge invasion of people’s privacy and freedom of movement justified? When such invasive tools are implemented and “normalised”, what are the chances that they won’t be rolled out more and more extensively?

European Commission’s proposal for a new Regulation on EURODAC (04.05.2016)
http://ec.europa.eu/transparency/regdoc/rep/1/2016/EN/1-2016-272-EN-F1-1.PDF

Facial recognition to accompany fingerprints in transnational databases (12.05.2016)
http://statewatch.org/news/2016/may/eu-facial-recognition.html

Report on the proposal on the website of Andrej Hunko, Die Linke, Germany (11.05.2016)
http://www.andrej-hunko.de/7-beitrag/3103-eu-adds-facial-recognition-capabilities-to-police-databases

(Contribution by Claudius Determann, EDRi intern)

18 May 2016

Europol: Non-transparent cooperation with IT companies

By EDRi

Will the European Police Office’s (Europol’s) database soon include innocent people reported by Facebook or Twitter? The Europol Regulation, which was approved on 11 May 2016, not only provides a comprehensive new framework for the police agency, but also allows Europol to share data with private companies like Facebook and Twitter.

The history of Europol legislation

Europol supports Member States in more than 18 000 cross-border investigations a year. It started in 1993 as the intergovernmental Europol Drugs Unit (EDU) and became an international organisation with its own legal apparatus in 1995. A Council Decision from 2009 (2009/371/JHA) established Europol as an EU agency. In March 2013, the European Commission proposed a reform of Europol via a draft regulation. After three years of work, the European Parliament approved the political compromise reached with the Council and the Commission on 11 May. The new Regulation will be applicable from May 2017.

What is Europol doing?

Europol serves as an information hub for Member States’ law enforcement agencies and supports their actions in fighting thirty types of “serious crime”, including terrorism, organised crime, “immigrant smuggling”, “racism and xenophobia” and “computer crime”. Rather than focusing on specific areas of crime, Europol’s list of targets has now been expanded even further. For example, “sexual abuse and sexual exploitation, including child abuse material”, as well as “genocide and war crimes”, are also included in the list.

In all these areas, Europol’s main task is to “collect, store, process, analyse and exchange information” that it gathers from Member States, EU bodies and “from publicly available sources, including the internet and public data”.

Unlike its American equivalent, the FBI, Europol has no executive powers. It can only notify Member States of possible criminal offences in their jurisdiction; it cannot start an investigation on its own or arrest anybody. Thus, after notification, it is up to the Member State whether to investigate or not. If a Member State decides to take action, Europol shall provide support, which can also include participation in joint investigation teams.

The Internet Referral Unit

One of the controversial parts of the new Regulation is related to Europol’s EU Internet Referral Unit (IRU). The IRU has been operative since July 2015, and is part of the newly founded European Counter Terrorism Centre (ECTC) at Europol. Among other tasks, the IRU monitors the internet looking for content that is “incompatible” with the terms of service of “online service providers” like Facebook, so that they can “voluntarily consider” what to do with it. In other words, the IRU does not assess whether the content is illegal or not, and companies are not obliged to remove illegal content. They are “encouraged” to “voluntarily” do something with the referrals. The IRU has the power to create pressure to have apparently legal content deleted by companies, but bears no responsibility or accountability for doing so. How this complies with the EU Charter’s requirement that restrictions of fundamental rights (on freedom of communication in this case) must be “provided for by law” (rather than, for example, this kind of ad hoc, informal arrangement) is not obvious. Equally non-obvious is compliance with the requirement that such restrictions be “necessary and genuinely meet objectives of general interest”. If it were necessary to delete such content, then such content would (obviously?) be illegal, rather than just a possible breach of terms of service.

A document from the European Commission from April 2016 shows that the IRU has so far assessed over 4700 posts across 45 platforms and sent over 3200 referrals for internet companies to remove content, with an effective removal rate of 91%. This means that companies are receiving pressure to remove content that a public authority has assessed, not on the basis of the law, but on the basis of a private contract between a company and an internet user.

The whole procedure is non-transparent and is not subject to judicial oversight. In this regard, the European Data Protection Supervisor (EDPS) had recommended that Europol at least make the list of its cooperation agreements with companies publicly available. However, this recommendation was not taken into account and did not make it into the final text of the new Regulation.

Europol may receive and transfer personal data

Europol can also “receive” publicly available personal data directly from private parties like Facebook and Twitter. Previously, this was only possible via a national police unit, assuming compliance with national law. Now, a company has to declare that it is legally allowed to transfer the data and that the transfer “concerns an individual and specific case”, while “no fundamental rights […] of the data subjects concerned override the public interest necessitating the transfer”. This has not been part of Europol’s rules before, and the Commission has to evaluate the practice after two years of implementation. At least until then, Europol can also feed this information from the internet into its databases.

While “Europol shall not contact private parties to retrieve personal data”, it will now be able to transfer information to private entities. The Regulation puts in place several measures to safeguard personal data protection. However, there are no transparency requirements to inform the public about any type of information exchange between Europol and companies. The Regulation only empowers a “Joint Parliamentary Scrutiny Group” to request documents; however, this remains rather unclear in the text and will be governed by “working arrangements” between Europol and the Parliament. How accountability and respect for the rule of law will be ensured in this arrangement is not clear.

Text of the new Europol Regulation adopted by the Council (11.03.2016) and the European Parliament (11.05.2016)
http://data.consilium.europa.eu/doc/document/ST-14957-2015-REV-2/en/pdf

Police cooperation: MEPs approve new powers for Europol to fight terrorism (11.05.2016)
http://www.europarl.europa.eu/news/en/news-room/20160504IPR25747/Police-cooperation-MEPs-approve-new-powers-for-Europol-to-fight-terrorism

European Data Protection Supervisor, Executive Summary of the Opinion of the EDPS on the proposal for a Regulation on Europol (31.05.2013)
http://eur-lex.europa.eu/legal-content/EN/AUTO/?uri=uriserv:OJ.C_.2014.038.01.0003.01.ENG&toc=OJ:C:2014:038:TOC

Communication from the Commission, delivering on the European Agenda on Security to fight against terrorism and pave the way towards an effective and genuine Security Union (20.04.2016)
http://ec.europa.eu/dgs/home-affairs/what-we-do/policies/european-agenda-security/legislative-documents/docs/20160420/communication_eas_progress_since_april_2015_en.pdf

(Contribution by Fabian Warislohner, EDRi intern)
