Privacy

The right to privacy is crucial for our personal security, for free speech and for democratic participation. It is a fundamental right in the primary law of the European Union and is recognised in numerous international legal instruments. Digital technologies have generated a new environment of potential benefits and threats to this fundamental right. As a result, defending our right to privacy is at the centre of EDRi’s priorities.

27 May 2020

Hungary: “Opinion police” regulate Facebook commentaries

By Guest author

There have been a number of critical news reports from around the world stating that Hungary’s COVID-19 state-of-emergency legislation is “creating a chilling effect”. Such headlines require further explanation, as the situation is far from new. Individuals who cross government authorities, or their allies and supporters, with public or private expressions of criticism have been losing their positions for over a decade, and any chilling effects that successive governments have had on citizens’ behaviour were apparent long before the current regime. The population has been socialised over the course of generations to keep their political opinions to themselves, and to steer clear of identifying with any activity, or individual, deemed “risky”. In Hungary, the threshold for what is considered risky has been particularly low for some time.

What qualifies as news is the sustained media attention that the chilling effect in Hungary has received over the past two weeks. Hungary’s COVID-19 emergency legislation has attracted intense scrutiny nationally and globally.

The media furore was triggered by the detention of two people by local police authorities due to statements posted on Facebook that allegedly posed the risk of “alarming the population” or “interfering with public protection” during the crisis.

In a blog addressing the highly publicised cases, the Hungarian Civil Liberties Union (HCLU) maintains that existing legislation could have been used to tackle the problem of the publication and dissemination of false information. The blog’s headline, “The opinion police are at the door”, alludes to the legendary Socialist-era terror of a doorbell sounding in the middle of the night.

Legal retaliation

The first case involved an individual in Eastern Hungary who was detained for hours for “publishing false facts on a social media site”. The “alarmist content” consisted of disapproval of the lockdown policy with additional remarks, presumably addressed to the Prime Minister (“You’re a merciless tyrant, but bear in mind that dictators invariably fall”). The man recalls half a dozen law enforcement officers arriving at his home on May 12. The charges were dropped, but the police posted a widely viewed video on their YouTube channel showing the man being removed from his home and placed into a police vehicle.

The following day, the home of an opposition-party member in the south of the country was raided at dawn with a heavy police presence; his communication devices were confiscated and he was detained for four hours at police headquarters for having shared a post from an opposition MEP in a closed Facebook group. The post criticised a controversial government decision to empty thousands of hospital beds across the country to free them for potential Coronavirus patients; the man had remarked that in his town of Gyula “1,170 [hospital] beds were freed as well”. The fact was not in dispute, and the Facebook user from Gyula was not charged with a criminal offence.

Social media users warned of “continual monitoring”

An announcement posted online by the authorities in one Eastern county of Hungary explicitly alerted social media users to the fact that the police are “continually monitoring the internet,” Politico reports. According to the National Law Enforcement website, 87 people have been targeted in criminal investigations in connection with the COVID-19 measures. Of these cases, only six have reached the prosecution phase.

What’s in the public interest?

Back at the end of March, when the Hungarian Parliament passed a bill introducing emergency powers without a sunset clause, the move garnered a surprising amount of coverage and criticism. Of particular concern was an amendment to the Criminal Code introducing prison terms of up to five years for individuals convicted of “distorting the truth” or “spreading falsehoods” connected to the Coronavirus pandemic. As in many countries worldwide, the stated aim of the legislation was to protect the public during the pandemic, but anecdotal reports suggest that it is often the authorities themselves, rather than the public, who stand to gain from special protection.

The daily newspaper Népszava reported that on May 19 the Parliament adopted a 160-page bill which will grant the National Security Services (NSS) a mandate that could entail major data security risks. The NSS will be empowered to “monitor the content of electronic communications networks” at the local and federal level of government to prevent cyber attacks.

News website 444 suggests that a state surveillance system is being established with the passage of this law. In effect, the secret service will be given access to all public data, including tax, social security, health, and criminal records.

Another controversial aspect of the legislative package is that the contact data of persons interrogated over the course of a criminal investigation could be retained by the authorities for up to twenty years, even if the suspect is found innocent.

Privacy experts say that the legislation does not offer sufficient data protection safeguards. The Head of the National Authority for Data Protection and Freedom of Information (NAIH), Attila Péterfalvi, has written to the State Secretary of the Ministry of Interior, expressing concern that “unlimited surveillance (…) will not allow for special protection of personal data.”

From state of emergency to surveillance state?

In a resolution adopted by the European Parliament in late April, Hungary was sharply criticised for its COVID-19 measures. Limits to free speech under an indefinite state of emergency were said to be “totally incompatible with European values”. The Minister of Justice announced recently that a bill revoking emergency powers is expected to be adopted on June 20. While the government is already declaring victory on the public relations front and calling for “apologies” from Brussels, the contents of the bill are still unknown. Observers suspect that the government will annul the “state of emergency” while preserving many of the emergency powers.

Read more:

The Impact of Covid-19 Measures on Democracy, the Rule of Law and Fundamental Rights in the EU (23.04.2020)
https://www.europarl.europa.eu/RegData/etudes/BRIE/2020/651343/IPOL_BRI(2020)651343_EN.pdf

Jourová: Commission looking at Hungary’s emergency changes to labour code and GDPR (15.05.2020)
https://www.euractiv.com/section/justice-home-affairs/news/jourova-commission-looking-at-hungarys-emergency-changes-to-labour-code-and-gdpr

Hungary’s Government Using Pandemic Emergency Powers To Silence Critics (18.05.2020)
https://www.techdirt.com/articles/20200514/17321444504/hungarys-government-using-pandemic-emergency-powers-to-silence-critics.shtml

(In Hungarian) Mindent is megtudhatnak ezután a nemzetbiztonsági szolgálatok az emberekről (19.05.2020)
https://nepszava.hu/3078579_mindent-is-megtudhatnak-ezutan-a-nemzetbiztonsagi-szolgalatok-az-emberekrol

Open Letter: Commission Has Clear Legal Grounds to Pursue Hungary to Protect Free Speech and Privacy (15.05.2020)
https://www.liberties.eu/en/news/hu-ngo-open-letter-jourova-reynerds/19272

(Contribution by Christiana Mauro, EDRi observer)

27 May 2020

German Constitutional Court stops mass surveillance abroad

By Gesellschaft für Freiheitsrechte

The German Federal Intelligence Service (BND) has so far been able to spy on foreign citizens abroad en masse and without cause – even on sensitive groups such as journalists. In response, EDRi member Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights), alongside five media organisations, filed a constitutional complaint against the BND law that allowed this surveillance to occur. On May 19, 2020, the German Constitutional Court made clear that the BND may not carry out mass surveillance abroad, and that it is bound by the German Constitution (Basic Law) even in relation to foreign citizens and cross-border communications.

With regard to their activities within Germany, German authorities – such as the BND – are naturally bound by the Basic Law, the constitution of the Federal Republic of Germany. When acting abroad, however, a 2017 change in the law allowed the BND to act with seemingly limitless power: it could monitor the telecommunications of foreigners abroad without any specific restrictions. Within German territory, such surveillance would be a clear violation of Article 10 of the Basic Law, which protects the freedom of communications. The 2017 BND law, however, assumed that the secret service, when acting outside of German territory, is not bound by the Basic Law.

The BND law thus created considerable risks for foreign journalists who rely on trust and confidentiality when communicating with their sources. In response to the significant threats to constitutional rights created by the BND law, several journalists—supported by the GFF and partner organizations—filed a complaint in the German Constitutional Court (Bundesverfassungsgericht). This complaint led to a landmark decision regarding the protection of fundamental rights and freedom of the press.

New standards for the work of the BND

The ruling of the Constitutional Court is of fundamental importance: it definitively establishes that German authorities are required to protect the fundamental rights contained in the Basic Law abroad.

“This statement was long overdue and is a great success that goes far beyond this specific case,” says Ulf Buermeyer, Chairman of GFF. “The fact that German authorities are also bound by the Basic Law abroad considerably strengthens human rights worldwide—as well as Germany’s credibility in the world”.

According to the Constitutional Court’s interpretation of the Basic Law, monitoring communications abroad without cause is only permissible in very limited circumstances. In addition, vulnerable groups of people such as journalists must be given special protection. The targeted surveillance of individuals must be subject to stricter limitations. The court also noted that the BND’s surveillance practices should be monitored by financially independent counsels.

This decision sends an international signal

For the first time in over 20 years, the Federal Constitutional Court has issued a decision regarding BND surveillance. The Court’s ruling is a landmark decision with international significance. In 2013, Edward Snowden’s NSA disclosures revealed a global system of mass surveillance in which Germany – particularly the BND – participated. Now, more than seven years after the NSA revelations, Germany’s highest court has ruled that international surveillance must also be in accordance with the German Basic Law. This ruling sends an international signal and could affect the surveillance activities of other countries’ intelligence services.

Read more:

In their current form, surveillance powers of the Federal Intelligence Service regarding foreign telecommunications violate fundamental rights of the Basic Law (19.05.2020)
https://www.bundesverfassungsgericht.de/SharedDocs/Pressemitteilungen/EN/2020/bvg20-037.html

We have filed a lawsuit against the BND law – No Trust No News
https://notrustnonews.org/?lang=en

BND law (06.11.2016)
https://freiheitsrechte.org/bnd-law/

About GFF
https://freiheitsrechte.org/english/

(Contribution by Gesellschaft für Freiheitsrechte, EDRi member from Germany)

11 Mar 2020

Accountable Migration Tech: Transparency, governance and oversight

By Petra Molnar

Migration continues to dominate headlines around the world. In the currently deteriorating situation at the border between Greece and Turkey, for example, where increasingly repressive measures are reportedly being used to turn people away, new technologies already play a part in border surveillance and decision-making.

Our previous two blog posts explored how far-reaching migration control technologies actually are. From refugee camps to border spaces to immigration hearing rooms, we are seeing the rise of automated decision-making tools replacing human officers in decisions about people’s migration journeys. The use of these technologies also opens the door to violations of migrants’ rights.

How are these technologies of migration control impacting fundamental rights and what can we do about it?

Life and liberty

We should not underestimate the far-reaching impacts of new technologies on the lives and security of people on the move. The right to life and liberty is one of the most fundamental internationally protected rights, and is highly relevant in migration and refugee contexts. Multiple technological experiments already impinge on the right to life and liberty. The starkest example is the denial of liberty when people are placed in detention. Immigration detention is highly discretionary. The justification of increased incarceration on the basis of algorithms that have been tampered with, as at the US-Mexico border, shows just how far states are willing to go in justifying incursions on basic human rights under the guise of national security and border enforcement. Errors, mis-calibrations, and deficiencies in training data can result in profound infringements of the safety, security, and liberty of migrants when they are placed in unlawful detention. For example, aspects of training data which are mere coincidences in reality may be treated as relevant patterns by a machine-learning system, leading to outcomes which are considered arbitrary, as the sketch below illustrates. This is one reason why the EU General Data Protection Regulation (GDPR) requires the ability to demonstrate that the correlations applied in algorithmic decision-making are “legitimate justifications for the automated decisions”.
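
To make the point about coincidental patterns concrete, here is a minimal, hypothetical sketch (synthetic data only; scikit-learn and NumPy assumed, with invented features – an illustration, not any real border-control system) of a model that learns a spurious proxy during training and fails once the coincidence no longer holds:

```python
# Minimal, hypothetical sketch: a machine-learning model can treat a mere
# coincidence in its training data as a relevant pattern. Synthetic data only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000

# A genuinely relevant feature, weakly related to the outcome
relevant = rng.normal(size=n)
outcome = (relevant + rng.normal(scale=2.0, size=n) > 0).astype(int)

# A coincidental feature: unrelated in reality, but an almost perfect proxy
# for the outcome in this particular training sample (a collection artefact)
coincidental = outcome + rng.normal(scale=0.1, size=n)

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(np.column_stack([relevant, coincidental]), outcome)
print("feature importances:", model.feature_importances_)  # proxy dominates

# At decision time the coincidence no longer holds: the proxy is pure noise
new_relevant = rng.normal(size=n)
new_outcome = (new_relevant + rng.normal(scale=2.0, size=n) > 0).astype(int)
new_X = np.column_stack([new_relevant, rng.normal(size=n)])
print("accuracy on new data:", (model.predict(new_X) == new_outcome).mean())
# Accuracy collapses to roughly chance: the learned "pattern" was arbitrary.
```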

Equality rights and freedom from discrimination

Equality and freedom from discrimination are integral to human dignity, particularly in situations where negative inferences against marginalised groups are frequently made. Algorithms are vulnerable to the same decision-making concerns that plague human decision-makers: transparency, accountability, discrimination, bias, and error. The opaque nature of immigration and refugee decision-making creates an environment ripe for algorithmic discrimination. Decisions in this system – from whether a refugee’s life story is “truthful” to whether a prospective immigrant’s marriage is “genuine” – are highly discretionary, and often hinge on assessment of a person’s credibility. In the experimental use of AI lie detectors at EU airports, what will constitute truthfulness and how will differences in cross-cultural communication be dealt with in order to ensure that problematic inferences are not encoded and reinforced into the system? The complexity of migration – and the human experience – is not easily reducible to an algorithm.

Privacy rights

Privacy is not only a consumer or property interest: it is a human right, rooted in foundational democratic principles of dignity and autonomy. We must consider the differential impacts of privacy infringements when looking at the experiences of people on the move. If collected information is shared with the repressive governments from whom refugees are fleeing, the ramifications can be life-threatening. Or, if automated decision-making systems designed to predict a person’s sexual orientation are infiltrated by states targeting the LGBTQI+ community, discrimination and threats to life and liberty will likely occur. A facial recognition algorithm developed at Stanford University has already tried to discern a person’s sexual orientation from photos. This use of technology has particular ramifications in the refugee and immigration context, where asylum applications on sexual orientation grounds often hinge on proving one’s persecution in terms of outdated tropes around non-heteronormative behaviour. This is why protecting people’s privacy is paramount for their safety, security, and well-being.

Procedural justice

When we talk about the human rights of people on the move, we must also consider procedural justice principles that affect how a person’s application is reviewed and assessed, and what due process looks like in an increasingly automated context.

For example, in immigration and refugee decision-making, procedural justice dictates that the person affected by administrative processes has a right to be heard, the right to a fair, impartial and independent decision-maker, the right to reasons – also known as the right to an explanation – and the right to appeal an unfavourable decision. However, it is unclear how administrative law will handle the augmentation or even replacement of human decision-makers by algorithms.

While these technologies are often presented as tools to be used by human decision-makers, the line between machine-made and human-made decision-making is often unclear. Given the persistence of automation bias, or the predisposition towards considering automated decisions as more accurate and fair, what rubrics will human decision-makers use to determine how much weight to place on the algorithmic predictions, as opposed to any other information available to them, including their own judgment and intuition? When things go wrong and you wish to challenge an algorithmic decision, how will we decide what counts as a reasonable decision? It’s not clear how tribunals and courts will deal with automated decision-making, what standards of review will be used, and what redress or appeal will look like for people wishing to challenge incorrect or discriminatory decisions.

What we need: Context-specific governance and oversight

Technology replicates power in society, and its benefits are not experienced equally. Yet no global regulatory framework currently exists to oversee the use of new technologies in the management of migration. Much of this technological development occurs in so-called “black boxes”, where intellectual property laws and proprietary considerations shield the public from fully understanding how the technology operates.

While conversations around the ethics of Artificial Intelligence (AI) are taking place, ethics do not go far enough. We need a sharper focus on oversight mechanisms grounded in fundamental human rights that recognise the high-risk nature of developing and deploying technologies of migration control. Affected communities must also be involved in these conversations. Rather than developing more technology “for” or “about” refugees and migrants and collecting vast amounts of data, people who have themselves experienced displacement should be at the centre of discussions on when and how emerging technologies should be integrated into refugee camps, border security, or refugee hearings – if at all.
As a starting point, states and international organisations developing and deploying migration control technologies should, at a minimum:

  • commit to transparency and report publicly what technology is being developed and used and why;
  • adopt binding directives and laws that comply with internationally protected fundamental human rights obligations and that recognise the high-risk nature of migration control technologies;
  • establish an independent body to oversee and review all use of automated technologies in migration management;
  • foster conversations between policymakers, academics, technologists, civil society, and affected communities on the risks and promises of using new technologies.

Stay tuned for updates on our AI and migration project over the next couple of months as we document the lived experiences of people on the move who are affected by technologies of migration control. If you are interested in finding out more about this project or have feedback and ideas, please contact petra.molnar [at] utoronto [dot] ca.

Mozilla Fellow Petra Molnar joins us to work on AI & discrimination (26.09.2019)
https://edri.org/mozilla-fellow-petra-molnar-joins-us-to-work-on-ai-and-discrimination/

The human rights impacts of migration control technologies (12.02.2020)
https://edri.org/the-human-rights-impacts-of-migration-control-technologies/

Immigration, iris-scanning and iBorderCTRL (26.02.2020)
https://edri.org/immigration-iris-scanning-and-iborderctrl/

Introducing De-Carceral Futures: Bridging Prison and Migrant Justice – Editors’ Introduction: Detention, Prison, and Knowledge Translation in Canada and Beyond
http://carfms.org/introducing-de-carceral-futures/

The Privatization of Migration Control (24.02.2020)
https://www.cigionline.org/articles/privatization-migration-control

Law and Autonomous Systems Series: Automated Decisions Based on Profiling – Information, Explanation or Justification? That is the Question! (27.04.2018)
https://www.law.ox.ac.uk/business-law-blog/blog/2018/04/law-and-autonomous-systems-series-automated-decisions-based-profiling

Briefing: A manufactured refugee crisis at the Greek-Turkish border (04.03.2020)
https://www.thenewhumanitarian.org/analysis/2020/03/04/refugees-greece-turkey-border

Clearview’s Facial Recognition App Has Been Used By The Justice Department, ICE, Macy’s, Walmart, And The NBA (27.02.2020)
https://www.buzzfeednews.com/article/ryanmac/clearview-ai-fbi-ice-global-law-enforcement

Why faces don’t always tell the truth about feelings (26.02.2020)
https://www.nature.com/articles/d41586-020-00507-5

(Contribution by Petra Molnar, Mozilla Fellow, EDRi)

26 Feb 2020

Swedish law enforcement given permission to hack

By Dataskydd.net

On 18 February 2020, the Swedish parliament passed a law that enables Swedish law enforcement to hack into devices such as mobile phones and computers that the police think a suspect might use. As with the recent new data retention law, only one party (and one member of another party) voted against the resolution (286-26, with 37 absent). The previous data retention law was struck down, and given the direction of the recent Court of Justice of the European Union (CJEU) Advocate General (AG) Opinions on data retention, the current data retention law is likely to be struck down as well.

What capabilities does this give law enforcement agencies?

For crimes that “under the circumstances” can reasonably give at least a two-year prison sentence, law enforcement agencies (LEAs) can request a court warrant to hack into the suspect’s device. This warrant can be given to gather information (for example from encrypted messaging apps) or even in some cases to stop information from being sent from that device.

The law has a number of serious issues that were pointed out to lawmakers over the several years in which it went through the public inquiry phase. For example, the law does not require that the crime carry a minimum sentence of two years in prison: if the prosecutor merely believes that the suspected crime might carry two or more years in prison, that already gives LEAs the legal basis to ask for a court warrant.

Even more worryingly, citizens who are not suspected of having committed any crime, but who are associated with a suspect, might be targets of hacking by the police. The law gives LEAs a mandate to hack devices that they reasonably think a suspect might use. So if a suspect might use their mother’s phone, for example, that device is open to hacking. If you are someone the police think their suspect will call or message, your phone might also be in danger of being hacked, just because you happen to know someone the police suspect of a crime. The police can also be allowed to use hacking to find a suspect – in other words, simply being in the wrong place at the wrong time might expose your devices to police hacking.

The law also includes a clause stating that if the prosecutor believes the courts will be too slow to issue a warrant, he or she can issue it instead. A warrant issued this way must then be reviewed by the court; if the court finds that it was wrongly issued, any evidence gathered cannot be used against the suspect. Of course, the person whose device was hacked (who might not even be suspected of a crime) has already had their privacy breached, and the law does not provide any recourse for such abuses.

The new law goes into effect on 1 April 2020 and will be valid for five years, after which the Swedish parliament will decide whether to make it permanent.

Dataskydd.net
https://dataskydd.net/english/

AG’s Opinion: Mass retention of data incompatible with EU law (29.01.2020)
https://edri.org/ag-opinion-mass-retention-of-data-incompatible-with-eu-law/

Proposal for the police hacking law 2019/20:64 (only in Swedish)
https://data.riksdagen.se/fil/8AB041AD-9F29-4602-8630-1AB528FA4673

Dataskydd.net’s statement on the report proposing the new law (only in Swedish)
https://www.regeringen.se/493a2f/contentassets/32d970c3c63140d68350d964dccffb51/39.-dataskydd.net.pdf

(Contribution by Eric Skoglund, EDRi observer Dataskydd.net)

26 Feb 2020

ECtHR: UK Police data retention scheme violated the right to privacy

By Laureline Lemoine

On 13 February 2020, the European Court of Human Rights (ECtHR) issued its judgment in the case Gaughran v. The United Kingdom (UK), on the indefinite retention of personal data (DNA profile, fingerprints and photograph) of a man who had a spent conviction. The Court ruled that in the case of the applicant, the retention at issue constituted a disproportionate interference and therefore a violation of his right to respect for private life (Art. 8 of the European Convention on Human Rights) since the interference could not be regarded as necessary in a democratic society.

The Court based much of its reasoning on the S. and Marper v. The UK (2008) case, where it found that the blanket and indiscriminate nature of the powers of retention of the fingerprints, cellular samples and DNA profiles of the applicants, suspected but not convicted of offences, as set out in UK law, breached their right to respect for private life. Subsequently, the relevant legislation was amended in England and Wales, but not in Northern Ireland, where the applicant in the Gaughran case committed the criminal offence for which he was convicted.

Indefinite retention regime and lack of necessary and relevant safeguards

According to the ECtHR case law, the Court does not necessarily or solely focus on the duration of the retention period to assess the proportionality of a measure. The Court will rather look at whether the regime takes into account the seriousness of the offending and the need to retain the data, as well as the safeguards available to the individual.

In this case, because the UK chose to put in place a regime of indefinite retention, the Court argues that there is therefore “a need for the State to ensure that certain safeguards were present and effective for the applicant” (para. 94).

The Court pointed out in this regard the lack of any relevant safeguards. First, the biometric data and the photograph of the applicant were retained “without reference to the seriousness of his offence and without regard to any continuing need to retain that data indefinitely”. Moreover, the Court noted the absence of any real possibility of review for the individual, given that the police can only delete the data in exceptional circumstances (para. 94).

The UK overstepped the acceptable margin of appreciation

Part of the proportionality assessment also involves looking at the margin of appreciation of the state. As with the national security argument, a wide margin of appreciation is often invoked by governments to justify measures interfering with fundamental rights.

The Court found the margin to be considerably narrower than the UK claimed, and not wide enough to conclude that the retention was proportionate. Contrary to what the UK stated, the Court found (para. 82-84) that the majority of states have regimes with a defined retention period, and that the UK is one of the few Council of Europe jurisdictions to permit indefinite retention of the DNA profiles, fingerprints and photographs of convicted persons.

Moreover, the UK claimed that indefinite retention measures were relevant and sufficient as the more data is retained, the more crime is prevented. This dangerous and false narrative was challenged by the Court as this would justify the “storage of information on the whole population and their deceased relatives, which would most definitely be excessive and irrelevant” (para. 89).

Beware of evolving technologies

One of the UK’s arguments regarding the proportionality of the measure was that the applicant’s photograph was held on a local database and could not be searched against other photographs. However, the technology has developed since then, and the police are now able to apply facial recognition and facial mapping techniques to the applicant’s photograph by uploading it to a national database.
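
To illustrate what changes once a retained photograph can be uploaded to a national database, here is a minimal, hypothetical sketch of an embedding-based face search. The embedding function, database size and threshold are invented for illustration; real systems use learned face-embedding models rather than random vectors:

```python
# Minimal, hypothetical sketch of facial-recognition search: one retained
# image becomes a probe that is compared against an entire national database.
import numpy as np

rng = np.random.default_rng(1)
DIM = 128  # typical face-embedding dimensionality; value assumed

def random_embedding() -> np.ndarray:
    """Stand-in for a learned face-embedding model; returns a unit vector."""
    v = rng.normal(size=DIM)
    return v / np.linalg.norm(v)

# A database of embeddings indexed by record number (all synthetic)
database = {record: random_embedding() for record in range(100_000)}

def search(probe: np.ndarray, threshold: float = 0.8) -> list:
    """Return every record whose cosine similarity to the probe exceeds the threshold."""
    return [r for r, e in database.items() if float(probe @ e) >= threshold]

# The retained photograph of one person (record 4242), re-captured with noise
probe = database[4242] + rng.normal(scale=0.05, size=DIM)
probe /= np.linalg.norm(probe)

print("matches:", search(probe))  # typically [4242]: one photo singles out one person
```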

The potential use of facial recognition was influential in determining whether there had been an interference with the right to privacy. The Court also highlighted the risk of such evolving technologies, in relation to state powers. In this regard, the Court stresses the need to examine compliance with the right to privacy when “obscure powers” are vested in a state, “creating a risk of arbitrariness, especially where the technology available is continually becoming more sophisticated” (para. 88).

Because the applicant’s personal data had been retained indefinitely without consideration of the seriousness of his offence, without regard to any continuing need for indefinite retention, and without any real possibility of review, the Court held, unanimously, that there had been a violation of Article 8 (right to respect for private and family life) of the European Convention on Human Rights.

This judgment is especially relevant because it shows that blanket data retention policies without any safeguards breach the right to privacy of individuals, even when measures are considered to fall within the state’s discretion. This judgment could also impact ongoing discussions in the EU around future data retention legislation, as well as ongoing cases in the Court of Justice of the EU.

Judgment – Case of Gaughran v. The United Kingdom
https://hudoc.echr.coe.int/fre#{%22itemid%22:[%22001-200817%22]}

Press release: Indefinite retention of DNA, fingerprints and photograph of man convicted of drink driving breached his privacy rights (13.02.2020)
https://hudoc.echr.coe.int/fre#{%22itemid%22:[%22003-6638275-8815904%22]}

Data retention: “National security” is not a blank cheque (29.01.2020)
https://edri.org/data-retention-national-security-is-not-a-blank-cheque/

Indiscriminate data retention considered disproportionate, once again (15.01.2020)
https://edri.org/indiscriminate-data-retention-considered-disproportionate-once-again/

(Contribution by Laureline Lemoine, EDRi)

12 Feb 2020

Digitalcourage fights back against data retention in Germany

By Digitalcourage

On 10 February 2020, EDRi member Digitalcourage published the German government’s plea in the data retention case at the European Court of Justice (ECJ). Dated 9 September 2019, the government’s document explains the use of retained telecommunications data by secret services, addresses the question of whether the 2002 ePrivacy Directive applies to various forms of data retention and which exceptions from human rights protections apply to secret service operations, and justifies its plans to use data retention to solve a broad range of crimes with the example of the abduction of a Vietnamese man in Berlin by Vietnamese agents. However, this case is very specific, and even if the retained data proved “useful” there, that is not a valid legal basis for mass data retention and therefore cannot justify drastic incisions into the basic rights of all individuals in Germany. Finally, the German government also argues that the scope and time period of the storage make a difference to the compatibility of data retention laws with fundamental rights.

Digitalcourage calls for all existing illegal data retention laws in the EU to be declared invalid. There are no grounds for blanket and suspicion-less surveillance in a democracy under the rule of law. Whether it is content data or metadata that is being stored, data retention (the blanket and mass collection of telecommunications data) is inappropriate, unnecessary and ineffective, and therefore illegal. Where the German government argues that secret services need to use telecommunications data to protect state interests, Digitalcourage agrees with many human rights organisations that the activities of secret services can be a direct threat to the core trust between the general public and the state. The ECJ has itself called for storage to be reduced to the absolutely required minimum – and that, according to Digitalcourage, can only be fulfilled if no data is stored without individual suspicion.

Digitalcourage
https://digitalcourage.de/

Press release: EU data retention: Digitalcourage publishes and criticises the position of the German government (only in German, 10.02.2020)
https://digitalcourage.de/pressemitteilungen/2020/bundesregierung-eugh-eu-weite-vorratsdatenspeicherung

(Contribution by Sebastian Lisken, EDRi member Digitalcourage, Germany)

03 Feb 2020

ECtHR: Obligation on companies to identify all phone users is legal

By Diego Naranjo

On 30 January 2020, the European Court of Human Rights (ECtHR) issued its judgment in the Breyer v. Germany case. The case was brought by Patrick Breyer (currently a Member of the European Parliament, MEP) and Jonas Breyer (hereinafter “the applicants”), who complained about the obligation introduced by the Telecommunications Act in Germany to register all customers of pre-paid SIM cards. Similar obligations have been imposed in Romania and elsewhere. In total, 15 Council of Europe (CoE) Member States require subscriber registration of pre-paid SIM customers, versus 32 that do not have such laws. The applicants claimed a violation of Articles 8 and 10 of the European Convention on Human Rights – the right to privacy and freedom of expression, respectively.

Indiscriminate collection of personal data? This time it is ruled legal.

According to the Court, the applicants’ complaint was not sufficiently grounded with regard to freedom of expression, so the Court analysed the application solely as a potential violation of the right to private life. The Court, by six votes to one, declared that there was no violation of the right to private life. According to the majority of the Court, even though there was a clear interference with the right to private life, the interference was legitimate for reasons of public safety and the prevention of disorder or crime. It was also necessary in a democratic society because it “strongly simplifies and accelerates investigation by law-enforcement agencies” and can “contribute” to more “effective law enforcement”. Furthermore, the Court added that the data stored, and the interference deriving from it, was “while not trivial, of a rather limited nature”.

But is efficiency the right approach? In the recent Advocate General (AG) Opinions on four data retention cases before the Court of Justice of the European Union (CJEU), the AG points out that the argument of efficiency cannot lead to watering down democracy, and that the fight against crime (or terrorism, in that case) cannot be analysed just in terms of “efficiency”. Indeed, installing CCTV cameras in every room of every house in order to prevent violence against women might be very “efficient”, but efficiency cannot be the ultimate reason (much less a legal basis) to implement any measure we could imagine.

Dissenting Opinion: Sensitive data and lack of effective safeguards

Fortunately, not all judges agreed. The dissenting Opinion of judge Carlo Ranzoni raises relevant questions and arguments which could well lead to a referral of the case to the Grand Chamber. Ranzoni found a violation of Article 8 for various reasons. First, the measures in question are not confined to the fight against terrorism or other serious crimes (and even when investigating serious crimes, not all measures are justified). Ranzoni also argues that, even though the information stored was not sensitive in itself, the majority of the Court overlooked the possibility of the “identification of the parties to every telephone call or message exchange and the attribution of possibly sensitive information to an identifiable person”. In his opinion, this makes the interference comparable to those in cases such as Benedik v. Slovenia, which the ECtHR did not describe as being of a “rather limited nature” (para. 5 of the dissenting Opinion).

Ranzoni further suggests that the law in question allows for the storage of (and access to) data of all SIM card subscribers, without a link to the investigation of any serious crime, for a long period of time. This is a serious interference, not a light one. However, the “crux of the case” is, according to Ranzoni, the quality of the safeguards and how effectively they can prevent abuses. In his view, the supervising authorities have no real capacity to investigate possible abuses: as the Constitutional Court itself pointed out, “the retrieving authority does not have to give reasons for its request”, and therefore the Federal Network Agency (which retrieves phone users’ data from companies on behalf of requesting authorities) is not able to assess whether a request is admissible (para. 22). Effective review and supervision of retrieval requests by a judicial or otherwise independent authority are therefore nonexistent. Finally, according to Ranzoni, the vast majority of victims of the interference “are left without any possibility of review” since “it appears unrealistic [for Data Protection Authorities] to review some 35 million data sets consulted by a wide range of different authorities” (para. 25 of the dissenting Opinion).

What next?

The applicants can still apply for a referral of the case to the Grand Chamber, which could overturn this judgment. The dissenting Opinion brings strong arguments justifying such a referral. In the meantime, the pending cases Privacy International, C-623/17, and Ordre des barreaux francophones et germanophone et al., C-520/18, are also awaiting judgment. If the CJEU judgment follows the AG Opinion, the obligation on private companies to perform mass blanket retention of communications data would once again be considered illegal. If that happens, some of the arguments put forward by the majority of judges in the present ECtHR Breyer case (such as the “efficiency” for law enforcement argument) may help the applicants to overturn the majority’s reasoning in this case.

Judgment: Case Breyer v. Germany
http://hudoc.echr.coe.int/eng?i=001-200442

Data retention: “National security” is not a blank cheque (29.01.2020)
https://edri.org/data-retention-national-security-is-not-a-blank-cheque/

AG’s Opinion: Mass retention of data incompatible with EU law (29.01.2020)
https://edri.org/ag-opinion-mass-retention-of-data-incompatible-with-eu-law/

(Contribution by Diego Naranjo, EDRi)

29 Jan 2020

AG’s Opinion: Mass retention of data incompatible with EU law

By Privacy International

On 15 January, Advocate General (AG) Campos Sánchez-Bordona of the Court of Justice of the European Union (CJEU) issued his Opinions (C-623/17, C-511/18 and C-512/18, and C-520/18) on how he believes the Court should rule on vital questions relating to the conditions under which security and intelligence agencies in the UK, France and Belgium could have access to communications data retained by telecommunications providers.

The AG addressed two major questions:

  1. When states seek to impose obligations on electronic communications services in the name of national security, do such requirements fall within the scope of EU law?
  2. If the answer to the first question is yes, then what does EU law require of the national schemes at issue, which include a French data retention regime, a Belgian data retention regime, and a UK regime for the collection of bulk communications data?

The AG’s short answers to those questions are:

  1. Yes, EU law applies whenever states seek to impose processing requirements on electronic communications services, even if those obligations may be motivated by national security concerns; and
  2. Accordingly, the national regimes at issue must all comply with the CJEU’s previous judgments in Digital Rights Ireland and Others, Cases C-293/12 and C-594/12 (“Digital Rights Ireland”), and Tele2 Sverige and Watson and Others, Cases C-203/15 and C-698/15 (“Tele2/Watson”). None of them do, which leads the AG to advise that none of the regimes are compatible with EU law.

The AG’s Opinion is an affirmation of the basic principle at the heart of EDRi member Privacy International’s work: national security measures must be subject to the rule of law and respect our fundamental rights.

Privacy International initiated the challenge to the UK bulk communications data regime, and intervened in the challenge to the French data retention law.

Does EU law apply?

Central to all three Opinions is the question of whether EU law applies when Member States are acting to protect their national security. The AG concludes that the national security context does not disapply EU law. Instead, one must look to the effect of the proposed requirement – data retention or collection – on electronic communications services. Requiring these service providers to retain and/or transmit data to the security and intelligence agencies (SIAs) falls under EU law because such practices qualify as the “processing of personal data”.

Stating this principle in the negative, the AG says: “The provisions of the directive will not apply to *activities* which are intended to safeguard national security and are undertaken by the public authorities themselves, without requiring the cooperation of private individuals and, therefore, without imposing on them obligations in the management of business” (UK Case C-623/17, paragraph 34/79) (emphasis in original).

Is the UK Bulk Communications Data Regime compatible with EU law?

In the UK case, Privacy International challenged the bulk acquisition and use of communications data by Government Communications Headquarters (GCHQ) and the Security Service (MI5). That case began in the Investigatory Powers Tribunal (IPT), which referred to the CJEU the questions that the AG is addressing. The IPT asked the CJEU to decide, first, whether requiring an electronic communications network to turn over communications data in bulk to the SIAs falls within the scope of European Union law; and second, if the answer to the first question is yes, what safeguards should apply to that bulk access to data?

As noted above, the AG’s answer to the first question is yes, which brings the second question into play. In short, the AG declares that the UK bulk communications and data retention regime (as implemented under section 94 of the Telecommunications Act 1984) “does not satisfy the conditions established in the Tele2 Sverige and Watson judgment, because it involves general and indiscriminate retention of personal data” (UK Case C-623/17, paragraph 37).

The AG re-emphasises that access to retained data “must be subject to prior review by a court or an independent administrative authority” (UK Case C-623/17, paragraph 139). The value of this authority lies in its commitment “to both safeguarding national security and to defending citizens’ fundamental rights” (Id.).

The AG further endorses the application of the other conditions from the Tele2/Watson judgment, including:

  • the requirement to inform affected parties, unless this would compromise the effectiveness of the measure; and
  • the retention of the data within the European Union. (UK Case C-623/17, paragraph 43)

Is the French Data Retention Regime compatible with EU law?

The French case similarly asked whether general and indiscriminate data retention was permissible under EU law for the purposes of combating terrorism.

The AG concluded that the French regime amounts to generalised and indiscriminate data retention and as such is not compatible with EU law (French Cases C-511/18 and C-512/18, paragraph 111). The French legislation at issue imposes a one-year retention obligation on all electronic communications operators and others with regard to all data of all subscribers, for the purpose of the investigation, detection, and prosecution of criminal offences.

The AG reiterates the conclusion of the Tele2/Watson judgment that the fight against terrorism or similar threats to national security cannot justify generalised and indiscriminate retention of data. He suggests that data retention should be targeted and permissible only if certain criteria are satisfied, for example targeting a specific group of people or a particular geographical area (French Cases C-511/18 and C-512/18, paragraph 133). The Belgian Opinion elaborates on possible types of targeting criteria. On the question of access to retained data, he advises that access should depend on prior authorisation by a judicial or independent administrative authority following a reasoned request by the competent authorities.

The AG furthermore concluded that real-time collection of the traffic and location data of individuals suspected of being connected to a specific terrorist threat would be permissible under EU law, so long as it does not impose on the service providers an obligation to retain additional data beyond what is already required for billing or marketing services. Independent authorisation is also necessary for accessing this data (French Cases C-511/18 and C-512/18, paragraphs 142-3).

As in the UK Opinion above, the AG reaffirms the requirement, already established in the Tele2/Watson judgment, to inform affected parties unless this would compromise the effectiveness of the measure, and concludes that the French law is not compatible with the EU requirements (French Cases C-511/18 and C-512/18, paragraph 153).

Are the AG’s Opinions binding on the CJEU?

The AG’s Opinions are not binding on the CJEU. The Court will issue its judgment in the coming months.

What comes next?

Following the CJEU judgment, each case will be sent back to each state’s national courts. If the CJEU agrees with the Advocate General, then national courts will have to apply the CJEU judgment and accordingly find domestic regimes incompatible with EU law.

This article was originally published at: https://privacyinternational.org/news-analysis/3334/advocate-generals-opinion-national-security-mass-retention-regimes-are

Indiscriminate data retention considered disproportionate, once again (15.01.2020)
https://edri.org/indiscriminate-data-retention-considered-disproportionate-once-again/

(Contribution by Caroline Wilson Palow and Ilia Siatitsa, EDRi member Privacy International)

29 Jan 2020

Data retention: “National security” is not a blank cheque

By Laureline Lemoine

On 15 January, Advocate General (AG) Campos Sánchez-Bordona of the Court of Justice of the European Union (CJEU) delivered his opinions on four cases regarding data retention regimes in France, Belgium and the UK, in the context of these Members States’ surveillance programmes.

The AG endorsed the previous case law on data retention, confirming that a general and indiscriminate retention of personal data is disproportionate, even when such schemes are implemented for national security reasons.

An interesting take from his Opinions is how he challenged EU Member States who tend to consider national security as their get-out-of-jail-free card.

National security cannot be used as an escape route from EU law

One of the questions the AG had to answer concerned the applicability of the ePrivacy Directive, which Member States contested. They argued that EU law was not applicable, as the surveillance programmes were a matter of national security, in the context of terrorism threats, and therefore not within the EU’s jurisdiction.

Even though the matter had already been settled in the Tele2 case, the AG, faced with determined Member States, provided a clear and, hopefully, definitive analysis of the national security argument. In all three Opinions, the AG stresses that, in these cases, national security reasons cannot prevent the applicability of EU law. For the AG, the notion of “national security” is too vague to be invoked to oppose the application of safeguards regarding the protection of personal data and the confidentiality of citizens’ communications (C-511/18 and C-512/18, para. 74).

He therefore proceeded to define this notion in light of the ePrivacy Directive. The Directive does not apply when activities related to “national security” are undertaken by the public authorities themselves, by their own means and for their own account. But as soon as states impose obligations on private actors for these same purposes, the Directive applies (C-511/18 and C-512/18, para. 79 to 85).

In these cases, telecom operators are obliged, under the law, to retain the data of their subscribers and to allow public authorities access to it. It does not matter that these obligations are imposed for national security reasons.

…and neither can the fundamental right to security

To add another layer to the “security” argument, the French case mentioned the right to security under Article 6 of the Charter of Fundamental Rights of the European Union as a justification for the data retention scheme. This could be a valid argument but, as the AG points out, the right to security protected in the Charter is the right to personal security against arbitrary arrest or detention, and does not cover public security in the sense of terrorism threats and attacks (C-511/18 and C-512/18, para. 98, 99).

Terrorism as an excuse?

As part of the “national security” argument, France also argued that the general and indiscriminate retention of personal data was put in place to fight terrorism, in a context of serious and persistent threats to national security.

The AG, however, rightly points out that in the French legislation, terrorism is only one of the possible justifications for such a data retention regime. Terrorism threats are part of the factual context and serve as the pretext for imposing such a regime, while in reality the regime applies generally, for the purpose of fighting crime (C-511/18 and C-512/18, para. 119 & 120).

Moreover, the CJEU had already rejected, in the Tele2 case, the possibility of having a general and indiscriminate data retention regime for antiterrorism reasons. The AG underlines that this is not incompatible with the view of the Court that fighting terrorism is a legitimate and vital interest for the state. But the case law of the CJEU is clear that such an objective of general interest, as vital as it may be, cannot in itself justify the necessity of a general and indiscriminate retention regime.

In response to Member States arguing against anything less than general and indiscriminate retention for this purpose, the AG explains that the fight against terrorism cannot be contemplated only in terms of its efficiency. Because of its scale and the means devoted to it, it must operate within the rule of law and must respect fundamental rights. Relying only on efficiency would mean ignoring other democratic concerns and could potentially, in extreme cases, lead to harm to citizens (C-511/18 and C-512/18, para. 131).

The AG succeeds in debunking the Member States’ arguments, but stops short of preventing abuse.

The danger of “state of emergency” exceptions

Indeed, at the end of his analysis, the AG very briefly (C-511/18 and C-512/18, para. 104, and C-520/18, para. 105 & 106) explains that, regardless of what he argued, Member States could be allowed to impose an obligation to retain data as wide and general as need be. This could only be done in truly exceptional situations, where there is an imminent threat or an extraordinary risk justifying the enactment of a state of emergency in a Member State.

The only safeguard mentioned is the “limited period” for which these kinds of schemes could run. This is not enough, as we have seen how a “state of emergency” can be abused. In France, after the terrorist attacks of November 2015, l’état d’urgence (state of emergency) was enacted and went on for two years. It has been shown that this scheme was not only used for antiterrorism purposes, but also as a tool of social, security and political control, used to conduct surveillance and arrests of, for example, climate activists considered “extremists”.

More globally, this has been demonstrated by the various electronic surveillance programmes implemented by the USA after 9/11 in the name of the “war on terror”.

The AG’s opinions are not binding but usually influence the final judgments of the CJEU, which will be issued in the upcoming months. EDRi will be following the development of these cases.

Indiscriminate data retention considered disproportionate, once again (15.01.2020)
https://edri.org/indiscriminate-data-retention-considered-disproportionate-once-again/

Preliminary Statement: Advocate General’s Opinion Advises that Mass Surveillance Regime is Unlawful (15.01.2020)
https://privacyinternational.org/press-release/3332/preliminary-statement-advocate-generals-opinion-advises-mass-surveillance-regime

AG’s Opinion: Mass retention of data incompatible with EU law (29.01.2020)
https://edri.org/ag-opinion-mass-retention-of-data-incompatible-with-eu-law

CJEU Press Release: Advocate General Campos Sánchez-Bordona: the means and methods of combating terrorism must be compatible with the requirements of the rule of law (15.01.2020)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2020-01/cp200004en.pdf

(Contribution by Laureline Lemoine, EDRi)

15 Jan 2020

Indiscriminate data retention considered disproportionate, once again

By EDRi

EDRi’s initial reaction to the press release on the AG Opinions on data retention

Today’s Court of Justice of the European Union (CJEU) Advocate General’s Opinions continue the firmly established case-law of the CJEU, which considers the mass collection of individuals’ communications data incompatible with EU law. The Advocate General reaffirms that blanket retention of telecommunications data is disproportionate to its purported goals of protecting national security and combating crime and terrorism.

Today, on 15 January, the CJEU Advocate General Campos Sánchez-Bordona delivered his Opinions on four cases regarding data retention regimes in France, Belgium and the UK. These cases focus on the compatibility of these Member States’ surveillance programmes with the existing case law on data retention and the applicability of the ePrivacy Directive in those cases.

Once again, the Advocate General of the CJEU has firmly sided with defending the right to privacy, and declared that indiscriminate retention of all traffic and location data of all subscribers and registered users is disproportionate.

said Diego Naranjo, Head of Policy at EDRi.

The European Commission needs to take note of yet another strong message against illegal data retention laws. While combating crime and terrorism are legitimate goals, this should not come at the expense of fundamental rights. It’s crucial to ensure that the EU upholds the Charter of Fundamental Rights and prevents any new proposal for data retention legislation of a general and indiscriminate nature.

The Opinions respond to four references for a preliminary ruling, sent by the French Council of State (joined cases C-511/18 and C-512/18, La Quadrature du Net and Others), the Belgian Constitutional Court (Case C-520/18, Ordre des barreaux francophones et germanophone and Others) and the UK Investigatory Powers Tribunal (Case C-623/17, Privacy International). The Advocate General confirms that the ePrivacy Directive and EU law apply to data retention for the purpose of national security. He proposes to uphold the case-law of the Tele2 case and stresses that “a general and indiscriminate retention of all traffic and location data of all subscribers and registered users is disproportionate”, and that only limited and discriminate retention, with limited access to that data, is lawful. He states that “the obligation to retain data imposed by the French legislation is general and indiscriminate, and therefore is a particularly serious interference in the fundamental rights enshrined in the Charter”; similar criticism is raised of the Belgian and UK laws.

Following the invalidation of the Data Retention Directive in the Digital Rights Ireland case in 2014, Member States have been relying on the ePrivacy Directive to enact national data retention legislation. In 2016, the CJEU clarified this possibility and ruled in the Tele2 case that blanket data retention measures are incompatible with the Charter of Fundamental Rights of the European Union. Since then, as the Commission has been reluctant to intervene, civil society organisations have been challenging unlawful data retention legislation in different Member States.

Blanket retention of telecommunications data is a very invasive surveillance measure of the entire population. It can entail the collection of sensitive information about citizens’ social contacts, movements and private lives, without any suspicion. Telecommunications data retention also undermines professional confidentiality and the protection of journalistic sources, compromises the freedom of the press, and prevents confidential electronic communications. The retained data is also of high interest to criminal organisations and unauthorised state actors from all over the world – several successful data breaches have been documented. Overall, blanket data retention damages the preconditions of open and democratic societies.
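
As a concrete illustration of the point about social contacts, here is a minimal, hypothetical sketch (all records invented; no real data or operator format is implied) of how even a handful of retained call records, with no access to content, reconstructs who talks to whom and how often:

```python
# Minimal, hypothetical sketch: retained metadata alone reveals a social graph.
# Every record below is invented.
from collections import defaultdict

# (caller, callee, timestamp) tuples, as an operator might be obliged to retain
call_records = [
    ("alice", "bob", "2020-01-03 08:12"),
    ("alice", "helpline", "2020-01-03 23:41"),
    ("alice", "journalist", "2020-01-04 07:02"),
    ("bob", "alice", "2020-01-04 19:30"),
    ("alice", "helpline", "2020-01-05 23:55"),
]

# Build an undirected contact graph weighted by contact frequency
graph = defaultdict(lambda: defaultdict(int))
for caller, callee, _ in call_records:
    graph[caller][callee] += 1
    graph[callee][caller] += 1

for person, contacts in sorted(graph.items()):
    print(person, dict(contacts))
# Repeated late-night calls to a helpline, or contact with a journalist,
# are precisely the kind of sensitive inference that needs no content at all.
```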

EDRi member Privacy International has also issued a preliminary statement, which can be found here: https://privacyinternational.org/press-release/3332/preliminary-statement-advocate-generals-opinion-advises-mass-surveillance-regime

Note: This press release is a quick response based solely on the Court’s press release. A detailed analysis will follow in due time.
