27 Mar 2020

Open letter: Civil society urges Member States to respect the principles of the law in Terrorist Content Online Regulation

By EDRi

On 27 March 2020, European Digital Rights (EDRi) and 12 of its member organisations sent an open letter to representatives of Member States in the Council of the EU. In the letter, we voice our deep concern over the proposed legislation on the regulation of terrorist content online, and over the serious threats we believe it poses to fundamental rights, including privacy and freedom of expression.

You can read the letter here (pdf) and below.

Brussels, 27 March 2020

Dear representatives of Member States in the Council of the EU,

We hope that you are keeping well in this difficult time.

We are writing to you to voice our serious concerns with the proposed Regulation on preventing the dissemination of terrorist content online (COM/2018/640 final). We have raised these concerns before and many similar critiques have been expressed in letters opposing the Regulation from human rights officials, civil society groups, and human rights advocates.i

We firmly believe that any common position on this crucial file must respect fundamental rights and freedoms, the constitutional traditions of the Member States and existing Union law in this area. For this to happen, we urge you to ensure that the rule of law is respected in cross-border cases, that the competent authorities tasked with ordering the removal of illegal terrorist content are independent, that mandatory (re)upload filters are not adopted, and that the exceptions for certain protected forms of expression, such as educational, journalistic and research materials, are maintained in the proposal. We explain why in more detail below.

First, we ask you to respect the principles of territoriality and ensure access to justice in cases of cross-border takedowns by ensuring that only the Member State in which the hosting service provider has its legal establishment can issue removal orders. The Regulation should also allow removal orders to be contested in the Member State of establishment to ensure meaningful access to an effective remedy. As recent CJEU case law has established, “efficiency” or “national security” reasons cannot justify short-cuts around rule of law mechanisms and safeguards.ii

Secondly, the principle of due process demands that the legality of content be determined by a court or independent administrative authority. This important principle should be reflected in the definition of ‘competent authorities’. For instance, we note that in the Digital Rights Ireland case, the Court of Justice of the European Union considered that the Data Retention Directive was invalid, inter alia, because access to personal data by law enforcement authorities was not made dependent on a prior review carried out by a court or independent administrative authority.iii In our view, the removal of alleged terrorist content entails a very significant interference with freedom of expression and as such, calls for the application of the same safeguards.

Thirdly, the Regulation should not impose the use of upload or re-upload filters (automated content recognition technologies) on the services under its scope. As the coronavirus crisis makes abundantly clear, filters are far from accurate. In recent days alone, Twitter, Facebook and YouTube have moved to fully automated removal of content, leading to scores of legitimate articles about the coronavirus being taken down.iv The same will happen if filters are applied to alleged terrorist content. There is also mounting evidence that algorithms are biased and have a discriminatory impact, which is a particular concern for communities affected by terrorism, whose counter-speech has proven to be vital against radicalisation and terrorist propaganda. Furthermore, a provision imposing specific measures on platforms should favour a model that gives service providers room for manoeuvre on which actions to take to prevent the dissemination of illegal terrorist content, taking into account their capacities and resources, size and nature (whether not-for-profit, for-profit or community-led).

Finally, it is crucial that certain protected forms of expression, such as educational, artistic, journalistic and research materials, are exempted from the proposal, and that it includes feasible measures to ensure this exemption can be successfully implemented. The determination of whether content amounts to incitement to terrorism, or even glorification of terrorism, is highly context-specific. Research materials should be defined to include content that serves as evidence of human rights abuses. The jurisprudence of the European Court of Human Rights (ECtHR)v specifically requires particular caution towards such protected forms of speech and expression. It is vital that these principles are reflected in the Terrorist Content Regulation, including through the adoption of specific provisions protecting freedom of expression as outlined above.

We remain at your disposal for any support you may need from us in the future.

Sincerely,
Access Now – https://www.accessnow.org/
Bits of Freedom – https://www.bitsoffreedom.nl/
Centrum Cyfrowe – https://centrumcyfrowe.pl
Committee to Protect Journalists (CPJ) – https://cpj.org/
Daphne Keller – Director, Program on Platform Regulation, Stanford University
Digitale Gesellschaft – https://digitalegesellschaft.de/
Digitalcourage – https://digitalcourage.de/
D3 – Defesa dos Direitos Digitais – https://www.direitosdigitais.pt/
Državljan D – https://www.drzavljand.si/
EDRi – https://edri.org/
Electronic Frontier Foundation (EFF) – https://www.eff.org/
Epicenter.Works – https://epicenter.works
Free Knowledge Advocacy Group EU – https://wikimediafoundation.org/
Hermes Center – https://www.hermescenter.org/
Homo Digitalis – https://www.homodigitalis.gr/en/
IT-Political Association of Denmark – https://itpol.dk/
Panoptykon Foundation – https://en.panoptykon.org
Vrijschrift – https://www.vrijschrift.org
Wikimedia Spain – https://wikimedia.es

Footnotes

i.

ii.

iii.

  • See Digital Rights Ireland v. Minister for Communications, Marine and Natural Resources, Joined Cases C‑293/12 and C‑594/12, 8 April 2014, at para. 62.

iv.

v.

  • In cases involving the dissemination of “incitement to violence” or terrorism by the press, the ECtHR’s starting point is that it is “incumbent [upon the press] to impart information and ideas on political issues just as on those in other areas of public interest. Not only does the press have the task of imparting such information and ideas: the public also has a right to receive them.” See Lingens v. Austria, App. No. 9815/82, 8 July 1986, para. 41.
  • The ECtHR has also repeatedly held that the public enjoys the right to be informed of different perspectives, e.g. on the situation in South East Turkey, however unpalatable they might be to the authorities. See also Özgür Gündem v. Turkey, App. No. 23144/93, 16 March 2000, paras. 60 and 63, and the Council of Europe handbook on protecting the right to freedom of expression under the European Convention on Human Rights, summarising the Court’s case law on the positive obligations of States with regard to the protection of journalists (pp. 90-93), available at: https://rm.coe.int/handbook-freedom-of-expression-eng/1680732814
11 Mar 2020

Germany: Invading refugees’ phones – security or population control?

By Gesellschaft für Freiheitsrechte

In its new study, EDRi member Society for Civil Rights (GFF) examines how German authorities search refugees’ phones. The stated aim of “data carrier evaluation” is to determine a person’s identity and their country of origin. In reality, however, it violates refugees’ rights and does not produce any meaningful results.

If an asylum seeker in Germany cannot present either a passport or documents replacing it, the Federal Office for Migration and Refugees (BAMF) is authorised to carry out a “data carrier evaluation” – to extract and analyse data from the asylum seeker’s phone and other devices in order to check their owner’s stated origin and identity. The data analysed includes the country codes of their contacts, incoming and outgoing calls and messages, browser history, geodata from photos, as well as email addresses and usernames used in applications such as Facebook, booking.com or dating apps. Notably, BAMF carries out this data analysis regardless of any concrete suspicion that the asylum seeker made untruthful statements regarding their identity or country of origin.
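
To illustrate one of the techniques involved: location data can be read straight from the EXIF metadata that phones embed in photos. Below is a minimal Python sketch using the Pillow library. It shows the general idea only – BAMF’s actual tooling is not public, and the file name is a placeholder.

    # Minimal sketch: reading GPS coordinates from a photo's EXIF metadata.
    # Illustrates the general technique only -- not BAMF's actual tooling.
    from PIL import Image
    from PIL.ExifTags import TAGS, GPSTAGS

    def gps_from_photo(path):
        exif = Image.open(path)._getexif() or {}
        named = {TAGS.get(k, k): v for k, v in exif.items()}  # tag IDs -> names
        gps = {GPSTAGS.get(k, k): v for k, v in named.get("GPSInfo", {}).items()}
        if not gps:
            return None

        def to_degrees(dms, ref):
            # EXIF stores coordinates as (degrees, minutes, seconds) rationals
            deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
            return -deg if ref in ("S", "W") else deg

        return (to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
                to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]))

    print(gps_from_photo("photo.jpg"))  # e.g. (52.5163, 13.3777)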

The study “Invading Refugees’ Phones: Digital Forms of Migration Control” examines and assesses how BAMF evaluates refugees’ data and what kinds of results data carrier evaluation has produced so far. For the study, journalist Anna Biselli and GFF lawyer Lea Beckmann comprehensively researched and evaluated numerous sources. These include data carrier evaluation reports, asylum files, internal BAMF regulations, such as a user manual for reading mobile data carriers, and training documents for BAMF employees, as well as information that was made public by parliamentary inquiries.

High costs, useless results

The study found that evaluating data carriers is not an effective way of establishing a person’s identity and country of origin. Since data carrier evaluations started in 2017, BAMF has examined about 20,000 mobile phones of asylum seekers. When invading refugees’ privacy via data carrier evaluation produces results at all, it usually only confirms what the persons themselves stated during their interviews with BAMF employees.

In 2018, only 2% of the successful data carrier evaluations revealed contradictions to the asylum seekers’ statements. Graphic: GFF/Julia Zé

There were already doubts about the effectiveness of data carrier evaluation before the law on Better Enforcement of the Obligation to leave the Country was passed. The law aims to speed up deportations. By introducing data carrier evaluations, legislators hoped to verify a person’s identity, country of origin and grounds for protection more quickly than before. In practice, the procedure has fallen short of these expectations. It has also turned out to be very expensive. 

In relation to the limited benefit of data carrier evaluations, the costs of the procedure are clearly disproportionate. In February 2017, the Federal Ministry of the Interior stated that installation costs of 3.2 million euros were to be expected. By the end of 2018, however, 7.6 million euros had already been spent on the system, more than twice as much as originally estimated.
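
A rough back-of-envelope calculation from the figures above underlines the point. Note that the 20,000 phones and the 7.6 million euros cover slightly different periods, so this is an order of magnitude at best:

    # Back-of-envelope using the figures in the article; order of magnitude only.
    phones_examined = 20_000      # phones examined since 2017
    spent_eur = 7_600_000         # spent on the system by end of 2018
    contradiction_rate = 0.02     # successful evaluations contradicting statements (2018)

    cost_per_phone = spent_eur / phones_examined
    print(f"~{cost_per_phone:.0f} EUR per phone examined")  # ~380 EUR

    # Even if every examination had been successful, contradictions are rare:
    print(f"~{cost_per_phone / contradiction_rate:.0f} EUR per contradiction found")  # ~19,000 EUR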

Total costs of the BAMF for reading and evaluating data carriers: from just under 7 million euros in 2017 to an expected 17 million euros in 2022. Graph: GFF/Julia Zé

A blatant violation of fundamental rights

Examining refugees’ phones can be seen as a human rights violation. Despite that, Germany has spent millions of euros on introducing and developing this practice. Data carrier evaluations undermine the basic right to informational self-determination, which has been established by the German Federal Constitutional Court. Refugees are subjected to second-class data protection. At the same time, they are especially vulnerable and lack meaningful access to legal remedies.

Germany is not the only country to experiment with digital forms of migration control. BAMF’s approach is part of a broader, international trend towards testing new surveillance and monitoring technologies on marginalised populations, including refugees. Individual people, as well as their individual histories, are increasingly being reduced to data records. GFF will combat this trend with legal means: We are currently preparing legal action against the BAMF’s data carrier evaluation. 

We thank the Digital Freedom Fund for their support.

Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)
https://freiheitsrechte.org/english/

Invading Refugees’ Phones: Digital Forms of Migration Control
https://freiheitsrechte.org/home/wp-content/uploads/2020/02/Study_Invading-Refugees-Phones_Digital-Forms-of-Migration-Control.pdf

The human rights impacts of migration control technologies (12.02.2020)
https://edri.org/the-human-rights-impacts-of-migration-control-technologies/

Immigration, iris-scanning and iBorderCTRL (26.02.2020)
https://edri.org/immigration-iris-scanning-and-iborderctrl/

(Contribution by EDRi member Gesellschaft für Freiheitsrechte – GFF, Germany)

26 Feb 2020

Romania: Mandatory SIM registration declared unconstitutional, again

By ApTI

On 18 February 2020, the Romanian Constitutional Court unanimously declared unconstitutional a new legislative act, adopted in September 2019, introducing mandatory SIM card registration. The act in question was an emergency ordinance issued by the Government, which sought to introduce this obligation as a measure “to improve the operation of the 112 emergency service number”. This is the second time the court has issued an unconstitutionality decision on a mandatory SIM card registration proposal.

The court struck down the ordinance on procedural grounds, as the Government failed to demonstrate the urgency of adopting it. In its press release, the court also highlighted that the entry into force of the SIM card registration provision had already been postponed twice, so the claimed urgency justifying an emergency ordinance was non-existent.

Although this is the sixth attempt to introduce legislation on mandatory SIM card registration in Romania, the battle is far from over as, this time, the court did not go into a substantive analysis.

Coincidentally or not, two days later, a false bomb threat (the first in years) was reported in the media. A man called the 112 emergency number claiming that he had placed a bomb inside a shopping mall. The call was made from a prepaid SIM card, a fact that the 112 service specifically highlighted in its press release, while emphasising the number of staff involved in responding to the call. The subtext was that the implications of such calls are massive and that, when they prove false, nobody can be held responsible, because prepaid SIM card calls cannot be traced back to an individual.

The Constitutional Court’s decision is a welcome victory for now, but given the track record of a new legislative proposal on this topic appearing every year or two, we can expect similar proposals in the future.

Background

After a tragic failure by the police to save a teenage girl who was abducted but managed to call the 112 emergency number three times before she was murdered, the Romanian Government adopted an Emergency Ordinance which introduced the obligation to register prepaid SIM cards.

EDRi member ApTI, together with the Association for the Defence of Human Rights in Romania – the Helsinki Committee (APADOR-CH), asked the Romanian Ombudsman to send the law to the Constitutional Court. The Ombudsman challenged the law in court, and ApTI sent an amicus curiae brief in support of the unconstitutionality claims, showing that:

  1. there was no reason to justify the adoption of an emergency ordinance instead of going through the normal parliamentary procedure;
  2. restricting individual freedoms by requiring the registration of prepaid SIM cards is a measure disproportionate to the intended goal – limiting fake calls to the 112 emergency number;
  3. no data protection impact assessment has been carried out and the national data protection authority did not support the law;
  4. the Constitutional Court had already decided in 2014 that mandatory SIM card registration limits fundamental rights and freedoms, and that such measures can only be introduced if they are necessary and proportionate.

The 6th attempt to introduce mandatory SIM card registration in Romania (23.10.2019)
https://edri.org/the-sixth-attempt-to-introduce-mandatory-sim-registration-in-romania/

Constitutional Court press release (only in Romanian, 18.02.2020)
http://www.ccr.ro/download/comunicate_de_presa/Comunicat-de-presa-18-februarie-2020.pdf

Timeline of legislative initiatives to introduce mandatory SIM card registration (only in Romanian)
https://apti.ro/Ini%C5%A3iativ%C4%83-legislativ%C4%83-privind-%C3%AEnregistrarea-utilizatorilor-serviciilor-de-comunica%C5%A3ii-electronice-tip-Prepay

Fake bomb threat at a shopping mall in Romania (only in Romanian, 20.02.2020)
https://www.hotnews.ro/stiri-esential-23674657-sts-cifrele-alarmei-false-bomba-veranda-mall-8-secunde-apel-5-institutii-alerta-100-specialisti-apelul-fost-facut-cartela-prepay.htm

Constitutional Court decision nr. 461/2014 (only in Romanian)
https://privacy.apti.ro/decizie-curtea-constitutionala-prepay-461-2014/

(Contribution by Valentina Pavel, EDRi member ApTI, Romania)

19 Feb 2020

The impact of competition law on your digital rights

By Laureline Lemoine

This is the first article in a series dealing with competition law and Big Tech. The aim of the series is to look at what competition law has achieved when it comes to protecting our digital rights, where it has failed to deliver on its promises, and how to remedy this.

This series will first look at how competition and privacy law interact, and then focus on how they can support each other in tackling data exploitation and other issues related to Big Tech companies. With a potential reform of competition rules in mind, this series is also a reflection on how competition law could offer a mechanism to regulate Big Tech companies and limit their increasing power over our democracies.

Our personal data is seen by Big Tech companies as a commodity with economic value, and they cannot get enough of it. They track us online and harvest our personal data, including sensitive health data. Data protection and online privacy legislation aims to protect individuals against invasive data exploitation. Even though well-enforced privacy and data protection legislation is a must-have in our connected societies, there are other avenues that could be explored simultaneously. Because of the power imbalance between individuals and companies, as well as other issues affecting our fundamental rights, there is a need for a more structural approach, involving other policies and legislation. Competition law is often referred to as one of the tools that could redress this power imbalance, because it controls and regulates market power, including in the digital economy.

During her keynote speech at the International Association of Privacy Professionals (IAPP) conference in November 2019, Margrethe Vestager, European Commissioner for Competition and Executive Vice-President for A Europe Fit for the Digital Age, argued that, “[…] to tackle the challenges of a data-driven economy, we need both competition and privacy regulation, and we need strong enforcement in both. Neither of these two things can take the place of one another, but in the end, we’re dealing with the same digital world. Privacy and competition are both fundamentally there for the same reason: to protect our rights as consumers”.

Privacy and competition law are different policies

Competition and privacy law (which includes data protection and online privacy legislation) are governed by different legal texts and overseen by different authorities with distinct mandates.

According to Wojciech Wiewiórowski, the European Data Protection Supervisor (EDPS), “the main purpose of these two kinds of oversight is […] very different, because what the competition authorities want to achieve is the well-working fair market, what we want to achieve is to defend the fundamental rights [to privacy and data protection]”.

This means that, in assessing competition infringements, competition authorities do not go beyond competition issues. They have to assume that companies are or will be in compliance with their other legal obligations, including their privacy obligations.

The Court of Justice of the European Union confirmed this difference of mandates in 2006. Later, in the 2014 Facebook/WhatsApp merger case, the European Commission concluded that privacy-related concerns “do not fall within the scope of the EU competition law rules but within the scope of the EU data protection rules”. Facebook was later fined for providing “misleading” information to the competition authority.

Since then, Europe has seen the development of a data-driven economy and its fair share of privacy scandals and data breaches, too. And despite numerous investigations into problematic behaviours, Big Tech companies keep on growing.

But this goes far beyond competition issues, as the dominant position of Big Tech companies also gives them the power and the incentive to limit our freedoms, and to infringe on our fundamental rights. Their dominance is even a threat to our democracies.

As a way to tackle these issues, more people are calling for the alignment of enforcement initiatives of data protection authorities as well as competition and consumer authorities. This has led to debates about the silos between competition and data protection law, their differences but also their common objectives.

Data protection and competition against Big Tech powers

Both competition and data protection law impact economic activities and, at EU level, both are used to ensure the further deepening of the EU single market. The General Data Protection Regulation (GDPR), as well as ensuring a high level of protection of personal data, aims to harmonise the Member States’ legislation to remove obstacles to a common European market. Similarly, competition law prevents companies from erecting barriers to trade between competitors.

Moreover, data protection can be considered an element of competition in cases where companies compete on who can better satisfy privacy preferences. There is, in this case, a common objective of allowing the individual to have control (as a consumer or as a data subject).

In her keynote speech, Vestager explained: “competition and competition policy have an important role to play … because the idea of competition is to put consumers in control. For markets to serve consumers and not the other way around,” she said, “it means if you don’t like the deal we’re getting, we can walk away and find something that meets our needs in a better way. And consumers can also use that power to demand something we really … care about, including maybe our privacy.”

Indeed, giving consumers a genuine choice of privacy-friendly companies would help uphold privacy standards. Although it is hard to believe now, Facebook once prioritised privacy as a way to distinguish itself from MySpace, its biggest competitor at the time.

However, the issue in the world of Big Tech today is that privacy offers no leverage. The dominant positions of the few players controlling the market leave no room for others proposing privacy-friendly products. As a result, there is no choice but to use the services of Big Tech to stay connected online – the consumer is no longer in control.

One way to remedy this power imbalance between individuals and these giant companies could be through a greater cooperation between regulatory authorities. BEUC, the European Consumer Organisation, has called, regarding Facebook’s exploitation of consumers, for a “coherent enforcement approach for the data economy between regulators and across Member States” and wants the “European Commission to explore – with relevant authorities – how to deal with a concrete commercial behaviour that simultaneously breaches different areas of EU law”.

In 2016, the EDPS launched the Digital Clearinghouse, a voluntary network of regulators involved in the enforcement of legal regimes in digital markets, with a focus on data protection, and consumer and competition law. National competition authorities are also looking into competition and data, and in 2019 the European Commission published a report on Competition Policy for the Digital Era, to which EDRi member Privacy International contributed.

Greater cooperation between regulators, inclusion of data protection principles in competition law, and many other ideas are being discussed to redress this issue of power imbalance. Some of them will be explored in the next articles of this series.

Regarding antitrust law, we will look at discussions of new sets of rules designed especially for the Big Tech market, as well as the development of the right to portability and interoperability. As for merger control, we will focus on the extent to which privacy could be considered a theory of harm.

Opinion 8/2016 – EDPS Opinion on coherent enforcement of fundamental rights in the age of big data (2016)
https://edps.europa.eu/sites/edp/files/publication/16-09-23_bigdata_opinion_en.pdf

Competition and data
https://privacyinternational.org/learning-topics/competition-and-data

Factsheet – Competition in the digital era (2020)
https://www.beuc.eu/publications/beuc-x-2020-007_competition_in_digital_era.pdf

Report of the European Commission – Competition Policy for the digital era (2019)
https://ec.europa.eu/competition/publications/reports/kd0419345enn.pdf

Family ties: the intersection between data protection and competition in EU Law (2017)
http://eprints.lse.ac.uk/68470/7/Lynskey_Family%20ties%20the%20intersection%20between_Author_2016_LSERO.pdf

12 Feb 2020

Cloud extraction: A deep dive on secret mass data collection tech

By Privacy International

Mobile phones remain the most frequently used and most important digital source for law enforcement investigations. Yet it is not just what is physically stored on the phone that law enforcement is after, but what can be accessed from it, primarily data stored in the “cloud”. This is why law enforcement is turning to “cloud extraction”: the forensic analysis of user data which is stored on third-party servers, typically used by device and application manufacturers to back up data. As we spend more time using social media and messaging apps and store files with the likes of Dropbox and Google Drive, and as our phones become more secure, locked devices harder to crack and file-based encryption more widespread, cloud extraction is, as a prominent industry player says, “arguably the future of mobile forensics”.

The report “Cloud extraction technology: the secret tech that lets government agencies collect masses of data from your apps” brings together the results of Privacy International’s open source research, technical analyses and freedom of information requests to expose and address this emerging and urgent threat to people’s rights. 

Phone and cloud extraction go hand in hand

EDRi member Privacy International has repeatedly raised concerns over risks of mobile phone extraction from a forensics perspective and highlighted the absence of effective privacy and security safeguards. Cloud extraction goes a step further, promising access to not just what is contained within the phone, but also to what is accessible from it. Cloud extraction technologies are deployed with little transparency and in the context of very limited public understanding. The seeming “wild west” approach to highly sensitive data carries the risk of abuse, misuse and miscarriage of justice. It is a further disincentive to victims of serious offences to hand over their phones, particularly if we lack even basic information from law enforcement about what they are doing. 

The analysis of data extracted from mobile phones and other devices using cloud extraction technologies increasingly includes the use of facial recognition capabilities. If we consider the volume of personal data that can be obtained from cloud-based sources such as Instagram, Google photos, iCloud, which contain facial images, the ability to use facial recognition on masses of data is a big deal. Because of this, greater urgency is needed to address the risks that arise from such extraction, especially as we consider the addition of facial and emotion recognition to software which analyses the extracted data.  The fact that it is potentially being used on vast troves of cloud-stored data without any transparency and accountability is a serious concern.

What you can do

There is an absence of information regarding the use of cloud extraction technologies, making it unclear how their use is lawful and how individuals are safeguarded from abuse and misuse of their data. This is part of a dangerous trend by law enforcement agencies, and we want to ensure transparency and accountability globally with respect to the new forms of technology they use.

If you live in the UK, you can submit a Freedom of Information Act request to your local police to ask them about their use of cloud extraction technologies using this template: https://privacyinternational.org/action/3324/ask-your-local-uk-police-force-about-cloud-extraction. You can also use it to send a request if you are based in another country which has Freedom of Information legislation.

Privacy International
https://privacyinternational.org/

Cloud extraction technology: the secret tech that lets government agencies collect masses of data from your apps (07.01.2020)
https://privacyinternational.org/long-read/3300/cloud-extraction-technology-secret-tech-lets-government-agencies-collect-masses-data

Phone Data Extraction
https://privacyinternational.org/campaigns/phone-data-extraction

Push This Button For Evidence: Digital Forensics
https://privacyinternational.org/explainer/3022/push-button-evidence-digital-forensics

Can the police limit what they extract from your phone? (14.11.2019)
https://privacyinternational.org/news-analysis/3281/can-police-limit-what-they-extract-your-phone

Facial recognition and fundamental rights 101 (04.12.2019)
https://edri.org/facial-recognition-and-fundamental-rights-101/

Ask your local UK police force about cloud extraction
https://privacyinternational.org/action/3324/ask-your-local-uk-police-force-about-cloud-extraction

(Contribution by Antonella Napolitano, EDRi member Privacy International)

12 Feb 2020

Digitalcourage fights back against data retention in Germany

By Digitalcourage

On 10 February 2020, EDRi member Digitalcourage published the German government’s plea in the data retention case at the European Court of Justice (ECJ). Dated 9 September 2019, the government’s document addresses the use of retained telecommunications data by secret services, the question of whether the 2002 ePrivacy Directive might apply to various forms of data retention, and the exceptions from human rights protections that apply to secret service operations. It also justifies the government’s plans to use data retention to solve a broad range of crimes, citing the abduction of a Vietnamese man in Berlin by Vietnamese agents. However, that case is very specific, and even if the retained data was “useful” there, usefulness is not a valid legal basis for mass data retention and cannot justify drastic intrusions into the basic rights of all individuals in Germany. Finally, the German government argues that the scope and time period of the storage make a difference to the compatibility of data retention laws with fundamental rights.

Digitalcourage calls for all existing illegal data retention laws in the EU to be declared invalid. There are no grounds for blanket and suspicionless surveillance in a democracy under the rule of law. Whether it is content data or metadata that is being stored, data retention (the blanket and mass collection of telecommunications data) is inappropriate, unnecessary and ineffective, and therefore illegal. Where the German government argues that secret services need to use telecommunications data to protect state interests, Digitalcourage agrees with many human rights organisations that the activities of secret services can be a direct threat to the core trust between the general public and the state. The ECJ has itself called for storage to be reduced to the absolutely required minimum – and that, according to Digitalcourage, can only be fulfilled if no data is stored without individual suspicion.

Digitalcourage
https://digitalcourage.de/

Press release: EU data retention: Digitalcourage publishes and criticises the position of the German government (only in German, 10.02.2020)
https://digitalcourage.de/pressemitteilungen/2020/bundesregierung-eugh-eu-weite-vorratsdatenspeicherung

(Contribution by Sebastian Lisken, EDRi member Digitalcourage, Germany)

12 Feb 2020

PI and Liberty submit a new legal challenge against MI5

By Privacy International

On 1 February 2020, EDRi member Privacy International (PI) and civil rights group Liberty filed a complaint with the Investigatory Powers Tribunal, the judicial body that oversees the intelligence agencies in the United Kingdom, against the security service MI5 in relation to how they handle vast troves of personal data.

In mid-2019, MI5 admitted, during a case brought by Liberty, that personal data was being held in “ungoverned spaces”. Much about these ungoverned spaces, and how they would effectively be “governed” in the future, remained unclear. At the moment, they are understood to be a “technical environment” where the personal data of an unknown number of individuals is being “handled”. The term “technical environment” suggests something more than simply a compilation of a few datasets or databases.

The longstanding and serious failings of MI5 and other intelligence agencies in relation to these “ungoverned spaces” first emerged in PI’s pre-existing case, which started in November 2015. The case challenges the processing of bulk personal datasets and bulk communications data by the UK Security and Intelligence Agencies.

In the course of these proceedings, it was revealed that PI’s data were illegally held by MI5, among other intelligence and security agencies. MI5 deleted PI’s data while the investigation was ongoing. With the new complaint PI also requested the reopening of this case in relation to MI5’s actions.

In parallel proceedings brought by Liberty against the bulk surveillance powers contained in the Investigatory Powers Act 2016 (IPA), MI5 admitted that personal data was being held in “ungoverned spaces”, demonstrating a known and continued failure to comply with both statutory and non-statutory safeguards in relation to the handling of bulk data since at least 2014. Importantly, documents disclosed in that litigation and detailed in the new joint complaint showed that MI5 had sought and obtained bulk interception warrants on the basis of misleading statements made to the relevant authorities.

The documents reveal that MI5 not only broke the law, but for years misled the Investigatory Powers Commissioner’s Office (IPCO), the body responsible for overseeing UK surveillance practices.

In this new complaint, PI and Liberty argue that MI5’s data handling arrangements result in the systematic violation of the rights to privacy and freedom of expression (as protected under Articles 8 and 10 of the European Convention on Human Rights) and under EU law. Furthermore, they maintain that the decisions to issue warrants requested by MI5, in circumstances where the necessary safeguards were lacking, are unlawful and void.

Privacy International
https://privacyinternational.org/

MI5 ungoverned spaces challenge
https://privacyinternational.org/legal-action/mi5-ungoverned-spaces-challenge

Bulk Personal Datasets & Bulk Communications Data challenge
https://privacyinternational.org/legal-action/bulk-personal-datasets-bulk-communications-data-challenge

The Investigatory Powers Tribunal case no. IPT/15/110/CH
https://privacyinternational.org/sites/default/files/2019-08/IPT-Determination%20-%2026September2018.pdf

Reject Mass Surveillance
https://www.libertyhumanrights.org.uk/our-campaigns/reject-mass-surveillance

MI5 law breaking triggers Liberty and Privacy International legal action (03.02.2020)
https://www.libertyhumanrights.org.uk/news/press-releases-and-statements/mi5-law-breaking-triggers-liberty-and-privacy-international-legal

(Contribution by EDRi member Privacy International)

03 Feb 2020

Support our work by investing in a piece of e-clothing!

By EDRi

Your privacy is increasingly under threat. European Digital Rights works hard to have you covered. But there’s only so much we can do.

Help us help you. Help us get you covered.


Check out our 2020 collection!*

*The items listed below are e-clothes. That means they are electronic. Not tangible. But still very real – like many other things online.

Your winter stock(ings) – 5€
A pair of hot winter stockings can really help one get through cold and lonely winter days. Help us fight for your digital rights by investing in a pair of these superb privacy-preserving fishnet stockings. This delight is also a lovely gift for someone special.


A hat you can leave on – 10€
Keep your head undercover with this marvelous piece of surveillance resistance. Adaptable to any temperature types and – for the record – to several CCTV models, the item really lives up to its value. This hat is an indispensable accessory when visiting your favourite public space packed with facial recognition technologies.


Winter/Summer Cape – 25€
Are you feeling heroic yet? Our flamboyant Winter/Summer cape is designed to keep you warm and cool. This stylish accessory takes the weight off your shoulders – purchase it and let us take care of fighting for your digital rights!


Just another White T-Shirt – 50€
A white t-shirt can do wonders when you’re trying to blend in with a white wall. This wildly unexciting but versatile classic is one of the uncontested fundamental pillars of your privacy enhancing e-wardrobe.


THE privacy pants ⭐️ – 100€
This ultimate piece of resistance is engineered to keep your bottom warm in the coldest winter, but also aired up during the hottest summer days. Its colour guarantees the ultimate tree (of knowledge) look. The item comes with a smart zipper.


Anti-tracksuit ⭐️ – 250€
Keep your digital life healthy with the anti-tracking tracksuit. The fabric is engineered to bounce out any attempt to get your privacy off track. Plus, you can impress your beloved babushka too.


Little black dress ⭐️ – 500€
Whether at a work cocktail party, a funeral, a shopping spree or a Christmas party – this dress will turn you into the centre of attention, in a (strangely) privacy-respecting manner.


Sew your own ⭐️ – xxx€
Unsure of any of the items above? Let your inner tailor free, customise your very own unique, designer garment, and put a price tag of your choice on it.



⭐️ The items of value superior to 100€ are delivered with an (actual, analog, non-symbolic) EDRi iron-on privacy patch that you can attach on your existing (actual, analog, non-symbolic) piece of clothing or accessory. If you wish to receive this additional style and privacy enhancer, don’t forget to provide us with your postal address (either via the donation form, or in your bank transfer message)!


Questions? Remarks? Ideas? Please contact us at brussels [at] edri [dot] org!

29 Jan 2020

CJEU to decide on processing of passenger data under PNR Directive

By Gesellschaft für Freiheitsrechte

On 20 January 2020, the District Court of Cologne, Germany, referred to the Court of Justice of the European Union (CJEU) the question of whether the European Passenger Name Record (PNR) Directive violates fundamental rights. EDRi member Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights) initiated the proceedings against the Directive, which allows authorities to analyse and store the personal data of all people who take an international flight in Europe.

GFF considers the PNR Directive to violate the right to the protection of personal data and the right to respect for private and family life. The PNR Directive (Directive 2016/681) requires airlines to automatically transfer their passengers’ data records to state authorities. These data records contain a large amount of sensitive information, including the date of birth, the names of accompanying persons, the means of payment used to purchase the flight ticket and an unspecified text field which the airline fills in independently.

The data is usually stored with police authorities. In Germany, the Federal Criminal Police Office intends to automatically compare the data records with pre-determined “criteria” in the future, for example criteria that describe the flight behaviour of known criminals. As a result, any person whose profile happens to appear suspicious will have to expect increased police checks or even arrest. This is because the error rates of the algorithms will be considerable.
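
The underlying problem is base-rate arithmetic. The short sketch below uses invented but plausible numbers – they are assumptions, not figures from the Directive or any police authority – to show why even a seemingly accurate matcher mostly flags innocent travellers:

    # Illustrative base-rate arithmetic. ALL numbers are assumptions,
    # not figures from the PNR Directive or any police authority.
    passengers = 170_000_000     # assumed passengers screened per year
    true_suspects = 1_000        # assumed genuine targets in that pool
    false_positive_rate = 0.001  # assumed: 0.1% of innocents flagged
    true_positive_rate = 0.99    # assumed: 99% of targets detected

    false_alarms = (passengers - true_suspects) * false_positive_rate
    real_hits = true_suspects * true_positive_rate
    print(f"False alarms per year: {false_alarms:,.0f}")  # ~170,000
    print(f"Share of flagged people who are innocent: "
          f"{false_alarms / (false_alarms + real_hits):.1%}")  # ~99.4%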

Strategic litigation aimed at the highest European court

In 2019, GFF, together with EDRi member epicenter.works, took legal action against the PNR Directive before German and Austrian courts and authorities. Since it is not possible to challenge the Directive directly before the CJEU, the lawsuits were chosen with a strategic view to having the question referred to the highest European court.

In Germany, GFF supports several individuals filing complaints against the airline Deutsche Lufthansa AG transferring their data to the German Federal Criminal Police Office. The plaintiffs bringing charges before the Cologne District Court include Kathalijne Buitenweg, a member of the Dutch parliament, as well as the German net activist Kübra Gümüşay, and the lawyer Franziska Nedelmann.

As expected, the Cologne District Court has now referred the case to the CJEU due to its evident implications for EU law. In our view, the PNR Directive is incompatible with the European Charter of Fundamental Rights, and the CJEU has already stopped a similar PNR agreement between the EU and Canada with its Opinion 1/15 of 26 July 2017. With the matter referred to the CJEU, the mass processing of passenger data in the EU might come to an end.

The basic funding for the project is provided by the Digital Freedom Fund.

Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)
https://freiheitsrechte.org/english/

PNR campaign site: No PNR
https://www.nopnr.eu

Passenger surveillance brought before courts in Germany and Austria (22.05.2019)
https://edri.org/passenger-surveillance-brought-before-courts-in-germany-and-austria/

EU Directive 2016/681 (PNR Directive)
https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016L0681&from=DE

(Contribution by EDRi member Gesellschaft für Freiheitsrechte, Germany)

15 Jan 2020

Your face rings a bell: Three common uses of facial recognition

By Ella Jakubowska

Not all applications of facial recognition are created equal. As we explored in the first and second instalments of this series, different uses of facial recognition pose distinct but equally complex challenges. Here we sift through the hype to analyse three increasingly common uses of facial recognition: tagging pictures on Facebook, automated border control gates, and police surveillance.

The chances are that your face has been captured by a facial recognition system, if not today, then at least in the last month. It is worryingly easy to stroll through automated passport gates at an airport, preoccupied with the thought of seeing your loved ones rather than with potential threats to your privacy. And you can quite happily walk through a public space or shop without being aware that you are being watched, let alone that your facial expressions might be used to label you a criminal. Social media platforms increasingly employ facial recognition, and governments around the world have rolled it out in public. What does this mean for our human rights? And is it too late to do something about it?

First: What the f…ace? – Asking the right questions about facial recognition!

As the use of facial recognition skyrockets, it can feel that there are more questions than answers. This does not have to be a bad thing: asking the right questions can empower you to challenge the uses that will infringe on your rights before further damage is done.

A good starting point is to look at impacts on fundamental rights such as privacy, data protection, non-discrimination and freedoms, and compliance with international standards of necessity, remedy and proportionality. Do you trust the owners of facial recognition systems (or indeed other types of biometric recognition and surveillance) whether public or private, to keep your data safe and to use it only for specific, legitimate and justifiable purposes? Do they provide sufficient evidence of effectiveness, beyond just the vague notion of “public security”?

Going further, it is important to ask societal questions like: does being constantly watched and analysed make you feel safer, or just creeped out? Will biometric surveillance substantially improve your life and your society, or are there less invasive ways to achieve the same goals?

Looking at biometric surveillance in the wild

As explored in the second instalment of this series, many public face surveillance systems have been shown to violate rights and been deemed illegal by data protection authorities. Even consent-based, optional applications may not be as unproblematic as they first seem. This is our “starter for ten” for thinking through the potentials and risks of some increasingly common uses of facial verification and identification – we’ll be considering classification and other biometrics next time. Think we’ve missed something? Tweet us your ideas @edri using #FacialRecognition.

Automatic tagging of pictures on Facebook

Facebook uses facial recognition to tag users in pictures, among other “broader” uses. Under public pressure, in September 2019, it made the feature opt-in – but this applies only to new, not existing, users.

Potentials:

  • Saves time compared to manual tagging
  • Alerts you when someone has uploaded a picture of you without your knowledge

Risks:

  • The world’s biggest ad-tech company can find you in photos or videos across the web – forever
  • Facebook will automatically scan, analyse and categorise every photo uploaded
  • You will automatically be tagged in photos you might want to avoid
  • Errors especially for people with very light or very dark skin

Evidence:

Creepy, verging on dystopian, especially as the feature is on by default for some users (here’s how to turn it off: https://www.cnet.com/news/neons-ceo-explains-artificial-humans-to-me-and-im-more-confused-than-ever/). We’ll leave it to you to decide if the potentials outweigh the risks.

Automated border control (ePassport gates)

Automated border control (ABC) systems, sometimes known as e-gates or ePassport gates, are self-serve systems that authenticate travellers against their identity documents – a type of verification.
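
For readers unfamiliar with the jargon, the sketch below contrasts verification (a 1:1 comparison, as at an e-gate) with identification (a 1:N database search, as in police surveillance). It is a toy model: embed() stands in for a real face-embedding network, the watchlist maps names to stored embeddings, and the 0.8 threshold is an arbitrary illustrative value.

    # Toy contrast of verification (1:1) vs identification (1:N).
    import numpy as np

    def embed(face_image) -> np.ndarray:
        # Placeholder: a real system would run a neural network here.
        # We derive a fake but deterministic 128-d vector instead.
        rng = np.random.default_rng(abs(hash(face_image)) % 2**32)
        return rng.standard_normal(128)

    def cosine(a, b) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def verify(live_face, passport_photo, threshold=0.8) -> bool:
        # e-gate case: ONE comparison against the chip photo --
        # in theory no central database is needed.
        return cosine(embed(live_face), embed(passport_photo)) >= threshold

    def identify(live_face, watchlist, threshold=0.8):
        # surveillance case: search against MANY stored embeddings,
        # which is exactly what requires a central database.
        scores = {name: cosine(embed(live_face), ref)
                  for name, ref in watchlist.items()}
        best = max(scores, key=scores.get)
        return best if scores[best] >= threshold else None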

Potentials:

  • Suggested as a solution for congestion as air travel increases
  • Matches you to your passport, rather than a central database – so in theory your data isn’t stored

Risks:

  • Longer queues for those who cannot or do not want to use it
  • Lack of evidence that it saves time overall
  • Difficult for elderly passengers to use
  • May cause immigration issues or tax problems
  • Normalises face recognition
  • Disproportionately error-prone for people of colour, leading to unjustified interrogations
  • Supports state austerity measures

Evidence:

  • Stats vary wildly, but credible sources suggest the average border guard takes 10 seconds to process a traveller, faster than the best gates, which take 10-15 seconds
  • Starting to be used in conjunction with other data to predict behaviour
  • High volume of human intervention needed due to user or system errors
  • Extended delays for the 5% of people falsely rejected
  • Evidence of falsely criminalising innocent people
  • Evidence of falsely accepting people with wrong passport

Evidence of effectiveness can be contradictory, but the impacts – especially on already marginalised groups – and the ability to combine face data with other data to infer additional information about travellers bear major potential for abuse. We suspect that offline solutions such as funding more border agents and investing in queue management could be equally efficient and less invasive.
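
Taking the throughput figures above at face value, a quick calculation suggests the gates do not even win on speed once false rejections are handled manually (the 60-second manual-fallback time is our assumption for illustration):

    # Throughput comparison using the figures cited above; the manual
    # fallback time is an assumption for illustration.
    guard_time = 10.0    # seconds per traveller, human border guard
    gate_time = 12.5     # seconds, midpoint of the quoted 10-15s range
    false_reject = 0.05  # 5% of travellers falsely rejected
    fallback = 60.0      # assumed extra seconds of manual handling

    expected_gate = gate_time + false_reject * fallback
    print(f"Human guard: {guard_time:.1f}s per traveller")   # 10.0s
    print(f"e-gate incl. rejections: {expected_gate:.1f}s")  # 15.5s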

Police surveillance

In what is sometimes referred to as face surveillance, police forces across Europe – often in conjunction with private companies – are using surveillance cameras to perform live identification in public spaces.

Potentials:

  • Facilitates the analysis of video recordings in investigations

Risks:

  • Police hold a database of faces and are able to track and follow every individual ever scanned
  • Replaces investment in police recruitment and training
  • Can discourage use of public spaces – especially by those who have suffered disproportionate targeting
  • Chilling effect on freedom of speech and assembly, an important part of democratic participation
  • May also rely on pseudo-scientific emotion “recognition”
  • Legal ramifications for people wrongly identified
  • No ability to opt out

Evidence:

Increased public security could be achieved instead by measures tackling issues such as inequality or antisocial behaviour, or by investing in general police capability rather than in surveillance technology.

Facing reality: towards a mass surveillance society?

Without intervention, facial recognition is on a path to omniscience. In this post, we have only scratched the surface. However, these examples identify some of the different actors that may want to collect and analyse your face data, what they gain from it, and how they may (ab)use it. They also show that the touted benefits of facial surveillance frequently amount to cost-cutting rather than user benefit.

We’ve said it before: tech is not neutral. It reflects and reinforces the biases and world views of its makers. The risks are amplified when systems are deployed rapidly, without considering the big picture or the slippery slope towards authoritarianism. The motivations behind each use must be scrutinised and proper assessments carried out before deployment. As citizens, it is our right to demand this.

Your face has a significance beyond just your appearance – it is a marker of your unique identity and individuality. But with prolific facial recognition, your face becomes a collection of data points which can be leveraged against you and infringe on your ability to live your life in safety and with privacy. With companies profiting from the algorithms covertly built using photos of users, faces are literally commodified and traded. This has serious repercussions on our privacy, dignity and bodily integrity.

Facial Recognition and Fundamental Rights 101 (04.12.2019)
https://edri.org/facial-recognition-and-fundamental-rights-101/

The many faces of facial recognition in the EU (18.12.2019)
https://edri.org/the-many-faces-of-facial-recognition-in-the-eu/

Stalked by your digital doppelganger? (29.01.2020)
https://edri.org/stalked-by-your-digital-doppelganger/

Data-Driven Policing: The Hardwiring of Discriminatory Policing Practices across Europe (05.11.2019)
https://www.enar-eu.org/IMG/pdf/data-driven-profiling-web-final.pdf

Facial recognition technology: fundamental rights considerations in the context of law enforcement (27.11.2019)
https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper.pdf

What the “digital welfare state” really means for human rights (08.01.2020)
https://www.openglobalrights.org/digital-welfare-state-and-what-it-means-for-human-rights/

Resist Facial Recognition
https://www.libertyhumanrights.org.uk/resist-facial-recognition

(Contribution by Ella Jakubowska, EDRi intern)
