04 May 2020

COVID-19 & Digital Rights: Document Pool


The Coronavirus (COVID-19) pandemic poses a global public health challenge of unprecedented proportions. In order to tackle it, countries around the world need to engage in coordinated, evidence-based responses grounded in solidarity, support and respect for human rights. This means that measures cannot lead to disproportionate and unnecessary actions. It is also vital that measures are not extended once we are no longer in a state of emergency. Otherwise, the actions taken under exceptional circumstances today can have significant repercussions on human rights both today and tomorrow.

In this document pool we will be listing relevant articles and documents related to the intersection of the COVID-19 crisis and digital rights. This will allow you to follow the development of surveillance measures, content moderation, tracking and other privacy-threatening actions in Europe as they relate to the coronavirus pandemic, as well as offering a set of perspectives and recommendations put forward by a host of digital rights watchdog organisations across Europe and the world. The document pool is updated regularly to ensure the delivery of the most up-to-date information.

  1. EDRi’s Analysis and Recommendations
  2. EDRi Articles, blog posts and press releases
  3. Mapping Exercises
  4. Official EU Documents
  5. Other Useful Resources

1. EDRi’s Analysis and Recommendations

Official EDRi statement on COVID-19 and Digital Rights

EDRi Members’ Responses and Recommendations on COVID-19

Analysing Tracking & Tracing Apps

2. EDRi Articles, blog posts and press releases

EDRi Reporting

#COVIDTech – An EDRi Blog Series

3. Mapping Exercises

EDRi Members Mapping

Other Mapping Exercises

4. Official EU Documents

5. Other Useful Resources

With huge thanks to the individuals and organisations across the EDRi network who have shared resources for this document pool.

15 Apr 2020

COVID-19 pandemic adversely affects digital rights in the Balkans

By Metamorphosis

Cases of arbitrary arrests, surveillance, phone tapping, privacy breaches and other digital rights violations have drastically increased in Central and Southeast Europe as governments started imposing emergency legislation to combat the COVID-19 outbreak. The Belgrade-based Balkan Investigative Reporting Network (BIRN) and the digital rights organisation SHARE Foundation have started a blog titled “Digital Rights in the Time of COVID-19” documenting these developments.

In response to the coronavirus pandemic, some governments across Europe are enhancing surveillance, increasing censorship and restricting the free flow of information. In many cases, these government-imposed restrictions flout human rights standards, and they seldom truly protect digital rights. In the Balkans, as mentioned above, incidents of digital rights violations have steadily increased. Bojan Perkov, policy researcher at the SHARE Foundation, wrote a summary of their findings, noting the following:

“The information gathered by the two organizations so far shows that the most problematic [violations] are, essentially, multiple issues involving the privacy of people who are put under quarantine, the spread of disinformation and the dangerous misconceptions regarding the virus in the online and social media networks, as well as the increase of internet scams.”

The data gathered by the two organisations through the blog’s database feature indicates that in the last two weeks alone, 80 people have been arrested, and some of them jailed, for spreading fake news and disinformation, with the most draconian examples in Turkey, Serbia, Hungary and Montenegro.

One noteworthy example occurred in the Serbian city of Novi Sad, where Nova.rs journalist Ana Lalić was arrested for “upsetting the public.” Lalić had published an article describing the chaotic conditions at the Clinical Center of Vojvodina, its “chronic lack of equipment” and under-preparedness. It was the Center that then filed the complaint against her, which led to her 48-hour detention. Her arrest provoked reactions from organisations across Europe such as EDRi member Article 19 and Freedom House.

Governments in Montenegro and Moldova made public the personal health data of people infected with COVID-19, while official websites and hospital computer systems suffered cyber-attacks in Croatia and Romania. Some countries, like Slovakia, are considering suspending rights enshrined in the EU General Data Protection Regulation (GDPR), while Serbia imposed surveillance and phone tracking to limit freedom of movement.

Potentially infected citizens have been obliged by law to submit to new forms of control. In Serbia, since a state of emergency was declared, all citizens arriving from abroad must undergo quarantine. During a March 19 press conference, Serbian President Aleksandar Vučić stated that the police are “following” Italian telephone numbers, checking which citizens use roaming and constantly tracking their locations. This was aimed specifically at members of the Serbian diaspora who returned from Italy and are supposed to self-isolate in their homes. He also warned people who leave their phones behind that the state has “another way” of tracking them if they violate quarantine, but did not explain the method.

In neighbouring Montenegro, the National Coordination Body for Infectious Diseases decided to publish online the names and surnames of people who must undergo quarantine, after it determined that certain persons had violated the measure, thereby “exposing the whole [of] Montenegro to risk.” The NGO Civic Alliance challenged this measure through a complaint submitted to the Constitutional Court of Montenegro.

In Croatia, concerned citizens developed the website samoizolacija.hr (meaning “self-isolation”), which allegedly enabled anyone to anonymously report quarantine violators to the police. The site was subsequently shut down, and the Ministry of the Interior initiated criminal investigations against suspected violators of privacy rights.

The Crisis Headquarters of the Federation of Bosnia and Herzegovina issued a recommendation on publishing the personal data of citizens who violate the prevention measures, and government institutions at cantonal and local level started publishing data about people in isolation and self-isolation, including lists of people identified as infected with the coronavirus. In response, on March 24, the Personal Data Protection Agency of Bosnia and Herzegovina issued a decision forbidding the publication of personal data of citizens who tested positive for the coronavirus or were subjected to isolation and self-isolation measures.

Perkov also raised the issue of whether these measures are effective, particularly since they put people in danger. In Montenegro, infected people whose identities were revealed on social networks have been subjected to hate speech.

“Furthermore, is the idea behind such measures the public shaming of people who disrespect the obligation for self-isolation or the reduction of number of violations? The criteria of proportionality and necessity have not been properly respected and their adequacy had not been justified.”

The above cases of publication of health data online directly violate laws that protect such data at the highest legal level. In other words, these violations go against the laws of the highest order that protect fundamental rights in the digital environment, and they do so under the guise of the COVID-19 crisis response, as if it were an open invitation to break the rules of free and protected societies.

Read more:

Digital Rights in the time of COVID-19 (23.03.2020)

Serbian government revokes controversial COVID-19-related decree used as pretext to arrest journalists (02.04.2020)
globalvoices.org/2020/04/07/serbian-government-revokes-controversial-covid-19-related-decree-used-as-pretext-to-arrests-journalists/

Europe’s other Coronavirus victim: information and data rights (24.03.2020)
balkaninsight.com/2020/03/24/europes-other-coronavirus-victim-information-and-data-rights/

Montenegrin Coronavirus patients’ identities exposed online (18.03.2020)

(In Serbian) Vučić: Ne ostavljajte telefone, nećete nas prevariti! ZNAMO da se krećete (19.03.2020)

(In Serbian) Podnijeli inicijativu za ocjenu ustavnosti Odluke NKT-a (23.03.2020)

(Contribution by Filip Stojanovski from EDRi member Metamorphosis)

27 Mar 2020

Open letter: Civil society urges Member States to respect the principles of the law in Terrorist Content Online Regulation


On 27 March 2020, European Digital Rights (EDRi) and 12 of its member organisations sent an open letter to representatives of Member States in the Council of the EU. In the letter, we voice our deep concern over the proposed legislation on the regulation of terrorist content online and what we view as serious potential threats to fundamental rights, including privacy and freedom of expression.

You can read the letter here (pdf) and below.

Brussels, 27 March 2020

Dear representatives of Member States in the Council of the EU,

We hope that you are keeping well in this difficult time.

We are writing to you to voice our serious concerns with the proposed Regulation on preventing the dissemination of terrorist content online (COM/2018/640 final). We have raised these concerns before, and many similar critiques have been expressed in letters opposing the Regulation from human rights officials, civil society groups and human rights advocates. [i]

We firmly believe that any common position on this crucial file must respect fundamental rights and freedoms, the constitutional traditions of the Member States and existing Union law in this area. In order for this to happen, we urge you to ensure that the rule of law in cross-border cases is respected, that the competent authorities tasked with ordering the removal of illegal terrorist content are independent, that mandatory (re)upload filters are not adopted, and that the exceptions for certain protected forms of expression, such as educational, journalistic and research materials, are maintained in the proposal. We explain why in more detail below.

First, we ask you to respect the principle of territoriality and ensure access to justice in cases of cross-border takedowns by ensuring that only the Member State in which the hosting service provider has its legal establishment can issue removal orders. The Regulation should also allow removal orders to be contested in the Member State of establishment, to ensure meaningful access to an effective remedy. As recent CJEU case law has established, “efficiency” or “national security” reasons cannot justify short-cuts around rule of law mechanisms and safeguards. [ii]

Secondly, the principle of due process demands that the legality of content be determined by a court or independent administrative authority. This important principle should be reflected in the definition of ‘competent authorities’. For instance, we note that in the Digital Rights Ireland case, the Court of Justice of the European Union considered that the Data Retention Directive was invalid, inter alia, because access to personal data by law enforcement authorities was not made dependent on a prior review carried out by a court or independent administrative authority. [iii] In our view, the removal of alleged terrorist content entails a very significant interference with freedom of expression and, as such, calls for the application of the same safeguards.

Thirdly, the Regulation should not impose the use of upload or re-upload filters (automated content recognition technologies) on the services under its scope. As the coronavirus crisis makes abundantly clear, filters are far from accurate. In recent days alone, Twitter, Facebook and YouTube have moved to fully automated removal of content, leading to scores of legitimate articles about the coronavirus being removed. [iv] The same will happen if filters are applied to alleged terrorist content. There is also mounting evidence that algorithms are biased and have a discriminatory impact, which is a particular concern for communities affected by terrorism, whose counter-speech has proven to be vital against radicalisation and terrorist propaganda. Furthermore, a provision imposing specific measures on platforms should favour a model that gives service providers room for manoeuvre on which actions to take to prevent the dissemination of illegal terrorist content, taking into account their capacities and resources, size and nature (whether not-for-profit, for-profit or community-led).

Finally, it is crucial that certain protected forms of expression, such as educational, artistic, journalistic and research materials, are exempted from the proposal, and that it includes feasible measures to ensure this can be successfully implemented. The determination of whether content amounts to incitement to terrorism, or even glorification of terrorism, is highly context-specific. Research materials should be defined to include content that serves as evidence of human rights abuses. The jurisprudence of the European Court of Human Rights (ECtHR) [v] specifically requires particular caution towards such protected forms of speech and expression. It is vital that these principles are reflected in the Terrorist Content Regulation, including through the adoption of specific provisions protecting freedom of expression as outlined above.

We remain at your disposal for any support you may need from us in the future.

Access Now – https://www.accessnow.org/
Bits of Freedom – https://www.bitsoffreedom.nl/
Centrum Cyfrowe – https://centrumcyfrowe.pl
CDT – https://cdt.org
Committee to Protect Journalists (CPJ) – https://cpj.org/
Daphne Keller – Director Program on Platform Regulation Stanford University
Digitale Gesellschaft – https://digitalegesellschaft.de/
Digitalcourage – https://digitalcourage.de/
D3 – Defesa dos Direitos Digitais –
Državljan D – https://www.drzavljand.si/
EDRi – https://edri.org/
Electronic Frontier Foundation (EFF) – https://www.eff.org/
Epicenter.Works – https://epicenter.works
Free Knowledge Advocacy Group EU – https://wikimediafoundation.org/
Hermes Center – https://www.hermescenter.org/
Homo Digitalis – https://www.homodigitalis.gr/en/
IT-Political Association of Denmark – https://itpol.dk/
Panoptykon Foundation – https://en.panoptykon.org
Wikimedia Spain – https://wikimedia.es





  • [iii] See Digital Rights Ireland v. Minister for Communications, Marine and Natural Resources, Joined Cases C‑293/12 and C‑594/12, 08 April 2014 at para. 62.



  • [v] In cases involving the dissemination of “incitement to violence” or terrorism by the press, the ECtHR’s starting point is that it is “incumbent [upon the press] to impart information and ideas on political issues just as on those in other areas of public interest. Not only does the press have the task of imparting such information and ideas: the public also has a right to receive them.” See Lingens v Austria, App. No. 9815/82, 8 July 1986, para 41.
  • The ECtHR also repeatedly held that the public enjoyed the right to be informed of different perspectives, e.g. on the situation in South East Turkey, however unpalatable they might be to the authorities. See also Özgür Gündem v. Turkey, no. 23144/93, 16 March 2000, paras. 60 and 63, and the Council of Europe handbook on protecting the right to freedom of expression under the European Convention on Human Rights, summarising the Court’s case law on positive obligations of States with regard to the protection of journalists (pp. 90-93), available at: https://rm.coe.int/handbook-freedom-of-expression-eng/1680732814
11 Mar 2020

Germany: Invading refugees’ phones – security or population control?

By Gesellschaft für Freiheitsrechte

In its new study, EDRi member Society for Civil Rights (GFF) examines how German authorities search refugees’ phones. The aim of “data carrier evaluation” is supposedly to determine a person’s identity and country of origin. However, in reality, it violates refugees’ rights and does not produce any meaningful results.

If an asylum seeker in Germany cannot present either a passport or documents replacing it, the Federal Office for Migration and Refugees (BAMF) is authorised to carry out a “data carrier evaluation”: to extract and analyse data from the asylum seeker’s phone and other devices in order to check their owner’s stated origin and identity. The data analysed includes the country codes of their contacts, incoming and outgoing calls and messages, browser history, geodata from photos, as well as email addresses and usernames used in applications such as Facebook, booking.com or dating apps. Notably, BAMF carries out this data analysis regardless of any concrete suspicion that the asylum seeker made untruthful statements regarding their identity or country of origin.
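To illustrate the kind of contact-list analysis described above, the sketch below counts stored contacts by international dialling prefix. This is a hypothetical reconstruction for illustration only, not BAMF’s actual software: the prefix table and the contact numbers are invented examples, and a real forensic tool would use a complete, regularly updated prefix database.

```python
from collections import Counter

# Hypothetical (incomplete) mapping of international dialling prefixes
# to countries -- illustration only, not a real forensic data set.
DIALLING_PREFIXES = {
    "+49": "Germany",
    "+90": "Turkey",
    "+93": "Afghanistan",
    "+963": "Syria",
}

def country_of(number: str) -> str:
    """Return the country of the longest matching dialling prefix."""
    # Check longer prefixes first so "+963" wins over a hypothetical "+96".
    for prefix in sorted(DIALLING_PREFIXES, key=len, reverse=True):
        if number.startswith(prefix):
            return DIALLING_PREFIXES[prefix]
    return "unknown"

def country_profile(contacts):
    """Count how many stored contact numbers fall under each country."""
    return Counter(country_of(n) for n in contacts)

# Example with invented phone numbers:
contacts = ["+963115551234", "+963215559876", "+4915112345678"]
print(country_profile(contacts))
```

The example also shows why such profiling is so unreliable as evidence of origin: a contact list reflects whom a person knows, not where they come from, which is consistent with the study’s finding that the evaluations rarely produce meaningful results.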

The study “Invading Refugees’ Phones: Digital Forms of Migration Control” examines and assesses how BAMF evaluates refugees’ data and what kinds of results data carrier evaluation has produced so far. For the study, journalist Anna Biselli and GFF lawyer Lea Beckmann comprehensively researched and evaluated numerous sources. These include data carrier evaluation reports, asylum files, internal BAMF regulations, such as a user manual for reading mobile data carriers, and training documents for BAMF employees, as well as information that was made public by parliamentary inquiries.

High costs, useless results

The study found that evaluating data carriers is not an effective way of establishing a person’s identity and country of origin. Since data carrier evaluations started in 2017, BAMF has examined about 20 000 mobile phones of asylum seekers. When invading refugees’ privacy via data carrier evaluation produces results at all, it usually only confirms what the persons themselves stated during their interviews with BAMF employees.

In 2018, only 2% of the successful data carrier evaluations revealed contradictions to the asylum seekers’ statements. Graphic: GFF/Julia Zé

There were already doubts about the effectiveness of data carrier evaluation before the law on Better Enforcement of the Obligation to Leave the Country was passed. The law aims to speed up deportations. By introducing data carrier evaluations, legislators hoped to verify a person’s identity, country of origin and grounds for protection more quickly than before. In practice, the procedure has fallen short of these expectations. It has also turned out to be very expensive.

In relation to their limited benefit, the costs of data carrier evaluations are clearly disproportionate. In February 2017, the Federal Ministry of the Interior stated that installation costs of 3.2 million euros were to be expected. By the end of 2018, however, 7.6 million euros had already been spent on the system, more than twice as much as originally estimated.

Total costs of the BAMF for reading and evaluating data media: from just under 7 million euros in 2017 to an expected 17 million euros in 2022. Graphic: GFF/Julia Zé

A blatant violation of fundamental rights

Examining refugees’ phones can be seen as a human rights violation. Despite that, Germany has spent millions of euros on introducing and developing this practice. Data carrier evaluations circumvent the basic right to informational self-determination, which has been laid down by the German Federal Constitutional Court. Refugees are subject to second-class data protection. At the same time, they are especially vulnerable and lack meaningful access to legal remedies.

Germany is not the only country to experiment with digital forms of migration control. BAMF’s approach is part of a broader, international trend towards testing new surveillance and monitoring technologies on marginalised populations, including refugees. Individual people, as well as their individual histories, are increasingly being reduced to data records. GFF will combat this trend with legal means: We are currently preparing legal action against the BAMF’s data carrier evaluation. 

We thank the Digital Freedom Fund for their support.

Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)

Invading Refugees’ Phones: Digital Forms of Migration Control

The human rights impacts of migration control technologies (12.02.2020)

Immigration, iris-scanning and iBorderCTRL (26.02.2020)

(Contribution by EDRi member Gesellschaft für Freiheitsrechte – GFF, Germany)

26 Feb 2020

Romania: Mandatory SIM registration declared unconstitutional, again


On 18 February 2020, the Romanian Constitutional Court unanimously declared unconstitutional a new legislative act, adopted in September 2019, introducing mandatory SIM card registration. The legislative act in question was an emergency ordinance issued by the Government, which wanted to introduce this obligation as a measure “to improve the operation of the 112 emergency service number”. This is the second time the court has issued an unconstitutionality decision on mandatory SIM card registration proposals.

The court dismissed the law on procedural grounds, as the government had failed to demonstrate the urgency of adopting the ordinance. It also highlighted in its press release that the application of the SIM card registration provision had actually been postponed twice, so the claimed urgency was non-existent.

Although this is the sixth attempt to introduce legislation on mandatory SIM card registration in Romania, the battle is far from over as, this time, the court did not go into a substantive analysis.

Coincidentally or not, two days later, a false bomb threat (the first in years) was reported in the media. A man called the 112 emergency number claiming that he had placed a bomb inside a shopping mall. The call was made from a prepaid SIM card, a fact that the 112 service specifically highlighted in their press release, while stressing the number of staff involved in responding to the call. The subtle suggestion was that the implications of such calls are massive and that, if they prove false, nobody can be held responsible, as prepaid SIM card calls cannot be traced back to an individual.

The Constitutional Court’s decision is a welcome victory for now, but given the track record of a new legislative proposal on this topic appearing every year or two, we can expect similar proposals in the future.


After a tragic failure by the police to save a teenage girl who was abducted but managed to call the 112 emergency number three times before she was murdered, the Romanian Government adopted an Emergency Ordinance which introduced the obligation to register prepaid SIM cards.

EDRi member ApTI, together with the Association for the Defence of Human Rights in Romania – the Helsinki Committee (APADOR-CH), asked the Romanian Ombudsman to send the law to the Constitutional Court. The Ombudsman challenged the law in court, and ApTI sent an amicus curiae brief in support of the unconstitutionality claims, showing that:

  1. there was no reason to justify the adoption of an emergency ordinance instead of going through the normal parliamentary procedure;
  2. restricting individual freedoms by requesting to register prepaid SIM cards is a measure disproportionate to the goal that was intended – to limit fake calls to the 112 emergency number;
  3. no data protection impact assessment has been carried out and the national data protection authority did not support the law;
  4. the Constitutional Court already decided in 2014 that mandatory SIM card registration limits fundamental rights and freedoms, and that such measures can only be introduced if they are necessary and proportionate.

The 6th attempt to introduce mandatory SIM card registration in Romania (23.10.2019)

Constitutional Court press release (only in Romanian, 18.02.2020)

Timeline of legislative initiatives to introduce mandatory SIM card registration (only in Romanian)

Fake bomb threat at a shopping mall in Romania (only in Romanian, 20.02.2020)

Constitutional Court decision nr. 461/2014 (only in Romanian)

(Contribution by Valentina Pavel, EDRi member ApTI, Romania)

19 Feb 2020

The impact of competition law on your digital rights

By Laureline Lemoine

This is the first article in a series dealing with competition law and Big Tech. The aim of the series is to look at what competition law has achieved when it comes to protecting our digital rights, where it has failed to deliver on its promises, and how to remedy this.

This series will first look at how competition and privacy law interact, to then focus on how they can support each other in tackling data exploitation and other issues related to Big Tech companies. With a potential reform of competition rules in mind, this series is also a reflection on how competition law could offer a mechanism to regulate Big Tech companies to limit their increasing power over our democracies.

Our personal data is seen by Big Tech companies as a commodity with economic value, and they cannot get enough of it. They track us online and harvest our personal data, including sensitive health data. Data protection and online privacy legislations aim to protect individuals against invasive data exploitation. Even though well-enforced privacy and data protection legislation are a must-have in our connected societies, there are other avenues that could be explored simultaneously. Because of the power imbalance between individuals and companies, as well as other issues affecting our fundamental rights, there is a need for a more structural approach, involving other policies and legislation. Competition law is often referred to as one of the tools that could redress this power imbalance, because it controls and regulates market power, including in the digital economy.

During her keynote speech at the International Association of Privacy Professionals (IAPP) conference in November 2019, Margrethe Vestager, European Commissioner for Competition and Executive Vice-President for A Europe Fit for the Digital Age, argued that, “[…] to tackle the challenges of a data-driven economy, we need both competition and privacy regulation, and we need strong enforcement in both. Neither of these two things can take the place of one another, but in the end, we’re dealing with the same digital world. Privacy and competition are both fundamentally there for the same reason: to protect our rights as consumers”.

Privacy and competition law are different policies

Competition and privacy law (which includes data protection and online privacy legislations) are governed by different legal texts and overseen by different authorities with distinct mandates.

According to Wojciech Wiewiórowski, the European Data Protection Supervisor (EDPS), “the main purpose of these two kinds of oversight is […] very different, because what the competition authorities want to achieve is the well-working fair market, what we want to achieve is to defend the fundamental rights [to privacy and data protection]”.

This means that, in assessing competition infringements, competition authorities do not go beyond competition issues. They have to assume that companies are or will be in compliance with their other legal obligations, including their privacy obligations.

The Court of Justice of the European Union confirmed this difference of mandates in 2006. Later, in the Facebook/WhatsApp merger case, the Commission concluded that privacy-related concerns “do not fall within the scope of the EU competition law rules but within the scope of the EU data protection rules”. Facebook was subsequently fined for “misleading” the competition authority.

Since then, Europe has seen the development of a data-driven economy and its fair share of privacy scandals and data breaches, too. And despite numerous investigations into problematic behaviours, Big Tech companies keep on growing.

But this goes far beyond competition issues, as the dominant position of Big Tech companies also gives them the power and the incentive to limit our freedoms, and to infringe on our fundamental rights. Their dominance is even a threat to our democracies.

As a way to tackle these issues, more people are calling for the alignment of enforcement initiatives of data protection authorities as well as competition and consumer authorities. This has led to debates about the silos between competition and data protection law, their differences but also their common objectives.

Data protection and competition against Big Tech powers

Both competition and data protection law impact economic activities and, at EU level, both are used to ensure the further deepening of the EU single market. The General Data Protection Regulation (GDPR), as well as ensuring a high level of protection of personal data, aims to harmonise Member States’ legislation to remove obstacles to a common European market. Similarly, competition law prevents companies from erecting barriers to trade between competitors.

Moreover, data protection can be considered as an element of competition in cases where companies compete for who can better satisfy privacy preferences. There is, in this case, a common objective of allowing the individual to have control (as a consumer or as a data subject).

In her keynote speech, Vestager explained: “competition and competition policy have an important role to play … because the idea of competition is to put consumers in control. For markets to serve consumers and not the other way around,” she said, “it means if you don’t like the deal we’re getting, we can walk away and find something that meets our needs in a better way. And consumers can also use that power to demand something we really … care about, including maybe our privacy.”

Indeed, giving consumers a genuine choice to use privacy-friendly companies would help uphold privacy standards. Although it is hard to believe now, once upon a time Facebook prioritised privacy as a way to distinguish itself from MySpace, its biggest competitor at the time.

However, the issue in the world of Big Tech today is that privacy offers no competitive leverage. The dominant positions of the few players controlling the market leave no room for others proposing privacy-friendly products. As a result, there is no choice but to use the services of Big Tech to stay connected online – the consumer is no longer in control.

One way to remedy this power imbalance between individuals and these giant companies could be through a greater cooperation between regulatory authorities. BEUC, the European Consumer Organisation, has called, regarding Facebook’s exploitation of consumers, for a “coherent enforcement approach for the data economy between regulators and across Member States” and wants the “European Commission to explore – with relevant authorities – how to deal with a concrete commercial behaviour that simultaneously breaches different areas of EU law”.

In 2016, the EDPS launched the Digital Clearinghouse, a voluntary network of regulators involved in the enforcement of legal regimes in digital markets, with a focus on data protection, and consumer and competition law. National competition authorities are also looking into competition and data, while in 2019, the European Commission published a report on Competition Policy for the Digital Era, to which EDRi member Privacy International contributed.

Greater cooperation between regulators, inclusion of data protection principles in competition law, and many other ideas are being discussed to redress this issue of power imbalance. Some of them will be explored in the next articles of this series.

Regarding antitrust law, we will look at discussions of new sets of rules designed specifically for the Big Tech market, as well as the development of the right to portability and interoperability. As for merger control, we will focus on the extent to which privacy could be considered a theory of harm.

Opinion 8/2016 – EDPS Opinion on coherent enforcement of fundamental rights in the age of big data (2016)

Competition and data

Factsheet – Competition in the digital era (2020)

Report of the European Commission – Competition Policy for the digital era (2019)

Family ties: the intersection between data protection and competition in EU Law (2017)

12 Feb 2020

Cloud extraction: A deep dive on secret mass data collection tech

By Privacy International

Mobile phones remain the most frequently used and most important digital source for law enforcement investigations. Yet it is not just what is physically stored on the phone that law enforcement are after, but what can be accessed from it, primarily data stored in the “cloud”. This is why law enforcement is turning to “cloud extraction”: the forensic analysis of user data stored on third-party servers, typically used by device and application manufacturers to back up data. As we spend more time using social media and messaging apps and storing files with the likes of Dropbox and Google Drive, and as phones become more secure, locked devices harder to crack, and file-based encryption more widespread, cloud extraction is, as a prominent industry player puts it, “arguably the future of mobile forensics.”

The report “Cloud extraction technology: the secret tech that lets government agencies collect masses of data from your apps” brings together the results of Privacy International’s open source research, technical analyses and freedom of information requests to expose and address this emerging and urgent threat to people’s rights. 

Phone and cloud extraction go hand in hand

EDRi member Privacy International has repeatedly raised concerns over risks of mobile phone extraction from a forensics perspective and highlighted the absence of effective privacy and security safeguards. Cloud extraction goes a step further, promising access to not just what is contained within the phone, but also to what is accessible from it. Cloud extraction technologies are deployed with little transparency and in the context of very limited public understanding. The seeming “wild west” approach to highly sensitive data carries the risk of abuse, misuse and miscarriage of justice. It is a further disincentive to victims of serious offences to hand over their phones, particularly if we lack even basic information from law enforcement about what they are doing. 

The analysis of data extracted from mobile phones and other devices using cloud extraction technologies increasingly includes the use of facial recognition capabilities. Given the volume of personal data that can be obtained from cloud-based sources such as Instagram, Google Photos and iCloud, all of which contain facial images, the ability to run facial recognition on masses of data is a big deal. Greater urgency is therefore needed to address the risks that arise from such extraction, especially as facial and emotion recognition are added to the software that analyses the extracted data. The fact that it is potentially being used on vast troves of cloud-stored data without any transparency or accountability is a serious concern.

What you can do

There is an absence of information regarding the use of cloud extraction technologies, making it unclear how their use is lawful and how individuals are safeguarded from abuse and misuse of their data. This is part of a dangerous trend by law enforcement agencies, and we want to ensure transparency and accountability globally with respect to the new forms of technology they use.

If you live in the UK, you can submit a Freedom of Information Act request to your local police to ask them about their use of cloud extraction technologies using this template: https://privacyinternational.org/action/3324/ask-your-local-uk-police-force-about-cloud-extraction. You can also use it to send a request if you are based in another country that has Freedom of Information legislation.

Privacy International

Cloud extraction technology: the secret tech that lets government agencies collect masses of data from your apps (07.01.2020)

Phone Data Extraction

Push This Button For Evidence: Digital Forensics

Can the police limit what they extract from your phone? (14.11.2019)

Facial recognition and fundamental rights 101 (04.12.2019)

Ask your local UK police force about cloud extraction

(Contribution by Antonella Napolitano, EDRi member Privacy International)

12 Feb 2020

Digitalcourage fights back against data retention in Germany

By Digitalcourage

On 10 February 2020, EDRi member Digitalcourage published the German government’s plea in the data retention case at the European Court of Justice (ECJ). Dated 9 September 2019, the government’s document explains the use of retained telecommunications data by secret services, addresses the questions of whether the 2002 ePrivacy Directive might apply to various forms of data retention and which exceptions from human rights protections apply to secret service operations, and justifies its plans to use data retention to solve a broad range of crimes with the example of the abduction of a Vietnamese man in Berlin by Vietnamese agents. However, this case is very specific, and even if the retained data proved “useful” there, that is not a valid legal basis for mass data retention and therefore cannot justify drastic intrusions into the fundamental rights of all individuals in Germany. Finally, the German government also argues that the scope and time period of the storage make a difference to the compatibility of data retention laws with fundamental rights.

Digitalcourage calls for all existing illegal data retention laws in the EU to be declared invalid. There are no grounds for blanket, suspicion-less surveillance in a democracy under the rule of law. Whether it is content data or metadata that is being stored, data retention (the blanket, mass collection of telecommunications data) is inappropriate, unnecessary and ineffective, and therefore illegal. Where the German government argues that secret services need telecommunications data to protect state interests, Digitalcourage agrees with many human rights organisations that the activities of secret services can be a direct threat to the core trust between the general public and the state. The ECJ has itself called for storage to be reduced to the absolutely required minimum – and that, according to Digitalcourage, can only be fulfilled if no data is stored without individual suspicion.


Press release: EU data retention: Digitalcourage publishes and criticises the position of the German government (only in German, 10.02.2020)

(Contribution by Sebastian Lisken, EDRi member Digitalcourage, Germany)

12 Feb 2020

PI and Liberty submit a new legal challenge against MI5

By Privacy International

On 1 February 2020, EDRi member Privacy International (PI) and civil rights group Liberty filed a complaint with the Investigatory Powers Tribunal, the judicial body that oversees the intelligence agencies in the United Kingdom, against the security service MI5 in relation to how they handle vast troves of personal data.

In mid-2019, MI5 admitted, during a case brought by Liberty, that personal data was being held in “ungoverned spaces”. Much about these ungoverned spaces, and how they would be effectively “governed” in the future, remains unclear. At the moment, they are understood to be a “technical environment” in which the personal data of an unknown number of individuals was being “handled”. The use of “technical environment” suggests something more than simply a compilation of a few datasets or databases.

The longstanding and serious failings of MI5 and other intelligence agencies in relation to these “ungoverned spaces” first emerged in PI’s pre-existing case, which started in November 2015. That case challenges the processing of bulk personal datasets and bulk communications data by the UK Security and Intelligence Agencies.

In the course of those proceedings, it was revealed that PI’s data had been illegally held by MI5, among other intelligence and security agencies. MI5 deleted PI’s data while the investigation was ongoing. With the new complaint, PI also requested the reopening of that case in relation to MI5’s actions.

In parallel proceedings brought by Liberty against the bulk surveillance powers contained in the Investigatory Powers Act 2016 (IPA), MI5 admitted that personal data was being held in “ungoverned spaces”, demonstrating a known and continued failure to comply with both statutory and non-statutory safeguards in relation to the handling of bulk data since at least 2014. Importantly, documents disclosed in that litigation and detailed in the new joint complaint showed that MI5 had sought and obtained bulk interception warrants on the basis of misleading statements made to the relevant authorities.

The documents reveal that MI5 not only broke the law, but for years misled the Investigatory Powers Commissioner’s Office (IPCO), the body responsible for overseeing UK surveillance practices.

In this new complaint, PI and Liberty argue that MI5’s data handling arrangements result in the systematic violation of the rights to privacy and freedom of expression (as protected under Articles 8 and 10 of the European Convention on Human Rights) and under EU law. Furthermore, they maintain that the decisions to issue warrants requested by MI5, in circumstances where the necessary safeguards were lacking, are unlawful and void.

Privacy International

MI5 ungoverned spaces challenge

Bulk Personal Datasets & Bulk Communications Data challenge

The Investigatory Powers Tribunal case no. IPT/15/110/CH

Reject Mass Surveillance

MI5 law breaking triggers Liberty and Privacy International legal action (03.02.2020)

(Contribution by EDRi member Privacy International)

03 Feb 2020

Support our work by investing in a piece of e-clothing!


Your privacy is increasingly under threat. European Digital Rights works hard to have you covered. But there’s only so much we can do.

Help us help you. Help us get you covered.

Click the image to watch the video!

Check out our 2020 collection!*

*The items listed below are e-clothes. That means they are electronic. Not tangible. But still very real – like many other things online.

Your winter stock(ings) – 5€
A pair of hot winter stockings can really help one get through cold and lonely winter days. Help us fight for your digital rights by investing in a pair of these superb privacy-preserving fishnet stockings. This delight also makes a lovely gift for someone special.

A hat you can leave on – 10€
Keep your head undercover with this marvellous piece of surveillance resistance. Adaptable to all temperatures and – for the record – to several CCTV models, this item really lives up to its value. The hat is an indispensable accessory when visiting your favourite public space packed with facial recognition technologies.

Winter/Summer Cape – 25€
Are you feeling heroic yet? Our flamboyant Winter/Summer cape is designed to keep you warm and cool. This stylish accessory takes the weight off your shoulders – purchase it and let us take care of fighting for your digital rights!

Just another White T-Shirt – 50€
A white t-shirt can do wonders when you’re trying to blend in with a white wall. This wildly unexciting but versatile classic is one of the uncontested fundamental pillars of your privacy enhancing e-wardrobe.

THE privacy pants ⭐️ – 100€
This ultimate piece of resistance is engineered to keep your bottom warm in the coldest winter, but also aired up during the hottest summer days. Its colour guarantees the ultimate tree (of knowledge) look. The item comes with a smart zipper.

Anti-tracksuit ⭐️ – 250€
Keep your digital life healthy with the anti-tracking tracksuit. The fabric is engineered to bounce out any attempt to get your privacy off track. Plus, you can impress your beloved babushka too.

Little black dress ⭐️ – 500€
Whether at a work cocktail party, a funeral, shopping spree or Christmas party – this dress will turn you into the center of attention, in a (strangely) privacy-respecting manner.

Sew your own ⭐️ – xxx€
Unsure of any of the items above? Let your inner tailor free, customise your very own unique, designer garment, and put a price tag of your choice on it.

⭐️ The items of value superior to 100€ are delivered with an (actual, analog, non-symbolic) EDRi iron-on privacy patch that you can attach on your existing (actual, analog, non-symbolic) piece of clothing or accessory. If you wish to receive this additional style and privacy enhancer, don’t forget to provide us with your postal address (either via the donation form, or in your bank transfer message)!

Question? Remark? Idea? Please contact us brussels [at] edri [dot] org !