15 Apr 2020

COVID-Tech: Emergency responses to COVID-19 must not extend beyond the crisis

By Ella Jakubowska

In EDRi’s new series on COVID-19, we will explore the critical principles for protecting fundamental rights while curtailing the spread of the virus, as outlined in the EDRi network’s statement on the virus. Each post in this series will tackle a specific issue at the intersection of digital rights and the global pandemic in order to explore broader questions about how to protect fundamental rights in a time of crisis. In our statement, we emphasised the principle that states must “[i]mplement exceptional measures only for the duration of the crisis”. In this first post of the series, we take a look at what experiences in the UK, Poland and Hungary could teach states as they work out the most effective ways of stopping the spread of coronavirus – without leaving the door ajar for future fundamental rights violations to creep in.

Public health responses to the coronavirus pandemic are putting unprecedented limits on the daily lives of people across the world. A range of important rights, including to life and health, are of course threatened by COVID-19 – but the responses that we are seeing across Europe also threaten fundamental liberties and freedoms, both freedoms to do things (express ourselves, associate in public) and freedom from things (government surveillance, abuses of power, discrimination). In some cases, fundamental rights in the EU are not just under threat, but are already being unjustifiably, disproportionately and unlawfully violated under the guise of public health.

The state of play in Hungary:

On 30 March 2020, the Prime Minister of Hungary was granted sweeping powers to rule the country by decree. Hungary has been under the EU's spotlight over the last two years for failing to comply with the EU's core values, with the European Parliament triggering Article 7 proceedings over the deteriorating respect for the rule of law, and civil society raising serious concerns including a lack of respect for privacy and data protection, evidence of widespread state surveillance, and infringements on freedom of expression online. Following the declaration of a state of emergency on 11 March and the tabling of the indefinite decree on 23 March, ostensibly in response to the coronavirus pandemic, European Parliament representatives issued a clear warning to Hungary, stating that "extraordinary measures adopted by the Hungarian government in response to the pandemic must respect the EU's founding values." This warning did little to temper Orbán's ambitions, and it leaves the people of Hungary vulnerable to expanded powers that can be abused long after the spread of the coronavirus has been checked.

The state of play in Poland:

The European Parliament has also raised concerns about the worsening rule of law in Poland, in particular threats to the independence of the judiciary, with investigations activated in 2016. On 19 March 2020, the country's efforts to tackle the spread of coronavirus received widespread attention when the government announced the use of a 'Civil Quarantine' app which, it explained, would require people in quarantine to send geo-located selfies within 20 minutes of receiving an alert – or face a visit from the police. According to the announcement, the app even uses controversial facial recognition technology to scan the selfies.

Early in April, the Polish government moved to make the use of the app mandatory – a step which, as reported by EDRi member Panoptykon, was not proportionate [PL] (due to factors like people's images being sent to government servers) and was additionally not compliant with Poland's constitution [PL]. Panoptykon emphasise important rules [PL] for implementing technological applications to combat COVID-19 – such as minimising the data collected and setting strict time periods for its retention – which states must follow in order to comply with fundamental rights.
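To make those two rules concrete, here is a minimal, purely illustrative sketch (in Python) of how a quarantine check-in service could apply data minimisation and a strict retention period. It is a hypothetical design for illustration only, not a description of the actual Polish app, whose internals have not been published:

from dataclasses import dataclass
from datetime import datetime, timedelta

# Assumption: data is kept only for the length of the quarantine period.
RETENTION_PERIOD = timedelta(days=14)

@dataclass
class CheckIn:
    user_id: str            # pseudonymous identifier, not a name or phone number
    timestamp: datetime
    within_geofence: bool   # only the yes/no verification result is stored

class QuarantineRegistry:
    def __init__(self) -> None:
        self._checkins: list[CheckIn] = []

    def record(self, user_id: str, timestamp: datetime, within_geofence: bool) -> None:
        # Data minimisation: the selfie and exact GPS coordinates are verified
        # and then discarded; only the outcome of the check is retained.
        self._checkins.append(CheckIn(user_id, timestamp, within_geofence))

    def purge_expired(self, now: datetime) -> None:
        # Strict retention: anything older than the quarantine period is deleted.
        cutoff = now - RETENTION_PERIOD
        self._checkins = [c for c in self._checkins if c.timestamp >= cutoff]

The point of the sketch is simply that privacy choices – what is never stored, and how quickly the rest is deleted – are made in code, which is why rules like Panoptykon's need to be binding rather than optional.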

The state of play in the UK:

The UK’s Coronavirus Act was passed on 25 March 2020, giving the UK government a suite of extraordinary powers for a period of two years. Following pressure from civil society, which called the proposed Bill “draconian”, the disproportionately long period for the restriction of people’s rights was adjusted to include parliamentary checks every six months. Yet NGOs have continued to question why the Bill is not up for renewal every 30 days, given the enormous potential for abuse of power when people’s fundamental rights protections are suspended or reduced. This is especially important given that, as EDRi member Privacy International has pointed out, the UK already has worryingly wide surveillance powers, including bulk data interception and retention.

The UK has also come under fire for the sharp rise in disproportionate police responses since the introduction of the Bill, including stopping people from using their own gardens or using drones to chastise dog walkers. If not properly limited by law, these powers (and their abuse) have the potential to continue in ordinary times, further feeding the government’s surveillance machine.

POLITICO reports that UK authorities are not alone, with countries across Europe exploiting the climate of fear to encourage people to spy on and report their neighbours, alongside a rise in vigilante attacks, public shamings and even support for police violence. Such behaviour indicates an increasingly hostile, undemocratic and extra-judicial way of enforcing lockdowns. And it is frighteningly reminiscent of some of the most brutal, repressive twentieth-century police states.

What an open door could mean for the future of digital rights:

Allowing states to dispense with the rule of law in times of crisis risks leaving fundamental rights vulnerable in ordinary times, too. The legitimation and normalisation of surveillance infrastructures creates a sense that being watched and analysed all the time is normal (it is not) and contributes to societies filled with suspicion, abuse and mistrust. Before coronavirus was a household name, for example, Hungary’s secretive Szitakötő project was preparing to install 35,000 facial recognition cameras across the country for mass surveillance. This would allow the state to undermine the principle of investigatory warrants, and instead watch and analyse everyone, all the time. The current, indefinite state of emergency could remove any potential checks and balances on the Szitakötő plans, and allow Orbán to push through ever more rights-violating measures, such as repression of the free media, freedom of expression and political dissent.

Throughout 2019, Poland made its aspirations for global AI leadership clear, having experimented with automating state welfare since at least 2015. As UN Special Rapporteur Philip Alston has emphasised, the global uptake of “automated welfare” is a direct result of government goals to spend less on welfare, control people’s behaviour and punish those who do not conform. There is a risk that the Polish state could exploit new tech, like its quarantine app, to expand an undemocratic agenda and make technology the go-to solution for any public or societal issue. The UK, meanwhile, is already infamous for having one of the highest rates of surveillance cameras per capita in the world. Combined with the fact that the UK’s health service has enlisted the help of Palantir, a software company notorious for exploiting personal data, to manage coronavirus data, this suggests that the UK’s pre-existing public-private surveillance economy is one area that is profiting from this crisis.

Conclusion:

Desperate times call for desperate measures – or so the saying goes. But this should not undermine the core values of our societies, especially when we have many reasons to be positive: compassionate and brave health workers treating patients; civil society working to protect rights in coronavirus apps and help governments make the right decisions; and global health organisations working to prevent future outbreaks of the virus and develop vaccines.

As the European Parliament’s Committee on Civil Liberties has stated, mass surveillance does not make us safer. It puts undue limits on our liberties and rights which will continue long after the current emergency has eased. As a result, we will all be less safe – and those really would be desperate times. In the words of Yuval Noah Harari:

[T]emporary measures have a nasty habit of outlasting emergencies, especially as there is always a new emergency lurking on the horizon […] Centralised monitoring and harsh punishments aren’t the only way to make people comply with beneficial guidelines. […] A self-motivated and well-informed population is usually far more powerful and effective than a policed, ignorant population.

Read more:
Fundamental rights implications of COVID-19 (various dates)
https://fra.europa.eu/en/themes/covid-1

Extraordinary powers need extraordinary protections (20/03/2020)
https://privacyinternational.org/news-analysis/3461/extraordinary-powers-need-extraordinary-protections

Use of smartphone data to manage COVID-19 must respect EU data protection rules (07.04.2020)
https://www.europarl.europa.eu/news/en/press-room/20200406IPR76604/use-of-smartphone-data-to-manage-covid-19-must-respect-eu-data-protection-rules

Contact Tracing in the Real World (12.04.2020)
https://www.lightbluetouchpaper.org/2020/04/12/contact-tracing-in-the-real-world/

(Contribution by Ella Jakubowska, EDRi Policy Intern)

15 Apr 2020

Control ©: defending free online communication through litigation

By Gesellschaft für Freiheitsrechte

Former Member of the European Parliament Julia Reda has joined EDRi member Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights) in Germany. The copyright reform activist will coordinate Control ©, a new project to defend freedom of communication. Control © will explore the friction between copyright law and fundamental rights such as freedom of expression, freedom of the press and the prohibition of censorship. It will act as a strategic legal guardian, bringing copyright cases to court and ensuring that fundamental rights considerations are at the forefront of those cases.

Last year, Germany was one of the key focal points of protests against the European directive on copyright and related rights in the Digital Single Market (DSM Directive). A strong civil society movement, including EDRi and its members, formed against the most controversial provisions: Article 15 on the neighbouring right for press publishers and Article 17 (formerly Article 13, see saveyourinternet.eu) on the liability of online content-sharing service providers.

Julia Reda, then an MEP and copyright reform activist, led the opposition against the DSM Directive within the European Parliament, pointing to its likely incompatibility with the EU’s Charter of Fundamental Rights. To date, it is still disputed whether the requirements of Article 17 can be met without the installation of upload filters. Upload filters restrict free communication because they invariably lead to the blocking of legal uses under copyright exceptions such as citation. Platforms that fail to cooperate with rightsholders to block the uploads of infringing content can be held liable for copyright infringements by their users.

The German government promised to avoid the introduction of upload filters, as they would restrict free online communication to a critical degree, possibly violating freedoms guaranteed by the German Basic Law. This concerns the rights to freedom of opinion and freedom of information and, depending on how these filters function, the rights to data protection and confidentiality of communications.

Securing fundamental rights in copyright through strategic litigation

Whichever approach Germany and other EU member states take, the DSM Directive will expose existing tensions between copyright law and fundamental rights and create new ones. Government solutions to these problems are likely to benefit large rightsholders (music companies, the film industry) while neglecting the rights of users such as scientists, teachers and activists, as well as those of small rightsholders and authors.

Thus, Control © will use strategic litigation to fight for an implementation of the DSM Directive that is in conformity with European and German fundamental rights. Strategic litigation has proven an invaluable tool to combat fundamental rights violations in the digital era. Since 2016, GFF has gone to court to defend civil rights in the face of mass surveillance, excessive secret service powers and opaque state authorities. Control © will make use of this strategic tool in yet another area of law, profiting from GFF’s and Julia Reda’s combined expertise.

Beyond upload filters: cultural heritage and freedom of science

Beyond upload filters, there are more open questions regarding the interplay of copyright law and fundamental rights. The Copyright Directive introduces new rules on digitising out-of-print works and making them available. This holds the chance to unlock a huge treasure of pictures, music, poems and other works that are no longer in commercial use. We will help institutions such as libraries, museums and archives to make use of these new opportunities in order to strengthen the freedom of culture, science and information.

Finally, academic freedom requires the circulation of knowledge as well as legal certainty. Too often, authors of scientific publications are forced to cede the rights to their studies to commercial scientific publishers. The paywalls these publishers install impede access to knowledge and scientific progress. Strategic litigation in this field aims to clarify the regulations around Open Access publishing in order to make this option more feasible for more scientists.

Read more:
Introducing control © – Strategic Litigation for Free Communication (13.04.2020)
http://copyrightblog.kluweriplaw.com/2020/04/13/introducing-control-strategic-litigation-for-free-communication/

Control ©: Copyright and freedom of communication (in German only) (13.04.2020)
https://freiheitsrechte.org/urheberrecht/

About GFF (English)
https://freiheitsrechte.org/english/

(Contribution by Luisa Podsadny, from EDRi member GFF)

15 Apr 2020

Technology, migration, and illness in the times of COVID-19

By Petra Molnar

In our ongoing work on technology and migration, we examine the impacts of the current COVID-19 pandemic on the rights of people on the move and the increasingly worrying use of surveillance technology and AI at the border and beyond.

Refugees, immigrants, and people on the move have long been linked with bringing disease and illness. People crossing borders, whether by force or by choice, are often talked about in apocalyptic terms like ‘flood’ or ‘wave’, underscored by growing xenophobia and racism. Not only are these links blatantly incorrect, they also legitimise far-reaching state incursions and increasingly hardline policies of surveillance and techno-solutionism to manage migration.

These practices become all the more apparent in the current global fight against the COVID-19 pandemic.

In a matter of days, we saw Big Tech present a variety of ‘solutions’ for fighting the coronavirus sweeping the globe. Coupled with extraordinary state powers, the incursion of the private sector leaves open the possibility of grave human rights abuses and far reaching effects on civil liberties, particularly for communities on the margins. While emergency powers can be legitimate if grounded in science and the need to protect health and safety, history shows that states commit abuses in times of exception. New technologies can often facilitate these abuses, particularly against marginalised communities.

As more and more states move towards a model of bio-surveillance to contain the spread of the pandemic, we are seeing an increase in tracking, automated drones, and other types of technologies developed by the private sector purporting to help manage migration and stop the spread of the virus. However, if previous use of technology is any indication, refugees and people on the move will be disproportionately targeted. Once tools like virus-killing robots, cellphone tracking, and ‘artificially intelligent thermal cameras’ are turned on, they will be used against people crossing borders, with far-reaching ramifications. Our research has repeatedly shown that technological experiments on migration are often discriminatory, breach privacy, and even endanger lives.

Pandemic responses are political. Making people on the move more trackable and detectable justifies the use of more technology and data collection in the name of public health and national security. Even before the current pandemic, we were already documenting a worldwide roll-out of migration “techno-solutionism.” These technological experiments occur at many points in a person’s migration journey. Well before people even cross a border, Big Data analytics are used to predict their movements and biometric data is collected about refugees. At the border, AI lie detectors and facial recognition have started to scan people’s faces for signs of deception. Beyond the border, algorithms have made their way into complex decision-making in immigration and refugee determinations, normally undertaken by human officers. A host of fundamental human rights are affected, including freedom from discrimination, the right to privacy, and even the rights to life and liberty.

In some cases, increased technology at the border has sadly already meant increased deaths. In late 2019, the European Border and Coast Guard Agency, commonly known as Frontex, announced a new border strategy built around EUROSUR, which relies on increased staff and new technology like drones. This strategy is similar to the Horizon 2020 ROBORDER project, which ‘aims to create a fully functional autonomous border surveillance system with unmanned mobile robots including aerial, water surface, underwater and ground vehicles.’ In the U.S., similar ‘smart-border’ technologies have been called a more ‘humane’ alternative to the Trump Administration’s calls for a physical wall. However, these technologies can have drastic results. For example, border control policies that use new surveillance technologies along the US–Mexico border have actually doubled migrant deaths and pushed migration routes towards more dangerous terrain through the Arizona desert, creating what anthropologist Jason De Leon calls a ‘land of open graves’. Given that the International Organization for Migration (IOM) has reported that more than 20,000 people have died trying to cross the Mediterranean since 2014, many in shipwrecks, we can only imagine how many more bodies will wash up on the shores of Europe as the situation in Greece and Turkey worsens.

The COVID-19 pandemic will also greatly affect refugees living in informal settlements or securitised camps. Cases have already been reported on the Greek island of Lesbos, which has received hundreds of thousands of refugees since the start of the Syrian war in 2011. Italy has also just announced that it has closed its ports to refugee ships until 31 July because of the coronavirus. However, the answer to stopping the spread of the virus is not increased surveillance through new technology, building new detention camps, and preventing NGO workers and medical personnel from accessing the camps. Instead, we need a redistribution of vital resources, free access to healthcare for all regardless of immigration status, and more empathy and kindness towards people crossing borders.

While technology can offer the promise of novel solutions for an unprecedented global crisis, we must ensure that COVID technology does not unfairly target refugees, racialised communities, Indigenous communities, and other marginalised groups, or make discriminatory inferences that can lead to detention, family separation, and other irreparable harms. Technological tools can quickly become tools of oppression and surveillance, denying people agency and dignity and contributing to a global climate that is increasingly more hostile to people on the move. Most importantly, technological solutions do not address the root causes of displacement, forced migration, and economic inequality, all of which exacerbate the spread of global pandemics like COVID-19. Unless all of us are healthy, including marginalised communities, no one is.

In times of exception like a global pandemic, the hubris of Big Tech thinking it has all the answers is not the solution to a complex global health crisis.

Read more:
Press Release: EDRi calls for fundamental rights-based responses to COVID-19 (01.04.2020)
https://edri.org/edri-calls-for-fundamental-rights-based-responses-to-covid-19/

Accountable Migration Tech: Transparency, governance and oversight (11.03.2020)
https://edri.org/accountable-migration-tech-transparency-governance-and-oversight

Emerging Voices: Immigration, Iris-Scanning and iBorderCTRL–The Human Rights Impacts of Technological Experiments in Migration (19.08.2019)
http://opiniojuris.org/2019/08/19/emerging-voices-immigration-iris-scanning-and-iborderctrl-the-human-rights-impacts-of-technological-experiments-in-migration/

The Privatization of Migration Control (24.02.2020)
https://www.cigionline.org/articles/privatization-migration-control

(Contribution by Petra Molnar, Mozilla Fellow, EDRi)

15 Apr 2020

COVID-19 pandemic adversely affects digital rights in the Balkans

By Metamorphosis

Cases of arbitrary arrests, surveillance, phone tapping, privacy breaches and other digital rights violations have drastically increased in Central and Southeast Europe as governments started imposing emergency legislation to combat the COVID-19 outbreak. Belgrade-based Balkan Investigative Reporting Network (BIRN) and the digital rights organization SHARE Foundation have started a blog titled “Digital Rights in the Time of COVID-19” documenting these developments.

In response to the coronavirus pandemic, some governments across Europe are enhancing surveillance, increasing censorship, and restricting the free flow of information. In many cases, the government-imposed restrictions flout human rights standards and do little to protect digital rights. In the Balkans, as mentioned above, incidents of digital rights violations have steadily increased. Bojan Perkov, policy researcher at the SHARE Foundation, wrote a summary of their findings, noting the following:

“The information gathered by the two organizations so far shows that the most problematic [violations] are, essentially, multiple issues involving the privacy of people who are put under quarantine, the spread of disinformation and the dangerous misconceptions regarding the virus in the online and social media networks, as well as the increase of internet scams.”

The data gathered by the two organizations through the blog’s database feature indicate that in just over the last two weeks, 80 people have been arrested, some of them jailed, for spreading fake news and disinformation, with the most draconian examples in Turkey, Serbia, Hungary and Montenegro.

One noteworthy example occurred in the Serbian city of Novi Sad, where Nova.rs journalist Ana Lalić was arrested for “upsetting the public.” Lalić had published an article describing the chaotic conditions at the Clinical Center of Vojvodina, its “chronic lack of equipment” and under-preparedness. It was the Center that then filed the complaint against her, which led to her 48-hour detention. Her arrest provoked reactions from organisations across Europe, such as EDRi member Article 19 and Freedom House.

Governments in Montenegro and Moldova made public the personal health data of people infected with COVID-19, while official websites and hospital computer systems suffered cyber-attacks in Croatia and Romania. Some countries like Slovakia are considering lifting rights enshrined under the EU General Data Protection Regulation (GDPR), while Serbia imposed surveillance and phone tracking to limit freedom of movement.

Potentially infected citizens have been obliged by law to submit to new forms of control. In Serbia, since the declaration of a state of emergency, all citizens arriving from abroad must undergo quarantine. During a March 19 press conference, Serbian President Aleksandar Vučić stated that the police are “following” Italian telephone numbers, checking which citizens use roaming and constantly tracking their locations. This was specifically aimed at members of the Serbian diaspora who returned from Italy and are supposed to self-isolate in their homes. He also warned people who leave their phones behind that the state has “another way” of tracking them if they violate quarantine, but did not explain the method.

In neighboring Montenegro, the National Coordination Body for Infectious Diseases decided to publish online the names and surnames of people who must undergo quarantine, after it determined that certain people had violated the measure, thereby “exposing the whole of Montenegro to risk.” The NGO Civic Alliance challenged this measure through a complaint submitted to the Constitutional Court of Montenegro.

In Croatia, concerned citizens developed a website, samoizolacija.hr (meaning “self-isolation”), which allegedly enabled anyone to anonymously report quarantine violators to the police. The site was subsequently shut down, and the Ministry of the Interior initiated criminal investigations against suspected violators of privacy rights.

The Crisis Headquarters of the Federation of Bosnia and Herzegovina issued a recommendation on how to publish the personal data of citizens who violate the prevention measures, as government institutions at the cantonal and local level started publishing data about people in isolation and self-isolation, including lists of people identified as infected with the coronavirus. In response, on March 24, the Personal Data Protection Agency of Bosnia and Herzegovina issued a decision forbidding the publication of personal data of citizens who tested positive for the coronavirus or were subjected to isolation and self-isolation measures.

Perkov also questioned whether these measures are effective, particularly since they put people in danger. In Montenegro, infected people whose identities were revealed on social networks have been subjected to hate speech.

“Furthermore, is the idea behind such measures the public shaming of people who disrespect the obligation for self-isolation or the reduction of number of violations? The criteria of proportionality and necessity have not been properly respected and their adequacy had not been justified.”

The above cases of publishing health data online directly violate the laws that grant such data the highest level of protection. In other words, these violations go against the laws that protect fundamental rights in the digital environment, and they do so under the guise of the COVID-19 crisis response – as if the crisis were an open invitation to break the rules of free and protected societies.

Read more:

Digital Rights in the time of COVID-19 (23.03.2020)
https://bird.tools/mapping-digital-rights-during-coronavirus-outbreak/

Serbian government revokes controversial COVID-19-related decree used as pretext to arrests journalists (02.04.2020)
globalvoices.org/2020/04/07/serbian-government-revokes-controversial-covid-19-related-decree-used-as-pretext-to-arrests-journalists/

Europe’s other Coronavirus victim: information and data rights (24.03.2020)
balkaninsight.com/2020/03/24/europes-other-coronavirus-victim-information-and-data-rights/

Montenegrin Coronavirus patients’ identities exposed online (18.03.2020)
https://bird.tools/montenegrin-coronavirus-patients-identities-exposed-online/

(In Serbian) Vučić: Ne ostavljajte telefone, nećete nas prevariti! ZNAMO da se krećete (19.03.2020)
https://mondo.rs/Info/Drustvo/a1298105/Aleksandar-Vucic-policija-telefonski-brojevi-policijski-sat-upozorenje-krecu-se.html

(In Serbian) Podnijeli inicijativu za ocjenu ustavnosti Odluke NKT-a (23.03.2020)
http://www.rtcg.me/koronavirus/crnagora/273340/podnijeli-inicijativu-za-ocjenu-ustavnosti-odluke-nkt-a.html

(Contribution by Filip Stojanovski from EDRi member Metamorphosis)

15 Apr 2020

EDRi is looking for a consultant on anti-discrimination in digital environments

By EDRi

European Digital Rights (EDRi) is a network of over 42 civil and human rights organisations from across Europe. We defend rights and freedoms in the digital environment.

Project Description

EDRi seeks a consultant to conduct research (a “mapping” exercise) into the existing points of engagement with anti-discrimination issues within the digital rights field. This project will involve providing a comprehensive picture of the key actors, initiatives and organisations related to this topic, and forming an analysis of the progress needed to reach a robust level of engagement with anti-discrimination. The deadline to apply is Thursday 30th April 2020.

Objective: To advance EDRi’s understanding of the intersections between digital rights and anti-discrimination by outlining the most relevant work being conducted in the field, specifically the key actors, organisations and activities.

Person specification: The ideal candidate has experience conducting similar sectoral mapping exercises; a familiarity with the European digital rights field; and a sound understanding of discrimination issues in the European context.

Activities

This project will consist of the following:

  • Conducting a mapping exercise outlining the key actors, organisations and activities across Europe focused on anti-discrimination issues in digital spaces. The scoping should span a wide range of data points including, but not limited to:
    • actors, from key individuals, projects and initiatives, firms, institutions, organisations and collectives;
    • forms of discrimination and exclusion, including on race/ethnicity, migration status, class, age, gender, sexual orientation, gender identity, disability, religion, from an intersectional perspective;
    • geographies (the mapping should include input from as many European countries as possible);
    • related areas: we foresee a non-exhaustive range of areas the consultant may uncover, from social welfare, policing and employment to issues with digital platforms, digital inclusion and diversity.
  • Providing an assessment of the extent of engagement with anti-discrimination issues in the digital rights field according to the research findings.
  • Engaging on a regular basis with EDRi to report on progress and findings.

Outputs: We ask that the consultant provide one report summarising the results of the mapping exercise and analysis.

Working methods: We propose that the research be conducted by a combination of:

  • desk research
  • (online) interviews
  • other methodologies at the researchers’ discretion.

Timeline

The deadline to apply to the call is Thursday 30th April 2020. The project duration is 2 months, ideally between May and June 2020. The final timeline is to be decided upon agreement. The key phases of the project are:

  • Scoping (early May 2020): consultant and EDRi undergo introductions, define the scope and methodologies
  • Research and drafting (May 2020): consultant carries out and drafts mapping
  • Feedback and Revisions (June 2020): feedback from EDRi and revisions.

Background

This project forms part of a broader process in which EDRi aims to progress toward greater inclusivity in the European digital rights movement, ensuring interconnections with a broad range of social justice issues and fully delivering on our commitment to protect the digital rights of all.

Great strides have been made to highlight the need for the technology and digital rights fields to reflect on broader social justice issues – such as discrimination, state violence, inequalities and social exclusion – and how they relate to digital rights. However, many such initiatives reside in the United States, with minimal reflection in the mainstream European digital rights environment.

Further details: The consultant will report directly to Sarah Chander (Senior Policy Adviser at EDRi). Remuneration is negotiable depending on experience.

How to apply

Send an email to Sarah at sarah.chander(at)edri(dot)org by Thursday 30th April 2020 with “Anti-Discrimination Consultant” in the subject line, with a CV and a brief paragraph outlining your suitability for the project.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive to have a diverse and inclusive working environment. We therefore encourage individual members of groups at risk of discrimination to apply for this post.

09 Apr 2020

DSA: Platform Regulation Done Right

By EDRi

In 2019, the President of the European Commission committed to upgrading the Union’s liability and safety rules for digital platforms, services and products with a new Digital Services Act (DSA). The upcoming proposal, expected at the end of 2020, would, among other things, regulate how platforms should deal with potentially illegal content that they host on their servers.

In its position paper Digital Services Act: Platform Regulation Done Right, European Digital Rights (EDRi) releases its first fundamental rights-based recommendations for the upcoming DSA. The recommendations represent the voice of 42 digital rights organisations active in Europe.

The DSA is a unique opportunity to improve the functioning of platforms as a public space in our democratic societies, to uphold people’s rights and freedoms, and to shape the internet as an open, safe and accountable infrastructure for everybody.

These recommendations are the result of eight months of collaboration within the EDRi network and beyond, including with groups that represent victims of illegal content. We look forward to engaging on this very important piece of legislation in the coming period. EDRi encourages other civil society organisations and citizens to reply to the upcoming Commission consultation and support the protection of fundamental rights online.

Read more:

Full Paper: ‘Digital Services Act: Platform Regulation Done Right’ (09. 04. 2020)
https://edri.org/wp-content/uploads/2020/04/DSA_EDRiPositionPaper.pdf


Summary ‘Digital Services Act: Platform Regulation Done Right’ (09. 04. 2020)
https://edri.org/wp-content/uploads/2020/04/DSA_EDRiPositionPaper_Summary.pdf

01 Apr 2020

Press Release: EDRi calls for fundamental rights-based responses to COVID-19

By EDRi

In a recent statement released on 20 March 2020, European Digital Rights (EDRi) calls on the Member States and institutions of the European Union (EU) to ensure that, while developing public health measures to tackle COVID-19, they:

  • Strictly uphold fundamental rights;
  • Protect data for now and the future;
  • Limit the purpose of data for COVID-19 crisis only;
  • Implement exceptional measures for the duration of the crisis only;
  • Condemn racism and discrimination;
  • Defend freedom of expression and information.

EDRi’s Head of Policy, Diego Naranjo, explains that:

EDRi supports necessary, proportionate measures, fully in line with national and international human rights and data protection and privacy legislation, taken in order to tackle the COVID-19 global pandemic. These measures must not, however, set a precedent for rolling back the fundamental rights obligations enshrined in European law.

EDRi recognises that Coronavirus (COVID-19) disease poses a global public health challenge of unprecedented proportions. The use of good-quality data can support the development of evidence-based responses. However, we are witnessing a surge of emergency-related policy initiatives, some of them risking the abuse of sensitive personal data in an attempt to safeguard public health. When acting to address such a crisis, measures must comply with international human rights law and cannot lead to disproportionate and unnecessary actions. It is also vital that measures are not extended once we are no longer in a state of emergency.

EDRi’s Executive Director, Claire Fernandez, emphasises that:

In times of crisis, our authorities and communities must show responsibility, resilience, solidarity, and offer support to healthcare systems in order to protect our lives. States’ emergency responses to the COVID-19 pandemic must be proportionate, however, and be re-evaluated at specified intervals. By doing this, states will prevent the normalisation of rights-limiting measures, scope creep, data retention or enhanced surveillance that will otherwise be harmful long after the impacts of the pandemic have been managed.

In these times of pandemic and emergency measures, EDRi expresses solidarity towards collective protection and support for our health systems. We will continue monitoring and denouncing abuses of human rights in times when people are particularly vulnerable.

Read full statement: EDRi calls for fundamental rights-based responses to COVID-19: https://edri.org/covid19-edri-coronavirus-fundamentalrights/

EDRi Members and Observers’ Responses to COVID-19:

Joint civil society statement – “States use of digital surveillance technologies to fight pandemic must respect human rights.” https://edri.org/wp-content/uploads/2020/04/Joint-statement-COVID-19-and-surveillance-FINAL1.pdf

noyb – Active overview of projects using personal data to combat SARS-CoV-2. https://gdprhub.eu/index.php?title=Data_Protection_under_SARS-CoV-2

Access Now – “Protect digital rights, promote public health: toward a better coronavirus response.” https://www.accessnow.org/protect-digital-rights-promote-public-health-towards-a-better-coronavirus-response/

Article 19 – “Coronavirus: New ARTICLE 19 briefing on tackling misinformation.” https://www.article19.org/resources/coronavirus-new-article-19-briefing-on-tackling-misinformation/

Bits of Freedom – “Privacy is geen absoluut recht, maar wel een noodzaak.” https://www.bitsoffreedom.nl/2020/03/20/privacy-is-geen-absoluut-recht-maar-wel-een-noodzaak/

Defesa dos Direitos Digitais (D3) – “A pandemia COVID19 e os direitos digitais.” https://direitosdigitais.pt/comunicacao/noticias/88-a-pandemia-covid19-e-os-direitos-digitais

Digitalcourage – “Coronavirus: Tipps fürs Onlineleben und Grundrechtsfragen.” https://digitalcourage.de/corona

Digitale Gesellschaft – “Menschenrechte gelten nicht nur in „guten“ Zeiten.” https://digitalegesellschaft.de/2020/03/menschenrechte-gelten-nicht-nur-in-guten-zeiten/

EFF – “EFF and COVID-19: Protecting Openness, Security, and Civil Liberties.” https://www.eff.org/deeplinks/2020/03/eff-and-covid-19-protecting-openness-security-and-civil-liberties

epicenter.works – “Digital rights implications of the COVID-19 crisis.” https://en.epicenter.works/content/digital-rights-implications-of-the-covid-19-crisis

GFF – “Corona und Grundrechte: Fragen und Antworten.” https://freiheitsrechte.org/corona-und-grundrechte/

Hermes Center – “Il Centro Hermes chiede al governo una risposta all’emergenza COVID-19 nel pieno rispetto dei diritti umani.” https://www.hermescenter.org/hermes-governo-emergenza-covid19-rispetto-privacy-diritti-umani/

Homo Digitalis – “Homo Digitalis για την πανδημία του Κορωνοϊού.” https://www.homodigitalis.gr/posts/5340

noyb – “Data protection in times of corona: not a question of if, but of how.” https://noyb.eu/en/data-protection-times-corona

Open Rights Group – “In the Coronavirus crisis, privacy will be compromised—but our right to know must not be.” https://www.openrightsgroup.org/blog/2020/in-the-coronavirus-crisis-privacy-will-be-compromised-but-our-right-to-know-must-not-be

Panoptykon – “Wolność i prywatność w dobie koronawirusa.” https://panoptykon.org/wiadomosc/wolnosc-i-prywatnosc-w-dobie-koronawirusa

Privacy International – “Extraordinary powers need extraordinary protections.” https://privacyinternational.org/news-analysis/3461/extraordinary-powers-need-extraordinary-protections

SHARE Foundation – “Digitalna prava, pandemija i Balkan.” https://www.sharefoundation.info/sr/digitalna-prava-pandemija-i-balkan/

01 Apr 2020

Surveillance by default: PATRIOT Act extended?

By Rafael Hernandez

On 15 March, Section 215 of the USA PATRIOT Act, and several other similar legal provisions, were due to expire and begin the process of reform and review to incorporate new legal protections of privacy. However, as a result of a coordinated effort by both chambers of the US Congress, the provisions may be extended for at least 77 days.

Section 215 was originally introduced in 2001 as part of the USA PATRIOT Act, a landmark piece of legislation passed soon after the September 11th attacks as an amendment to the Foreign Intelligence Surveillance Act of 1978 (FISA). The PATRIOT Act was designed to strengthen national security and law enforcement capabilities. It gave federal agencies like the Federal Bureau of Investigation (FBI) new and expanded competences like the permission to search a home or business without consent from the owner, indefinite detention of immigrants, etc.

Section 215 is a provision of the PATRIOT Act known as the “business records” provision. It allows the government and law enforcement agencies to order third parties to produce “specific and tangible” things such as books, records, papers, documents, and other items, when the FBI is conducting either an investigation into a “foreign intelligence,” or an investigation to protect against “international terrorism” or “clandestine intelligence activities” (even if the investigation targets US citizens). It has been at the centre of many controversies of government overreaching and privacy violations. As EDRi member the Electronic Frontier Foundation (EFF) explained:

In the hearings last year, witnesses confirmed that the 215 ‘business records’ provision may allow the government to collect sensitive information, like medical records, location data, or even possibly footage from a Ring camera.

Section 215 had been the centrepiece of Edward Snowden’s leaks to The Guardian in 2013, which revealed that the Bush and Obama administrations had been abusing the provision to obtain phone data of US citizens in bulk. It was the most egregious violation of privacy by the US government in recent history – and it happened in secret. The Snowden leaks provoked a legislative reaction by Congress with the passage of the USA FREEDOM Act, which took several measures to curtail the authority of law enforcement agencies, though it extended Section 215 almost in its entirety to the end of 2019, and later to March 2020.

The threat has not gone away

Section 215, along with at least two other provisions (the roving wiretap and lone wolf surveillance authorities), were meant to be included in FISA reform legislation designed to introduce amendments and changes that would increase protections of individual privacy against governmental intrusion. This was the hope of a host of activist groups, non-profit organizations, etc., that saw the expiration of these provisions as a chance to overhaul the information access system in the US. The reforms were timed to take advantage of FISA’s expiration date of March 15, 2020.

However, last week the House of Representatives passed a bill that essentially extended Section 215 for three more years, through 2023 – though the House bill did include several minor changes that took some of the criticism into account, such as extending prison penalties for engaging in secret surveillance. When the bill went to the Senate for final approval, however, Majority Leader Mitch McConnell (Republican) and the Senate, instead of voting on the bill and debating its proposed changes, decided to punt any decision on the proposal and unanimously passed a 77-day extension of Section 215 of the USA PATRIOT Act – though it would still be subject to opposition from recessed House members and to presidential approval. What would this extension mean? It would essentially delay any discussion of whether Section 215 will be allowed to expire and what kind of replacement parameters will be introduced.

What happens now?

It remains unclear what will happen to Section 215 now that the COVID-19 crisis has thrown the political landscape into disarray. But, as the bipartisan effort behind the USA FREEDOM Act demonstrates, the push to maintain this overbearing and invasive legislation endures. EDRi member EFF, which has been regularly advocating for privacy and legislative reform, is actively pushing for change:

It is past time for reform. Congress has already extended these authorities without reform once, without debate and without consideration of any meaningful privacy and civil liberties safeguards. If Congress attempts to extend these authorities again without significant reform, we urge members to vote no and to allow the authorities to sunset entirely.

What matters now is that this landmark legislative provision is allowed to sunset, and the reform process for the authority to access private data by law enforcement agencies begins anew. Whether we will see this hope come to fruition, however, remains to be seen.

Read more:

Reform or Expire (26.02.2020)
https://www.eff.org/deeplinks/2020/02/reform-or-expire

Enough is enough: Let it expire (18.03.2020)
https://www.eff.org/Enough-is-enough-let-215-expire

Congress extends Section 215 surveillance program (29.11.2019)
https://epic.org/2019/11/congress-extends-section-215-s.html

EPIC to Congress: End Section 215 Surveillance Program (10.12.2019)
https://epic.org/2019/12/epic-to-congress-end-section-2-1.html

Three FISA authorities sunset in December: Here’s what you need to know (16.01.2019)
https://www.lawfareblog.com/three-fisa-authorities-sunset-december-heres-what-you-need-know

What happened to FISA reform? (17.03.2020)
https://www.lawfareblog.com/what-happened-fisa-reform

(Contribution by Rafael Hernández, communications intern, EDRi)

01 Apr 2020

Competition law: what to do against Big Tech’s abuse?

By Laureline Lemoine

This is the second article in a series dealing with competition law and Big Tech. The aim of the series is to look at what competition law has achieved when it comes to protecting our digital rights, where it has failed to deliver on its promises, and how to remedy this.

Read the first article on the impact of competition law on your digital rights here.

Almost everybody uses products or online services from Big Tech companies. These companies make up a considerable part of our online life.

This concentration of power in some sectors of the digital market (think search, social media, operating systems) by a small number of companies is having devastating effects on our rights. These companies are able to grow exponentially by constantly watching us and harvesting our personal data, which they then sell to data brokers, governments and dodgy third parties. With billions of users, these companies acquire an unprecedented level of knowledge about people’s most intimate lives.

They were able to achieve this by nudging people into giving up their personal data and by artificially creating powerful network effects linked to their dominant position that keep users on a platform despite its intrusiveness. Accessing large quantities of data and creating locked-in user communities gives dominant platforms a strong competitive advantage while creating barriers to entry for competitors.

While being in a dominant position is not illegal, abusing that position is. And most Big Tech companies have been fined for abuses or are currently under investigation. Google alone had to pay 8 billion euros of fines in only three years.

And yet, in an interview given in December of 2019, Competition Commissioner Margrethe Vestager admitted that her fines have been unable to restore competition between Big Tech and smaller rivals because companies had “already won the market”.

So if fines do not work, what does? Have current antitrust laws reached their limits?

Traditional antitrust law assesses the abuse of a dominant position ex-post, when the harm has already been done, and through lengthy investigations. Several ideas to get antitrust law up to speed with the digital economy are being discussed and are worth considering.

Giving back the freedom to choose

Speed alone, however, is unlikely to solve the problem. Policy recommendations at EU and national levels highlight the need for new ex-ante measures “to ensure that markets characterised by large platforms with significant network effects acting as gate-keepers, remain fair and contestable for innovators, businesses, and new market entrants”.

The new Digital Services Act (DSA) announced by the European Commission provides an opportunity for the EU to put in place the most urgent ex-ante measures without having to go through a full reform of its long-standing antitrust rules. One key measure that EDRi and many others have been pointing at is to make dominant social media and messaging platforms interoperable. Interoperability would require platforms to open their ‘walled gardens’ to other comparable services so that different users from different platforms can connect and communicate with each other.

This would enable competitors to challenge the huge user bases of incumbent social media platforms which permit the dominance to persist, and allow a fairer redistribution of power between competitors as well as with users. Combined with the right to data portability under the General Data Protection Regulation (GDPR), consumers could regain control over their personal data as they would not feel obliged to use a second-best service just because all their friends and family use it. Interoperability has already been used as a competition remedy in the past: in the Microsoft case, the European Commission required Microsoft to open up its operating system in view of enabling third parties to offer Windows-compatible software programmes.
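As a purely illustrative sketch of how data portability between interoperable services could work in practice, the Python snippet below exports a user’s own data in an open, documented format that a competing service can read back in. The format name and fields are invented for the example and do not correspond to any existing platform API or standard:

import json
from datetime import datetime, timezone

def export_profile(user: dict) -> str:
    # Serialise a user's own data into an open format that any
    # compatible service could import.
    portable = {
        "format": "example-social-portability/1.0",   # hypothetical open schema
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "profile": {"handle": user["handle"], "display_name": user["display_name"]},
        "contacts": user["contacts"],   # globally addressable identifiers
        "posts": user["posts"],         # the user's own content only
    }
    return json.dumps(portable, indent=2)

def import_profile(blob: str) -> dict:
    # A competing service reads the same format, so switching platforms
    # does not mean losing one's contacts and content.
    data = json.loads(blob)
    assert data["format"].startswith("example-social-portability/")
    return data

# Export on platform A, import on platform B:
blob = export_profile({
    "handle": "alice@platform-a.example",
    "display_name": "Alice",
    "contacts": ["bob@platform-b.example"],
    "posts": [{"created": "2020-04-01", "text": "Hello from an open network!"}],
})
print(import_profile(blob)["profile"]["handle"])

Interoperability goes a step further than this one-off export: with a shared protocol, users on different platforms could interact continuously without anyone having to move at all.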

Moreover, mandatory interoperability would directly strengthen healthy competition among platforms and could even create whole new markets of online services built downstream or upstream, such as third-party client apps or content moderation plug-ins.

The DSA presents a huge opportunity for the EU to decide what central aspects of the internet will look like in the coming decade. By including requirements for Big Tech such as interoperability, the DSA would inject new competition into a broken market, limit the downsides of user lock-in and reduce negative network effects.

A special status for Big Tech?

Interoperability measures could also be implemented as part of a broader mechanism or scheme for dominant players.

In its contribution to the debate on competition policy and digital challenges, the French competition authority draws on suggestions from several reports and the current reform bill being discussed in Germany to propose a new mechanism for “structuring players”.

They suggest defining these players in three cumulative stages: 1. companies providing online intermediation services; 2. which hold structural market power; and 3. which play a role in access to and in the functioning of certain markets with regard to competitors, users or third parties.

This new status could also allow for new ex-post measures. Whenever one of these players implements a practice that raises competition concerns, the competition authority would be able to intervene, penalise the company, or prohibit the practice in the future. Such triggering practices could consist of hindering access to markets, preferencing their own services, using data to hamper access to a market, or making interoperability or data portability more difficult.

Beyond competition law, because of the effect they have on our rights, these companies should be required to limit some of their harmful practices, such as data extraction or message amplification. To this end, they could be subject to further obligations, such as obligations of transparency, access, non-discrimination or device neutrality. Some of these obligations already exist in the P2B Regulation addressing relations between online platforms and businesses and could be extended for public scrutiny. Others should be explicitly written into the planned Digital Services Act. Together with strong ex-ante measures, they will help the EU to limit the most damaging behaviour that dominant platforms engage in today.

Read more:

The European Commission – Shaping Europe’s digital future.
https://ec.europa.eu/info/sites/info/files/communication-shaping-europes-digital-future-feb2020_en_4.pdf

The Autorité de la concurrence’s contribution to the debate on competition policy and digital challenges.
https://www.autoritedelaconcurrence.fr/sites/default/files/2020-03/2020.03.02_contribution_adlc_enjeux_numeriques_vf_en_0.pdf

EU competition chief struggles to tame ‘dark side’ of big tech despite record fines.
https://news.sky.com/story/eu-competition-chief-struggles-to-tame-dark-side-of-big-tech-despite-record-fines-11893440

(Contribution by Laureline Lemoine, EDRi)

01 Apr 2020

Facial recognition: Homo Digitalis calls on Greek DPA to speak up

By Homo Digitalis

In the spring of 2019, the Hellenic Police signed a €4 million contract with Intracom Telecom, a global telecommunication systems and solutions vendor, for a smart policing project. Seventy-five percent of the project is funded by the Internal Security Fund (ISF) 2014-2020 of the European Commission. The Hellenic Police published a press release about the signature of this contract in December 2019, while the vendor had publicly announced it earlier, in July 2019.

Based on the technical specifications of the contract, the vendor will develop and deliver to the Hellenic Police smart devices with integrated software enabling facial recognition and automated fingerprint identification, among other functionalities. The devices will be the size of a smartphone, and police officers will be able to use them during police stops and patrols in order to check and identify on the spot individuals who do not carry identification documents with them. The police officers will also be able to take a close-up photograph of an individual’s face and collect her/his fingerprints. The fingerprints and photographs collected will then immediately be compared with data already stored in central databases, after which the police officers will get the identification results on their devices.
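For illustration only, the sketch below (in Python) shows the kind of one-to-many biometric search such devices imply: a face or fingerprint template captured during a stop is compared against every record in a central database, and candidate identities above a similarity threshold are returned. The threshold, data structures and function names are assumptions made for the example, not details of the actual system, which have not been made public:

import numpy as np

# Assumption: the real system's matching threshold is not known.
SIMILARITY_THRESHOLD = 0.8

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, central_db: dict) -> list:
    # One-to-many search: the probe template is compared against every
    # template in the central database, not just one claimed identity.
    scores = [(person_id, cosine_similarity(probe, reference))
              for person_id, reference in central_db.items()]
    matches = [s for s in scores if s[1] >= SIMILARITY_THRESHOLD]
    return sorted(matches, key=lambda s: s[1], reverse=True)

# Example: every stop triggers a search of the whole database.
db = {"person-001": np.array([0.1, 0.9, 0.2]),
      "person-002": np.array([0.8, 0.1, 0.3])}
probe = np.array([0.12, 0.88, 0.25])
print(identify(probe, db))   # [('person-001', 0.998...)]

It is precisely this one-to-many character – everyone in the database is searched at every stop – that makes such processing high-risk and triggers the impact assessment and prior consultation obligations Homo Digitalis points to below.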

The Hellenic Police claims that this will be a more “efficient” way to identify individuals in comparison to the current procedure, i.e. bringing any individuals who do not carry identification documents to the nearest police station. Based on the timetable for the implementation of the project, the devices and the related systems should be fully functional and ready for use within 20 months of signing the contract. Thus, it is anticipated that the Hellenic Police will be able to use these devices by the beginning of 2021.

Once the Hellenic Police published its press release in December 2019, EDRi observer Homo Digitalis addressed an Open Letter to the corresponding Greek minister requesting clarifications about the project. More precisely, based on the provisions of the Directive 2016/680 (LED) and the Greek Law 4624/2019 implementing it, Homo Digitalis asked the Minister of Citizen’s Protection whether or not the Hellenic Police has consulted the Hellenic Data Protection Authority (DPA) on this matter and/or conducted a related Data Protection Impact Assessment (DPIA) and what the applicable safeguards are, as well as to clarify the legal provisions that allow for such data processing activities by the Hellenic Police.

In February 2020, the Hellenic Police replied but neither confirmed nor denied that a prior consultation with the Hellenic DPA took place or that a DPIA was conducted. Moreover, Homo Digitalis claims that the Hellenic Police did not adequately reply about the applicable safeguards and the legal regime that justifies such data processing activities.

As a result of this inaction from public authorities, on March 19, 2020 Homo Digitalis filed a request for opinion to the Hellenic DPA regarding this smart policing contract. The request is based on the national provisions implementing article 47 of the LED which provides for the investigatory, corrective and advisory powers of the DPAs.

With this request, Homo Digitalis claims that the processing of biometric data, such as the data described in the contract, is allowed only when three criteria are met: 1. it is authorised by Union or Member State law, 2. it is strictly necessary, and 3. it is subject to appropriate safeguards for the rights and freedoms of the individual concerned. None of these criteria is met in this case. Specifically, there are no special legal provisions in place allowing for the collection of such biometric data during police stops by the Hellenic Police. Moreover, the use of these devices cannot be justified as strictly necessary, since the identification of an individual is adequately achieved by the current procedure. Furthermore, such processing activities use new technologies and are very likely to result in a high risk to the rights and freedoms of the data subjects. Therefore, the Hellenic Police is obliged to carry out a data protection impact assessment prior to the processing and to consult the Hellenic DPA.

Read more:
Homo Digitalis’ request for opinion to the Hellenic DPA (only in Greek, 19.03.2020)
https://www.homodigitalis.gr/wp-content/uploads/2020/03/HomoDigitalis.pdf

Press Release of Hellenic Police (only in Greek, 14.12.2019)
http://www.astynomia.gr/images/stories/2019/prokirikseis19/14122019anakoinosismartpolicing.pdf

Press Release of Intracom Telecom (02.07.2019)
http://www.intracom-telecom.com/en/news/press/press2019/2019_07_02.htm

The technical specifications of the smart policing contract (Only in Greek, 12.04.2018)
http://www.astynomia.gr/images/stories/2018/prokirikseis18/12042018-texn_prod.pdf

Homo Digitalis’ Open Letter to the Minister of Citizen’s Protection (only in Greek, 16.12.2019)
https://www.homodigitalis.gr/posts/4662

Reply to Homo Digitalis’ Open Letter by the Hellenic Police (only in Greek, 14.02.2020)
https://www.homodigitalis.gr/wp-content/uploads/2020/02.pdf

(Contribution by Eleftherios Chelioudakis, EDRi observer Homo Digitalis, Greece)
