Why COVID-19 is a Crisis for Digital Rights
The COVID-19 pandemic has triggered an equally urgent digital rights crisis. New measures being hurried in to curb the spread of the virus, from “biosurveillance” and online tracking to censorship, are potentially as world-changing as the disease itself.
These changes aren’t necessarily temporary, either: once in place, many of them can’t be undone.
That’s why activists, civil society and the courts must carefully scrutinise questionable new measures, and make sure that – even amid a global panic – states are complying with international human rights law.
Human rights watchdog Amnesty International recently commented that human rights restrictions are spreading almost as quickly as coronavirus itself. Indeed, the fast-paced nature of the pandemic response has empowered governments to rush through new policies with little to no legal oversight.
There has already been a widespread absence of transparency and regulation when it comes to the rollout of these emergency measures, with many falling far short of international human rights standards.
Tensions between protecting public health and upholding people’s basic rights and liberties are rising. While it is of course necessary to put in place safeguards to slow the spread of the virus, it’s absolutely vital that these measures are balanced and proportionate.
Unfortunately, this isn’t always proving to be the case. What follows is an analysis of the impact of the COVID-19 pandemic on the key policy areas affecting digital rights:
a) The Rise of Biosurveillance
A panopticon world on a scale never seen before is quickly materialising. “Biosurveillance” – the tracking of people’s movements, communications and health data – has already become a buzzword, used to describe some of the more worrying measures being deployed to contain the virus.
The means by which states, often aided by private companies, are monitoring their citizens are increasingly extensive: phone data, CCTV footage, temperature checkpoints, airline and railway bookings, credit card information, online shopping records, social media data, facial recognition, and sometimes even drones.
Private companies are exploiting the situation and offering rights-abusing products to states, purportedly to help them manage the impact of the pandemic. One Israeli spyware firm has developed a product it claims can track the spread of coronavirus by analysing two weeks’ worth of data from people’s personal phones, and subsequently matching it up with data about citizens’ movements obtained from national phone companies.
In some instances, citizens can also track each other’s movements – leading to not only vertical, but also horizontal sharing of sensitive medical data.
Not only are many of these measures unnecessary and disproportionately intrusive, they also raise secondary questions: how secure is our data? How long will it be kept? Is there transparency around how it is obtained and processed? Is it being shared or repurposed, and if so, with whom?
b) Censorship and Misinformation
Censorship is becoming rife, with many arguing that a “censorship pandemic” is surging in step with COVID-19.
Oppressive regimes are rapidly adopting “fake news” laws. Ostensibly, these are meant to curb the spread of misinformation about the virus; in practice, the legislation is often used to crack down on dissenting voices or otherwise suppress free speech. In Cambodia, for example, at least 17 people have already been arrested for sharing information about coronavirus.
At the same time, many states have themselves been accused of feeding disinformation to their citizens to create confusion, or of arresting those who criticise the government’s response.
Some states have also restricted free access to information on the virus, either by blocking access to health apps or by cutting off access to the internet altogether.
c) AI, Inequality and Control
The deployment of AI can have consequences for human rights at the best of times, but now, it’s regularly being adopted with minimal oversight and regulation.
AI and other automated learning technologies form the foundation of many surveillance and social control tools. Because of the pandemic, they are increasingly relied upon to fight misinformation online and to process the huge increase in applications for emergency social protection, which are, naturally, more urgent than ever.
Prior to the COVID-19 outbreak, the digital rights field had consistently warned about the human rights implications of these inscrutable “black boxes”, including their biased and discriminatory effects. The adoption of such technologies without proper oversight or consultation should be resisted and challenged through the courts, not least because of their potential to exacerbate the inequalities already experienced by those hardest hit by the pandemic.
d) Eroding Human Rights
Many of the rights-violating measures adopted to date have been taken outside the framework of proper derogations from applicable human rights instruments – derogations that would ensure emergency measures remain temporary, limited and supervised.
Legislation is being adopted by decree, without clear time limitations, and technology is being deployed in a context where clear rules and regulations are absent.
This is of great concern for two main reasons.
First, this “legislating through the back door” of measures that are not necessarily temporary bypasses proper democratic oversight and checks and balances, resulting in de facto authoritarian rule.
Second, if left unchecked and unchallenged, this could set a highly dangerous precedent for the future. This is the first pandemic we are experiencing at this scale – we are currently writing the playbook for global crises to come.
If it becomes clear that governments can use a global health emergency to institute rights-infringing measures without being challenged, and without having to reverse them – making them permanent instead of temporary – we will essentially be handing authoritarian regimes a blank cheque to wait until the next pandemic and impose whatever measures they want.
Therefore, any measure that is not strictly necessary, sufficiently narrow in scope, and of a clearly defined temporary nature needs to be challenged as a matter of urgency. If it is not, we will be unable to push back against an otherwise certain path towards a dystopian surveillance state.
e) Litigation: New Ways to Engage
In tandem with advocacy and policy efforts, we will need strategic litigation to challenge the most egregious measures through the court system. Going through the legislature alone will be too slow and, with public gatherings banned, public demonstrations will not be possible at scale.
The courts will need to adapt to the current situation – and are in the process of doing so – by offering new ways for litigants to engage. Courts are still hearing urgent matters, and questions concerning fundamental rights and our democratic system will fall within that remit. This has already been demonstrated by the first cases requesting oversight of government surveillance deployed in response to the pandemic.
These issues have never been more pressing, and it’s abundantly clear that action must be taken.
If you want to read more on the subject, follow EDRi’s new series #COVIDTech here: https://edri.org/emergency-responses-to-covid-19-must-not-extend-beyond-the-crisis/
(Contribution by Nani Jansen Reventlow, Digital Freedom Fund)