
COVID-Tech: Surveillance is a pre-existing condition


By EDRi · May 27, 2020

In EDRi’s series on COVID-19, COVID-Tech, we explore the critical principles for protecting fundamental rights while curtailing the spread of the virus, as outlined in the EDRi network’s statement on the virus. Each post in this series tackles a specific issue at the intersection of digital rights and the global pandemic in order to explore broader questions about how to protect fundamental rights in a time of crisis. In our statement, we emphasised that “measures taken should not lead to discrimination of any form, and governments must remain vigilant to the disproportionate harms that marginalised groups can face.” In this third post of the series, we look at surveillance – particularly of marginalised communities – and situate the current measures in their longer-term trajectory.

One minor highlight in this otherwise bleak public health crisis is that privacy is trending. Now more than ever, conversations about digital privacy are reaching the general public. This is a vital development as states and private actors pose ever greater threats to our digital rights in their responses to COVID-19. The more they watch us, the more we need to watch them.

One concern, however, is that these debates have siphoned this new attention to privacy into a highly technical, digital realm. The debate is dominated by the mechanics of digital surveillance: whether we should have centralised or decentralised contact tracing apps, and how Zoom tracks us as we work, learn and do yoga at home.

Although important, this is only a partial framing of how privacy and surveillance are experienced during the pandemic. Less prominently featured are the various other privacy infringements being ushered in as a result of COVID-19. We should not forget that for many communities, surveillance is not a COVID-19 issue – it was already there.

The other sides of COVID surveillance

Very real concerns about digital measures proposed as pandemic responses should not overshadow the broader context of mass-scale surveillance emerging before our eyes. Governments across Europe are increasingly rolling out measures to physically track the public, via telecommunications and other data, without explicit reference to how this will impede the spread of the virus, or when the use and storage of this data will end.

We are also seeing the emergence of bio-surveillance dressed in the clothing of a public health response. From the Polish government’s app mandating the use of geo-located selfies, to talk of using facial biometrics to create immunity passports to facilitate the return of workers in the UK, governments have used, and will continue to use, the pandemic as a cover to get into our homes, and closer to us.

Yet physical surveillance techniques feature less prominently in media coverage. Such measures are – in many European countries – coupled with heightened punitive powers for law enforcement. Police have deployed drones in France, Belgium and Spain, and communities in cities across Europe are feeling the pressure of an increased police presence in their neighbourhoods. Heightened measures of physical surveillance cannot be accepted at face value or ignored. Instead, they must be viewed in tandem with new digital developments.

Who can afford privacy?

These measures are not neutral in their harms. In unequal societies, surveillance will always target racialised[1] people, migrants, and the working classes. These groups bear the burden of heightened policing powers and punitive ‘public health’ enforcement – being more likely to need to leave the house for work, to take public transport, to live in over-policed neighbourhoods, and in general to be perceived as suspicious, criminal, and in need of surveillance.

This is a privacy issue as much as it is an issue of inequality. For some, the consequences of intensified surveillance under COVID-19 mean heightened exposure to the virus through direct contact with police, increased monitoring of their social media, the anxiety of constant sirens and, in the worst cases, the real bodily harm of police brutality.

In the last few days, Romani communities in Slovakia have reported numerous cases of police brutality, some against children playing outside. Black, brown and working class communities across Europe are experiencing the physical and psychological effects of being watched even more than usual. In Brussels, where EDRi is based, a young man died in an encounter with the police during raids.

This vulnerability is also economic – for many, privacy is a scarce commodity. It is purchased by those who live in affluent neighbourhoods and those with ‘work from home’ jobs. Those who cannot afford privacy in this more basic sense will, unfortunately, not be touched by debates about contact tracing. For many, digital exclusion means that measures such as contact-tracing apps are completely irrelevant. Worse, if future measures in response to COVID-19 are designed on the assumption that we all use smartphones or have identity documents, they will be immensely harmful.

These measures are being portrayed as ‘new’, at least in our European ‘liberal’ democracies. But for many, surveillance is not new. Governmental responses to the virus have simply brought to the general public a reality that has been reserved for people of colour and other marginalised communities for decades. Long before COVID-19, European governments were deploying technology and other data-driven tools to identify, ‘risk-score’ and experiment on groups at the margins, whether by predicting crime, forecasting benefit fraud, or assessing whether or not asylum applicants are telling the truth based on their facial movements.

We need to integrate these experiences of surveillance into the mainstream privacy debate. These conversations have been sidelined or explained away with the logic of individual responsibility. For example, last year, in a public debate on technology and the surveillance of marginalised communities, one participant swiftly moved the conversation away from police profiling and toward privacy literacy. They asked the room of anti-racist activists: “Does everybody here use a VPN?”

Without a holistic picture of how surveillance affects people differently – of the vulnerabilities of communities and the power imbalances that produce them – we will easily fall into the trap of believing that quick-fix solutions can guarantee our privacy, and that surveillance can be justified.

Is surveillance a price worth paying?

If we don’t root our arguments in people’s real-life experiences of surveillance, not only do we devalue the right to privacy for some, but we also risk losing the argument to those who believe that surveillance is a price worth paying.

This narrative is a direct consequence of an abstract, technical and neutral framing of surveillance and its harms. Through this lens, infringements of privacy are minor, necessary evils. As a result, privacy will always lose the false ‘privacy vs health’ trade-off. We should challenge the trade-off itself, but we can also ask: who will really pay the price of surveillance? How do people experience breaches of privacy?

Another question we need to ask is: who profits from surveillance? Numerous companies have shown their willingness to enter public-private alliances, using COVID-19 as an opportunity to market surveillance-based ‘solutions’ to issues of health (often with dubious claims). Yet, again, this is not new – companies like Palantir, contracted by the UK government to process confidential health data during COVID-19, have a much longer-standing role in the surveillance of migrants and people of colour, and in facilitating deportations. Other large tech companies will use COVID-19 to continue their expansion into areas like ‘digital welfare’. Here, deeply uneven power relationships will be further cemented by the introduction of digitalised tools, making them harder to challenge and posing ever greater risks to those who rely on the state. If unchallenged, this climate of techno-solutionism will only increase the risk of new technologies being tested on, and data being extracted from, marginalised groups for profit.

A collective privacy

There is a danger in viewing surveillance as exceptional, as a feature of COVID-19 times. It suggests that protecting privacy is only newsworthy when it is about ‘everyone’ or ‘society as a whole’. What that means, though, is that we don’t really mind if a few people don’t have privacy.

Surveillance measures and other threats to privacy have countless times been justified for the ‘public good’. Privacy – framed in abstract, technical and individualistic terms – simply cannot compete, and ever greater surveillance will be justified. This surveillance will be digital and physical and everything in between, and profits will be made. Alternatively, we can fight for privacy as a collective vision – something everybody should have. Collective privacy is not exclusive or abstract – it means looking further than how individuals might adjust their privacy settings, or how privacy can be guaranteed in contact tracing apps.

A collective vision of privacy means contesting ramped-up police monitoring and the use of marginalised groups as guinea pigs for new digital technologies, as well as ensuring that new technologies have adequate privacy protections.

It also requires us to ask: who will be the first to feel the impact of surveillance? How do we support them? To answer these questions, we need to recognise surveillance in all its manifestations, including those that long predate the outbreak of COVID-19.

Original illustration by Miguel Brieva, licensed under CBNA 2020, La Imprenta, included in “Que No Haya Sido en Vano”.

Read more:

Telco data and Covid-19: A primer (21.04.20)
https://privacyinternational.org/explainer/3679/telco-data-and-covid-19-primer

Slovak police officer said to have beaten five Romani children in Krompachy settlement and threatened to shoot them (29.04.20)
http://www.romea.cz/en/news/world/slovak-police-officer-said-to-have-beaten-five-romani-children-in-krompachy-settlement-and-threatened-to-shoot-them

Amid COVID-19 Lockdown, Justice Initiative Calls for End to Excessive Police Checks in France (27.03.20)
https://www.justiceinitiative.org/newsroom/amid-covid-19-lockdown-justice-initiative-calls-for-end-to-excessive-police-checks-in-france

Digital divide ‘isolates and endangers’ millions of UK’s poorest (28.04.20)
https://www.theguardian.com/world/2020/apr/28/digital-divide-isolates-and-endangers-millions-of-uk-poorest

The EU is funding dystopian Artificial Intelligence projects (22.01.20)
https://www.euractiv.com/section/digital/opinion/the-eu-is-funding-dystopian-artificial-intelligence-projects

A Price Worth Paying: Tech, Privacy and the Fight Against Covid-19 (24.04.20)
https://institute.global/policy/price-worth-paying-tech-privacy-and-fight-against-covid-19

COVID-Tech: Emergency responses to COVID-19 must not extend beyond the crisis (15.04.20)
https://edri.org/emergency-responses-to-covid-19-must-not-extend-beyond-the-crisis/

COVID-Tech: COVID infodemic and the lure of censorship (13.04.20)
https://edri.org/covid-infodemic-and-the-lure-of-censorship/

Footnotes

  1. This term refers to racial, ethnic and religious minorities, emphasising that racialisation is a structural process inflicted on people, groups and communities.

(Contribution by Sarah Chander, EDRi senior policy advisor)