Roma & Sinti rights, Resistance & Facial Recognition: RYF in Conversation…

For communities that have been historically sidelined, the promises of digitalisation can instead become a vessel for yet more discrimination and unequal treatment. Facial recognition in particular has a dark history that links to the persecution of Romani communities. If you missed our webinar on Roma and Sinti rights and the rise of facial recognition across Europe, you can catch up here and learn what the digital rights community can and should do!

By EDRi · August 18, 2021

In April 2021, EDRi and the Reclaim Your Face campaign (RYF) hosted a webinar on “Facial recognition, Resistance and Roma & Sinti rights”. The purpose of this event was to explore, alongside Roma and Sinti rights activists/researchers who are also experts in data and digitalisation, the intersection of Romani rights with the rise of facial recognition and other forms of biometric mass surveillance across Europe.

The phrase of the day was “paradigm shift”, as Benjamin Ignac (a Romani researcher and activist from Croatia, living and working in Berlin) and Roxanna-Lorraine Witt (a Sinti person who worked for the Council of German Sinti and Roma until last year, and the founder of Save Space e.V.) took attendees through a powerful exploration of how technological developments are transforming every aspect of our lives, and why this matters for Romani and non-Romani people alike.

From criminal justice systems to policy-making, they urged everyone who cares about equality and justice to educate themselves about the risk that automation and artificial intelligence technologies will embed and further perpetuate systemic biases; to tackle digital literacy divides; to empower Romani people to take up technical and leadership roles in STEM and policy fields; and to build radical coalitions across intersectional lines that will help us challenge facial recognition’s discriminatory gaze and shift the paradigm, tackling digital exclusion in its many interconnected forms.

Romani rights are digital rights

Roma and Sinti rights are important for the digital rights community because, whilst there are many experts within Romani communities working on issues of data and digitalisation, other parts of the digital rights field often have little to no awareness of this important work. And as Benjamin points out, digitalisation can pose an additional challenge for Romani people who have long faced prejudice and marginalisation. He adds that if we don’t address this gap as a policy priority soon, it will only serve to keep Romani people on the sidelines:

Roma and Sinti people are not only socially and economically excluded, but also digitally excluded, in Europe…. The pandemic has exacerbated this. The EU is neglecting the fact that the majority of Roma … are still poorly connected to the internet, we have reduced access to digital technologies, lower levels of digital literacy. … It still baffles me how Roma people who are disconnected digitally from these new technological threats, how these people are not part of the conversation at the European level, at the national level, the NGOs level. The lack of investment in this digital aspect of Roma and Sinti rights is just going to keep us at the bottom, in a way. In the race of growth and development, we will be lagging behind if we are out of the loop.”

Roxy adds that this lack of understanding of digital technologies and what they can mean for our societies affects not just Romani individuals, but often the groups that represent them, as well as politicians:

When we are talking about digital literacy, it is not only the average Romani person, but also big organisations representing us at an international level … high-level politicians. It’s a lot of burden of shame to admit that you have no clue about this – but it’s not only Romani people but also Gadjo people [non-Romani people]. … How to access the least privileged people among our communities … as well as high-profile politicians in international politics who are not really engaged in such important topics, and how to give them digital literacy.”

Facial recognition and the legacy of historical oppression

In the first part of the webinar, we examined how facial recognition reproduces and gives a false legitimacy to techniques that come from eugenics, sometimes referred to as Nazi “race science”. As Roxy explains:

The premise of facial recognition technology and biometric data collection is that you can tie certain facial features […] to an identity. But identity is a social construction, a political, social, economic construct that is not exclusively tied to facial features. You cannot tie this to the mathematics of your face or of your genes.

I should mention the history of biometric data collection […] It has always touched discrimination and even the mass murder of Romani people. When we think about the history of biometric data collection, we have to go back to the Holocaust. […] The first human experiments were done on Roma and Sinti people in Auschwitz. How were these people identified? People […] who they estimated to be Romani and then measuring their faces with rulers. And then suddenly people were taken into concentration camps […] [as part of] an ideology and a regime that aims to completely erase Romani people from earth. […] The use of certain technology [like facial recognition] is based on premises that are – and I’m sorry to say it – but are just bullshit.”

Uses of biometric mass surveillance today – like facial recognition in public spaces – are reproducing these same assumptions and prejudices. Marginalised and minoritised groups are repeatedly used as the ‘testing grounds’ for new technologies. Benjamin continued by explaining how these sorts of false assumptions about identity are linked to wider issues of discrimination and persecution, including on the basis of people’s personal data:

I hate that I need to live in a world where I feel like I have to hide my Roma identity because this very identity can be used against me […] Having governments using this identity or data about Roma in that way is totally unacceptable. We should be proud of our identity […] [But] we have plenty of examples that in the wrong hands, data about Roma will be used against us.”

He also tied this mass gathering of people’s facial and bodily data to structural issues of government surveillance, informed by the philosophy of the “panopticon”:

European governments have a fetish, let’s say, for surveillance. In Germany, police retrospectively used footage from the G20 protests to identify protesters. Why have the freedom to go to a protest if this can be used against you? Control is easier if you watch people and they know they are being watched.

The scary thing is that the infrastructure is already here….it’s already being used … in many cases, we do not even know it is happening.”

This is leading to a perfect storm of racialised policing and racist justice systems; a recent report has shown the extent of anti-Roma discrimination in European criminal justice systems.

Whose safety are we talking about?

Another key issue exposed in this webinar was the fact that whilst discourse about the need for public facial recognition is often centred on “safety” and “security”, this framing exacerbates existing structures of inequality, in which certain lives are assigned less value, deemed less important and less deserving of safety – and even constructed as a threat to other people’s safety.

Everything from facial recognition tech to search algorithms is coded to tie Romani identities to criminality, showing that the proliferation of biometric mass surveillance technologies does not have Romani people’s safety in mind. In fact, it can exacerbate structural racism and other forms of discrimination against marginalised groups. Roxy explains:

The argument for the collection of the data, for that surveillance … is safety. Whose safety are we talking about? Romani people’s safety it isn’t. … We are not the perpetrators in history.

[These practices are] based on the premises of the developers and the safety definition of the developers. … This is why the intersectional perspective is so important, [otherwise] it will always be based on the safety of white people. … We need to start to discover and dismantle the ideologies underlying the technology. … The whole system is made for white people and white people’s safety. … We have to shift the paradigm of who is criminal and who is not.”

This allowed us to think through some really deep structural questions: what does a non-biased world look like? Is it possible? What does non-biased even mean in reality, and how would we get there? Both Benjamin and Roxy used this opportunity to flag how issues of biometric mass surveillance intersect with other structural inequalities: Ben, for example, has researched the problems of algorithmic risk scoring used to decide prison sentences. And Roxy pointed to the inhumanity of using facial recognition technology at borders to refuse safe transit for people fleeing from threats to their lives – which can include Romani people if they are marked out as ‘criminal’ by facial recognition algorithms.

We also know that biometric mass surveillance repeatedly focuses on identifying and prosecuting petty crimes (parking fines, littering, loitering) over serious state and white-collar crimes (financial crimes, genocide, war). By unpacking this, we may be able to start shifting the discriminatory gaze that has been encoded through the use of facial recognition tech.

The Future

Through this conversation we were able to identify many threats and risks posed by facial recognition technology. The discussion about discrimination and inequality also allowed us to look at the points at which we might be able to drive positive change. Benjamin notes that one way to do this could be by resisting the use of algorithms for criminal justice purposes and other harmful uses (as EDRi has also advocated):

There’s a lot of strain on the legal system in Europe … which makes justice not always fast. With the technological boom that we’re in, this paradigmatic shift, it’s going to make things slower. [The criminal justice system is] not ready to deal with the incoming cases. Roma NGOs and civil society are equally disconnected from the decision-making, which makes us in a way sitting ducks in this situation. We are just waiting for the things to happen, and it’s particularly challenging. Technological racism, technological discrimination is particularly challenging for realising Roma rights”

Roxy further points to the opportunities offered by taking an intersectional approach, from which the digital rights community can learn:

Romani people can be black, can be LGBTIQ people, can be women, can be children, can be whatever. So there is a broad range of communities tied to this issue when we are talking about Romani rights and the digital environment”

So whose responsibility is it to include the Romani communities? It’s the responsibility of those who are privileged enough. So tech experts, tech companies, maybe from the side of private companies, maybe from the side of policy-makers, they need to empower Romani communities to have a say in this, and everything that that means: to give up resources – or to share resources with them – share privileges, share digital literacy resources, support those communities and legislators in their digital emancipation and take away the shame from it. … [It’s ridiculous that] Elon Musk is trying to get to Mars while other people are like ‘I barely know how to use my iPhone!’”

So why should anyone who cares about equality and justice care about the rise of facial recognition and biometric mass surveillance? Benjamin draws a comparison to the climate crisis, which humans have known about for 100 years but have only recently started to act upon en masse, when they should have done so much sooner:

Now is the time to protect our communities, build our justice system, contribute to legislation. We all need to have a voice in which direction we want technology [to go]: efficiency-centred, or human-centred with fundamental rights as a cornerstone?”

Benjamin and Roxy concluded that there are many tangible steps that we can take to achieve this: ensuring that Romani people are in policy positions and leadership roles; increasing digital literacy and removing the shame for those who currently feel left behind; conducting more research to truly understand the gaps; and funding, coalition-building, education and empowerment of Romani NGOs. By better understanding the issues facing Romani communities, we will all be better equipped to fight for digital rights.

If you enjoyed this webinar, you might like our other resources on Romani rights and biometric mass surveillance, including a podcast from the Romani tea room, and the first Reclaim Your Face video in a Romani dialect. See our blog “Roma rights and biometric mass surveillance” for all this and more.

Ella Jakubowska

Policy Advisor

Twitter: @ellajakubowska1