Digital rights for civil society and civil society for digital rights: how surveillance technologies shrink civic spaces

Digital technology has transformed civic spaces – online and offline. In our digital societies, characterised by injustice and power imbalances, technology contributes to shrinking civic spaces. And to defend civic spaces against surveillance, we need strong and resourced civil society organisations and movements.

By EDRi · June 28, 2023

Digital technology has transformed civic spaces – the environment that enables people to come together and shape their societies – online and offline. We can now organise through powerful new means that hold much potential for civil society to influence change. However, the way technology is used by state and private actors can affect how we access and prioritise information, our ability to communicate and organise, and how we make informed decisions and challenge decisions that affect our lives. In our digital societies, characterised by injustice and power imbalances, technology contributes to shrinking civic spaces. And to defend civic spaces against surveillance, we need strong and resourced civil society organisations and movements.

Protecting and advancing digital rights needs open and safe civic spaces

The spaces for civil society to operate in Europe have been ‘shrinking’ – a trend recognised by the European Commission and the European Parliament. Examples range from Hungary passing a law that imposes strict restrictions on foreign-funded NGOs to push independent civil society organisations (CSOs) out of the country, to France dissolving the anti-Islamophobia NGO Collective Against Islamophobia in France and the climate justice group ‘Les Soulèvements de la Terre’.

Even at the EU level, the European Commission proposed introducing restrictions on NGOs receiving funding from third-country donors as part of the ‘Defence of Democracy Package’. These attempts have been delayed thanks to an outcry from CSOs across the continent.

Civil society’s space is shrinking as we challenge state and industry actors who abuse their power. In many fields, such as technology policy, this happens in a context where industry has privileged access and ever-larger budgets to influence decision-making. Just this week, the Irish government introduced a proposal that would ‘muzzle critics of the Data Protection Commission’ – a move very much in favour of Big Tech. In Brussels, as regulators were becoming aware of the risks and harms of AI, OpenAI lobbied to water down the EU AI Act (and succeeded in parts). Big Tech’s spending on lobbying EU lawmakers has grown to the point where it now exceeds all other private-sector lobby spending. Meanwhile, the people bearing the brunt of technological harms – marginalised and over-criminalised groups – seldom have access to decision-making on tech policy. Unlike Big Tech, digital rights NGOs struggle to access information, meet EU Commissioners or influence negotiations held behind closed doors.

Attacks on digital rights are bad news for civic spaces

Online platforms – and in particular social media dominated by a handful of companies – have now been recognised as having a clear impact on information democracy and civic spaces. Commercial digital infrastructure has amplified disinformation and toxic content, preventing the safe expression and participation of all, especially women of colour, human rights defenders, journalists and politicians. The EU Digital Services Act and Digital Markets Act have brought new opportunities to limit the power of Big Tech, but we need more systemic change to transform the currently dominant surveillance business model.

Some AI systems also deter safe organising and harm our ability to make decisions. Remote biometric identification in public spaces has a chilling effect on freedom of expression and assembly. The use of broader surveillance technologies in public spaces by police and intelligence agencies limits people’s ability to participate in public, social or democratic activities. Law enforcement’s use of AI to predict human behaviour and the likelihood of crime (predictive policing), as well as emotion recognition, are discriminatory uses of AI. They deny the essence of our agency and our dignity, and should be banned in democratic societies. EDRi and a broad civil society coalition will continue pushing in the EU AI Act negotiations for accountability for actors deploying automated decision-making.

Private and safe online communications, and privacy-enhancing tools and technologies, are also prerequisites for democracy and enablers of rights and civic space. For journalists, human rights defenders, organisers and over-criminalised groups, digital security is vital. Yet the European Commission and some Member States are plotting to undermine end-to-end encryption and proposing the scanning of our devices in the name of combating crime and protecting children. We must resist handing technosolutionist tools of control to state or private actors for complex societal issues. This doesn’t just detract from effectively and comprehensively addressing these issues; it also disrupts civil society and further restricts space for marginalised groups.

EDRi will continue to operate as the independent civil society watchdog in Europe, pushing for technology policy in the interest of people. We want meaningful participation of civil society, transparency of decision-making processes, access to documents and the political will to rein in power.