Digital rights as a security objective: Fighting disinformation
Violations of human rights online, most notably of the right to data protection, can pose a real threat to electoral security and fuel societal polarisation. Yet, we are still missing a comprehensive understanding that commercial and political misuse of digital tools are two sides of the same coin – and that we can neither afford to weigh digital rights against economic benefits, nor to encroach on citizens' liberties through ever-new ways of filtering and monitoring. In this series of blog posts, we explain the relationship between digital rights and security objectives. The first part of the series focuses on the role of strong privacy regimes in building resilience against disinformation.
The Cambridge Analytica scandal, which exposed the use of information from millions of Facebook users for a perverse form of granular political micro-targeting, revealed that the tracking, monitoring and assessment of citizens' online behaviour could be used for more than stalking people with commercial advertising. In a much more sinister twist, we had to realise that targeting individuals with pinpoint advertising based on inferences from their personal data opened the door to election interference and disinformation campaigns on an unprecedented scale.
Moreover, it suddenly became technically possible to motivate one's most dedicated followers with glorifying information about one's own candidacy or organisation, while simultaneously targeting people inclined to support another movement with messages designed to suppress their intention to vote. To a great extent, such practices exploit and amplify the already worrying tendency of social networks and internet platforms to create "echo chambers", in which citizens are only presented with news and content that mirrors their own political views, rather than reflecting the diversity of opinions within societal debate. When such echo chambers are targeted with dedicated political messages, their already explosive societal potential is set aflame.
In October 2018, the European heads of state called on the European Commission to "protect the Union's democratic systems and combat disinformation, including in the context of the upcoming European elections, in full respect of fundamental rights". Ironically, many of these same governments are currently blocking, in the negotiations in the Council of the European Union, the ePrivacy proposal – the very Regulation that was designed to prevent the most pervasive forms of commercial surveillance, such as online tracking and the snooping on emails and chat messages. Even big technology corporations are starting to recognise this dangerous potential. However, there seems to be remarkably little reflection in EU policy circles that it is ultimately the commercialisation of personal data on the internet that has brought about these developments. Everyone seems to agree that disinformation should be tackled, but the same online tracking, clickbait news and social media profiling that are used to create disinformation and radicalisation should, apparently, not be regulated more tightly – so as not to harm the "European" (in fact mostly US-based) data economy.
Tracking cookies are among the primary tools of behavioural targeting on the internet, including for political purposes, and metadata constitutes one of the most sensitive and most easily processed forms of data. Not dealing with these issues while pretending to care about disinformation is absurd.
If we are to stop treating the symptoms and instead tackle the root cause of the problem, a strong privacy regime must finally be recognised as a measure of individual and public safety.
Council continues limbo dance with the ePrivacy standards (24.10.2018)
Five Reasons to be concerned about the Council ePrivacy draft (26.09.2018)
EU Council considers undermining ePrivacy (25.07.2018)
Your ePrivacy is nobody else’s business (30.05.2018)
e-Privacy revision: Document pool (10.01.2017)
(Contribution by Yannic Blaschke, EDRi intern)