Informing the Disinfo Debate: A Policy Guide for Protecting Human Rights


By EDRi · December 20, 2021

The 2016 US presidential election and the Brexit referendum campaign shed light on how impactful online disinformation and propaganda can be. EU policymakers sought solutions to mitigate the effects of online disinformation ahead of the 2019 EU elections, and continued to do so during the COVID-19 pandemic and in preparation for the 2024 European elections. Disinformation itself, however, is nothing new; what is new is the role digital technologies play in creating, disseminating, and amplifying it.

Today, 20 December 2021, EDRi, Access Now and Civil Liberties Union for Europe publish a joint report that builds on its 2018 predecessor, Informing the “Disinformation” Debate. The 2018 report was among the first by civil society organisations to point to platforms’ problematic business models as a fundamental factor behind the online manipulation of people’s economic and political choices.

There is now a growing consensus, reflected in a large number of policy analyses, that regulatory approaches must address platforms’ business models as a foundational matter. In this report, we unpack the main manipulative methods that platforms engage in that harm fundamental rights. These methods stem directly from the platforms’ business models and severely impact the freedom of thought and the freedom to form an opinion, both of which are absolute rights. They are:

  1. Surveillance-based advertising, including political advertising; and
  2. Amplification of disinformation online via content recommender systems and the personalisation of news content.

The main outcome of this report is a set of policy recommendations addressed to the EU co-legislators focusing on: how to effectively mitigate fundamental rights risks that result from the manipulative methods deployed by large online platforms that exploit people’s vulnerabilities and their sensitive data; and how to combat disinformation in a manner that is fully compliant with fundamental rights standards.

The complexity of tackling disinformation has been widely recognised by academia, policymakers, civil society organisations, and human rights advocates. Disinformation, online or offline, is a multifaceted societal issue that cannot be resolved with quick fixes. So far, the European Union has failed to put forward effective solutions. The EU’s focus, including in the Code of Practice on Disinformation, on identifying concrete categories of online content for removal, and on metrics of success that reward the quick takedown of large quantities of content, has clearly missed the mark. The approach is evolving, however: a number of legislative proposals in the EU and beyond now target how content is distributed, personalised, and curated by very large online platforms as part of manipulative, data-driven business strategies to increase profit. These proposals include the recently proposed EU Regulation on political advertising.

We urge EU co-legislators to adopt a holistic approach when developing a new model of human rights-centric platform governance, consisting of: effective enforcement of existing legislation, above all the General Data Protection Regulation (GDPR); swift adoption of the proposed e-Privacy Regulation; and full incorporation of fundamental rights safeguards into the negotiations of the Digital Services Act (DSA) and the Digital Markets Act (DMA).