Digital Services Act: The EDRi guide to 2,297 amendment proposals
Committees in the European Parliament have tabled so many amendments to the Digital Services Act (DSA) that EDRi is today publishing a guide to help Members of the European Parliament navigate the proposals that would create a successful, open and rights-respecting European digital sphere.
This EDRi policy paper guides Members of the European Parliament in identifying which amendments benefit people and thereby foster an open, rights-respecting European digital sphere. At the same time, it points out the amendments that would merely protect the profits of Big Tech or enable abuses of power. The paper builds on policy positions previously established by the EDRi network and anchors them in the current debates and proposals.
As the starting point for any rights-respecting internet regulation, EDRi strongly recommends protecting the conditional liability regime. Online intermediaries, and especially social media platforms, do need to be regulated, but pushing them towards unverifiable, hyper-fast removal of content that someone has merely alleged to be illegal will not help the EU address the harm these platforms are causing.
The DSA should enable people to control the kind of online content they read, watch and share. Currently, there are no limits on how platforms’ algorithms disseminate, amplify or suppress potentially harmful online content for their billions of users worldwide. Instead, these algorithms are optimised to maximise user ‘engagement’, that is, to keep people clicking, liking and sharing, no matter the consequences.
Facebook’s own research has shown “that content that is hateful, that is divisive, that is polarizing, it’s easier to inspire people to anger than it is to other emotions”, explains Facebook whistleblower Frances Haugen, who has alerted the public to the harms these algorithms inflict. Algorithms optimised for user engagement will tend to promote hateful content and disinformation, even when platforms take specific measures to remove such content (as Facebook claims to do). “Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they’ll click on fewer ads, they’ll make less money,” Haugen says.
The DSA is a huge opportunity for the EU to become a global leader in modern online platform regulation. To achieve this, however, it must look beyond mandating quick content deletion and fix what’s really broken: the attention-grabbing, user-exploiting business model of most of today’s monopolistic, hyper-centralised, and ad-driven social media platforms. We can do better than this. But to build a healthier online ecosystem for Europe, the DSA must not tear down our fundamental rights.
- EDRi: “Digital Services Act / Digital Markets Act: Document pool” (last update 22.10.2021)
- Amnesty International: “Facebook Files: How a ban on surveillance advertising can fix Facebook” (20.10.2021)
- Panoptykon Foundation: “Algorithms of trauma: New case study shows that Facebook doesn’t give users real control over disturbing surveillance ads” (06.10.2021)
- EDRi, Access Now & Civil Liberties: “Warning: the EU’s Digital Services Act could repeat TERREG’s mistakes” (06.10.2021)
- Panoptykon Foundation: “Can the EU Digital Services Act contest the power of Big Tech’s algorithms?” (02.08.2021)
- ARTICLE 19: “At a glance: Does the EU Digital Services Act protect freedom of expression?” (10.03.2021)