
The EU’s attempt to regulate Big Tech: What it brings and what is missing

This week, the European Commission has proposed two long-awaited pieces of digital legislation, the Digital Services Act and the Digital Markets Act. Despite a number of good provisions, there are also major shortcomings which must be addressed to guarantee the protection of digital rights.

By EDRi · December 18, 2020

 


Designed by: Bitteschoen.tv/Hertie School

This week, the European Commission has proposed two long-awaited pieces of digital legislation that are so important that some have called them a “new constitution” for the internet. Building on the legacy of the General Data Protection Regulation (GDPR), they have been presented as a way to address the accumulation of political, economic and social power in the hands of a few Big Tech companies.

The first proposal, dubbed the Digital Services Act (DSA), aims to create a rulebook for how platforms such as Facebook and YouTube should handle content that has been signalled to them as illegal. The second proposal, called the Digital Markets Act (DMA), seeks to empower European authorities to better prevent anticompetitive behaviour by the so-called digital gatekeepers. According to the DMA, gatekeepers are dominant platform providers with a “significant impact on the internal market” that serve as an “important gateway for business users to reach end users”, such as search engines, social networking services, certain messaging services, operating systems, and online intermediation services.

DSA: Delete first, think later

As a piece of horizontal legislation, the DSA attempts to address illegal content and govern the content moderation practices of social media platforms – in answer to legitimate concerns that the internet is not always a safe place for all.

After the legislative disasters of the Copyright Directive and the Terrorist Content Online Regulation, the Commission now seems to be more aware of the risks that regulation of the freedom of expression entails. The DSA proposal thus maintains the current rule according to which companies that host other people’s digital content are not liable for that content unless they actually know it is illegal.

Unfortunately, however, the Commission appears to have created a far-reaching exception to that rule: As soon as anybody on the internet flags any content as potentially illegal, liability kicks in, requiring the hosting company to “expeditiously” remove or disable access to the content in question. Removing or disabling flagged content therefore becomes the most commercially reasonable action for companies, as it avoids the legal liability risk that comes with an actual legality assessment. This heavy-handed approach would create a system of privatised content control with arbitrary rules beyond judicial and democratic scrutiny. We will work closely with the European Parliament and the Council to insert the safeguards needed to better protect human rights.

The DSA proposal follows the principle “delete first, think later”.

We welcome that the Commission has picked up our demand to require online platforms to work with independent, certified dispute settlement bodies with which users can lodge complaints when they believe their content has been wrongfully removed. The transparent certification scheme run by Digital Services Coordinators can ensure that only trustworthy organisations are allowed to provide out-of-court settlement to users and online platforms. The DSA needs to make sure that access to those settlement bodies remains affordable and does not turn into a new big business for law firms and consulting corporations.

Effective moderation of harmful online content is intrinsically linked to the way that content is amplified. While the DSA proposal contains a number of good provisions on transparency around online advertising, it completely fails to address the problems inherent in the toxic ad tech business. Without any limits on micro-targeted online manipulation through ad tech (and with a strong ePrivacy Regulation nowhere to be seen), the constant surveillance of people across the internet for the purpose of online advertising remains the norm. We count on the European Parliament to follow through on its own demand to phase out hyper-invasive surveillance advertising in Europe.

DMA: Opening the gates

We like the Commission’s introduction of an unequivocal list of prohibited practices for gatekeepers. It is high time that the law prevents already dominant tech companies such as Apple, Facebook and Google from abusing their virtually unlimited resources and gatekeeper power to crush (or buy out) competitors.

The prohibitions on gatekeepers self-preferencing their own offerings (think: Google Search listing Gmail first when you search for email providers), or re-using people’s personal data in other products (like Facebook copying your WhatsApp address book over to Facebook), are crucial steps to rein in the biggest platforms’ power over us.

It is disappointing, however, that the list of prohibited conduct is drafted to cater solely to the needs of “business users” while ignoring the rights and needs of ordinary people. That might also explain why the Commission only partially included EDRi’s and other experts’ recommendation to open up some of the gatekeepers’ core services to competing firms, limiting it to very restricted cases. This provision will not enable users of privacy-friendly social networks, for example, to contact their friends on Facebook without having a Facebook account of their own.

The DMA proposal is an important step, but it won’t help people break free from the centralised platform economy that dominates their online experiences today. More work will be needed to achieve systemic change and put people in control of their lives in the digital world.

Strong interoperability requirements designed with people’s needs in mind should allow users to connect across services instead of being held in Big Tech’s cages. Luckily, there are easy ways for the European Parliament and the Council to fix this omission and turn the DMA into a real law for the people.

EDRi looks forward to working with the EU co-legislators to address these gaps and deliver on a promise that meets the needs of people and democracies.

Correction: An earlier version of this article said the DSA would oblige users to use the platforms’ internal complaint mechanism before being allowed to appeal to out-of-court settlement schemes. This does not appear to be the case and has been corrected in this article.
