Regulating Big Tech in Europe with the Digital Services Act & Digital Markets Act

The EU’s latest flagship laws, the Digital Services Act (DSA) and the Digital Markets Act (DMA), are in force, the regulatory structure is (slowly) being set up, the first Big Tech companies are suing in court, and the European Commission is throwing a party (yes, really). But what does this mean for people as platform users, and what’s coming next?

By EDRi · August 2, 2023

It was not your usual grey Brussels backroom roundtable under the Chatham House rule. When the European Commission invited hundreds of people, including corporate lobbyists, civil society, reporters, academics, and regulators, to a DSA Stakeholder Event entitled “Shaping the Future of Digital Services”, they were thinking big.

On 27 June 2023, the European Commission threw a full-day party with a colourful, pop music-filled main stage, a large networking area, and over a dozen thematic breakout rooms filled with heavyweight expertise and lobbying power. Given our significant contribution to and impact on both laws, EDRi staff were there as well, including as a speaker on a panel about protecting freedom of speech and media pluralism. It was a nice day, but isn’t it too early to celebrate?

What do the DSA and DMA have in store for me as a user?

Just as the General Data Protection Regulation (GDPR) set the global standard for data protection law, the DSA and DMA are arguably the most important pieces of digital legislation regulating platforms in recent years. The DSA has been called the “constitution for the internet,” while the DMA represents Europe’s landmark attempt to rein in digital market abuses. For years, EDRi has worked extensively alongside European Union (EU) policymakers to ensure human rights-centric regulation and to reaffirm the open internet as a public good. This work includes the implementation and enforceability of both pieces of legislation.

In April, the EU Commission formally designated 17 online platforms and search engines as “very large” under the DSA, i.e. those with more than 45 million users in the EU. These include obvious candidates like YouTube, Facebook, and Google Search, but also some lesser-known services like AliExpress, Pinterest, and Zalando.

On top of the full list of obligations under the DSA (like enabling users to flag potentially illegal content and applying terms and conditions that people without a law degree can understand), those very large players now also need to:

  • Undergo annual compliance audits carried out by external auditing companies;

  • Provide the EU Commission with annual risk assessments analysing each platform’s individual risk factors, such as the dissemination of illegal content and gender-based violence, as well as negative effects on fundamental rights, civic discourse, electoral processes, public security, public health, minors, and users’ general physical and mental well-being;

  • Allow researchers who have been vetted by independent authorities to access platform data for research into systemic platform risks;

  • Stop processing the personal data of known minors as well as so-called ‘special category data’ for targeting online ads.

The above is an incomplete list, but it demonstrates that with the DSA in force, running a global centralised corporate social media platform with billions of users might finally get the price tag it deserves.

In parallel, many of the very same Big Tech companies have been obliged to self-declare as “gatekeepers” under the DMA, which applies to very large corporations that provide important and entrenched core platform services like social networks, video sharing platforms, operating systems, or web browsers. By early July, seven corporations had reported meeting the gatekeeper thresholds, namely Alphabet (Google), Amazon, Apple, ByteDance (TikTok), Meta (Facebook, Instagram, WhatsApp), Microsoft, and Samsung.

As gatekeepers, these companies will have to follow a long list of 21 obligations and prohibitions targeting practices known to have anti-competitive effects. This includes, for example, an obligation for Apple to enable the use of third-party app stores on iPhones, so that iPhone users are free to install the software they want instead of being limited to the apps pre-selected by Apple. It also includes an obligation for WhatsApp and other gatekeeper chat apps to make themselves interoperable with smaller rivals, so that users of other messaging apps can reach people on WhatsApp without needing to open an account there.

Okay, but really, when does all this start?

In order to make all those new obligations for providers and rights for users a reality, the EU must build enforcement muscle. Technically, the two laws are already in force, but in practice the EU Commission, as principal enforcer, is still working on setting up that muscle.

As a first step, it has hired dozens of new staff for its enforcement departments and created a European Centre for Algorithmic Transparency, tasked with providing the Commission with expert knowledge on algorithmic systems like content recommendation, ad selection, and news feeds.

The Commission also ran a series of public and semi-public workshops on various enforcement issues in order to gather views from affected parties and experts. Topics ranged from how to implement messenger interoperability to how to prevent gatekeepers from re-using our personal data across different services, and included a civil society round-table on risk assessment best practices under the DSA.

The only thing that has not happened yet is an actual enforcement action, and that’s for two reasons. The DMA can only be enforced once the EU Commission has formally designated corporations as gatekeepers, which is set to happen by 6 September 2023. Under the DSA, the nitty-gritty of enforcement will require not only the Commission as EU-wide enforcer but also the Digital Services Coordinators, independent national enforcement authorities to be set up by each EU member state. And since member states have until early 2024 to appoint those authorities and equip them with resources, little enforcement action is to be expected on the DSA front before that date either.

What will happen next?

That does not mean the EU Commission and the Digital Services Coordinators that have already been appointed have to sit idle. They can, and absolutely should, already start collecting evidence about possible infringements of the new laws. As of September, nothing stops the EU Commission from walking into Google’s HQ to verify if and how our personal data is being re-used across various services in breach of the DMA. Already now, it can question Facebook staff about what measures the company has taken to stop using sensitive data and minors’ personal data to target people with surveillance ads.

To inform this work, enforcers should actively seek out expertise from civil society and affected parties about past and likely future breaches of the rules. Certainly, rival search engines like DuckDuckGo and Mojeek have stories to tell about how Google uses dark patterns to prevent users from switching. Independent app developers (not the Big Tech-funded developer lobby groups) can explain to the EU Commission how Apple and Google deliberately make it hard for them to publish their apps anywhere other than the gatekeepers’ app stores.

And, as recently proposed by technology law professor Dr Suzanne Vergnolle, civil society groups that have studied the use of social media platforms for commercial surveillance, disinformation, and the targeted manipulation of public opinion are important partners to advise the Commission on the complex effects that centralised mega-platforms have on both technology and society.

EDRi and its member organisations will continue to push the EU Commission and national Digital Services Coordinators for the most effective and timely enforcement of the Digital Services Act and the Digital Markets Act, to make sure that people can truly enjoy the fundamental rights protections promised by those laws. This also means ensuring that any new legislative proposals targeting content moderation policies, online advertising rules, and digital markets respect and build on the DSA and DMA, and protect fundamental rights.

Jan Penfrat

Senior Policy Advisor

Mastodon: @ilumium

Explore EDRi’s work on the EU’s Media Freedom Act and Political Ads Regulation