The EU Media Freedom Act’s suspension period for content moderation will harm users

In December 2023, the European Parliament and Member States’ representatives negotiated a controversial special status for media outlets in the European Media Freedom Act (EMFA): their content cannot be removed from big tech platforms for up to 24 hours, even when it violates community standards intended to protect users.

By Electronic Frontier Foundation (EFF) (guest author) · January 17, 2024

Fostering Media Plurality: Good Intentions

In 2022, the EU Commission presented the European Media Freedom Act (EMFA) as a way to bolster media pluralism in the EU. It promised increased transparency about media ownership, and safeguards against government surveillance and the use of spyware against journalists—real dangers that Electronic Frontier Foundation (EFF) and other EDRi members have warned against for years.

A final deal was struck on 15 December 2023, and it calls the Commission’s ambitions into question. On spyware, although a broad national security carve-out was ultimately rejected, Member States succeeded in having their Treaty competences in security and defence matters reaffirmed, preserving their options to spy on reporters under the pretext of national security. This raises considerable concerns about the (legalised) use of spyware against journalists. The political agreement on EMFA’s content moderation provisions is similarly troubling, as it could erode public trust in media and jeopardise the integrity of information channels. Under the agreement, very large online platforms must notify media companies before deleting or restricting their content and give them 24 hours to respond.

Content Hosting by Force: Bad Consequences

Millions of EU users trust that online platforms will act on content that violates community standards. Yet despite concerns raised by EFF and other civil society groups, Article 17 of the EMFA enforces a 24-hour content moderation exemption for media, effectively forcing platforms to host content against their own rules.

This “must carry” rule prevents large online platforms such as X or Meta (owner of Facebook, Instagram, and WhatsApp) from removing or flagging media content that breaches community guidelines. If the deal becomes law, it could undermine equality of speech, fuel disinformation, and threaten marginalised groups. It also raises serious concerns about government interference in editorial decisions.

Imagine signing up to a social media platform committed to removing hate speech, only to find that EU regulations prevent platforms from taking any action against it.

According to the EMFA, platforms must instead create a special communication channel to discuss content restrictions with news providers before any action is taken. This approach not only undermines platforms’ autonomy in enforcing their terms of use but also jeopardises the safety of marginalised groups, who are often targeted by hate speech and propaganda. It could also allow orchestrated disinformation to remain online, undermining one of the EMFA’s core goals: providing more “reliable sources of information to citizens”.

Bargaining Hell: Platforms and Media Companies Negotiating Content

Not all media providers will receive this special status. Media actors must self-declare their status on platforms and demonstrate adherence to recognised editorial standards or affirm compliance with regulatory requirements. Platforms must ensure that most of this self-reported information is publicly accessible. Article 17 also includes safeguards related to AI-generated content, with specifics yet to be disclosed. This new mechanism puts online platforms in a powerful yet precarious position: deciding on the status of a wide range of media actors.

“The approach of the EU Media Freedom Act effectively leads to a perplexing bargaining situation where influential media outlets and platforms negotiate over which content remains visible.”

Christoph Schmon, EFF International Policy Director


Media outlets have strong pecuniary interests in pursuing such a fast-track communication channel and ensuring that their content remains visible at all times, potentially at the expense of smaller providers.

What’s next?

It’s positive that EU representatives listened to some of our concerns and added language to safeguard media independence from political parties and governments. Negotiators also stipulated in Article 17 that the EU Digital Services Act remains intact and that platforms are free to shorten the suspension period in crisis situations.

However, while the agreement has been lauded as a “huge win for media freedom” (Rapporteur Sabine Verheyen, EPP), we remain concerned about the enforcement reality and the potential exploitation of the self-declaration mechanism. Alongside EDRi, we will track the next legislative steps as technical details are hashed out before the deal is formally approved by the EU Parliament and the Council in the coming months. Make no mistake: we will closely monitor how the EU Media Freedom Act is applied in practice and work to ensure that it neither undermines the equality of speech and democratic debate nor endangers vulnerable groups.

Contribution by: Christoph Schmon, International Policy Director, EDRi member, Electronic Frontier Foundation (EFF)

In 2024, we will continue building a world where we use technology to live with dignity and prosperity.
