The Digital Omnibus: A step back from the brink, but the risks remain

A first Council compromise on the Digital Omnibus removes several of the most dangerous changes that were originally proposed to the GDPR and the ePrivacy Directive. This is a welcome development; however, important risks remain. Some amendments could still weaken safeguards in practice, while new provisions on AI development and access to terminal equipment remain unresolved. Simplification should strengthen rights, not dilute them. The safest way to protect Europe’s digital rulebook is to reject the Digital Omnibus entirely.

By EDRi · March 17, 2026

The state of play

Recent discussions in the Council of the EU on the Data Omnibus – the part of the Digital Omnibus package that amends the General Data Protection Regulation (GDPR), the ePrivacy Directive and other aspects related to non-personal data – suggest that some of the most dangerous elements of the European Commission’s proposal may be losing support. That is welcome news. However, even with these improvements, the broader direction of the Digital Omnibus as a whole remains deeply problematic. Simplification should not come at the expense of people’s fundamental rights.

A compromise that removes the worst proposals

The proposal for the Digital Omnibus raised serious concerns among digital rights organisations, civil society as a whole, as well as academics, regulators, decision-makers, and people in the EU. Several amendments would alter core elements of the GDPR, including the definition of personal data, the definition of scientific research, the safeguards around automated decision-making, and the legal framework governing pseudonymisation.

Recent discussions among Member States suggest a move away from these proposals, as we could see in a draft of the first compromise text. For example, the changes to the definition of personal data have been dropped. The attempt to expand the definition of scientific research has been removed. The proposal to weaken Article 22 – one of the GDPR’s key safeguards against harmful automated decision-making – is no longer present. Plans to give the European Commission extensive powers to redefine pseudonymised data have also been removed.

These developments show that policymakers are listening to concerns raised by civil society, academics, data protection authorities and many other experts. They are an important step in the right direction.

Yet serious risks remain

Despite these improvements, several elements of the Digital Omnibus still raise concerns.
While the Council has taken important steps to limit some of the risks in the Commission’s proposal, several elements of the compromise text, though improvements, remain problematic. The remaining amendments could create legal uncertainty and weaken important safeguards in practice. Provisions affecting transparency obligations, the exercise of data subject rights, and the use of sensitive data in AI systems are particularly concerning. These changes should therefore be deleted. If they remain in the text, they must be substantially strengthened and surrounded by clear safeguards that fully preserve the protections established by the GDPR.

The proposal also still contains new provisions that could fundamentally reshape how personal data is used in the development of AI systems. In particular, the introduction of a specific rule allowing companies to rely on “legitimate interest” when developing and operating AI systems risks creating a presumption that large-scale data reuse is acceptable. This would undermine the careful balance that the GDPR establishes between innovation and the protection of fundamental rights.

Simplification should strengthen rights, not weaken them

The Digital Omnibus is presented as a simplification exercise. In reality, it aligns with a broader deregulation agenda.

The GDPR was designed to provide strong and future-proof safeguards in an ever-changing digital environment. Weakening these protections in the name of simplification risks undermining trust in digital markets and weakening people’s rights at a time when data-driven technologies are becoming more powerful and pervasive. The same logic applies to the ePrivacy Directive, which complements the GDPR by protecting the confidentiality of communications and regulating access to information stored on people’s devices. Diluting these safeguards would weaken one of the core pillars of Europe’s digital rulebook.

Instead, simplification should focus on improving enforcement, supporting regulators, and clarifying existing obligations where necessary. It should not be used as a vehicle to reopen fundamental safeguards that protect people from the harmful uses of their personal data.


A dangerous path for Europe’s digital rulebook

The debate around the Digital Omnibus is also part of a broader political trend. In recent years, the European Union has adopted a number of important digital laws aimed at protecting people’s rights in the digital economy. Reopening the foundations of these frameworks through Omnibus “simplification” packages or “harmonisation” attempts risks weakening the entire digital rulebook.

Even if the current Council discussions remove some of the most harmful proposals, the underlying logic of the Digital Omnibus remains the same: reducing companies’ obligations rather than strengthening accountability. Simplification for whom? Certainly not for regulators, and surely not for people!

For this reason, the European Union should reconsider the approach altogether. Protecting fundamental rights must remain the guiding principle of Europe’s digital policy. The best way to safeguard the integrity of the GDPR, the ePrivacy Directive, and the wider digital rulebook is simple: the Digital Omnibus should be rejected.