EU adopts Digital Trade Agreement with Singapore despite warnings: a setback for digital rights and democratic oversight

The European Parliament has approved the EU–Singapore Digital Trade Agreement, rejecting a motion to seek a Court of Justice opinion on its legality. This decision weakens the Union’s capacity to safeguard privacy, data protection, and accountability over software systems, at a time when deregulation pressures are increasing across Europe.

By EDRi · November 17, 2025

A missed opportunity for legal certainty and accountability

On 13 November 2025, the European Parliament approved the EU–Singapore Digital Trade Agreement (DTA) and rejected a motion that would have referred the deal to the Court of Justice of the European Union (CJEU) for a compatibility check under Article 218(11) of the Treaty on the Functioning of the European Union (TFEU). Supported by civil-society organisations, the motion sought clarity on whether the agreement’s provisions on data flows and source code comply with the Charter of Fundamental Rights and the EU’s existing digital laws.

Instead of providing that legal clarity, the Parliament’s decision leaves the Union with a binding trade agreement whose effects on fundamental rights remain uncertain. Once in force, the DTA will commit both parties to avoiding restrictions on cross-border data flows and to prohibiting requirements for access to or transfer of source code, except under narrowly defined trade exceptions.

Data flows treated as a trade issue, not a rights issue

At the heart of the problem lies the DTA’s framing of cross-border data flows. The text adopts a trade-law logic: data must flow freely unless a restriction can be justified as ‘necessary’ and ‘no more trade-restrictive than required.’ This reverses the approach of the General Data Protection Regulation (GDPR), which allows data transfers only when fundamental rights can be guaranteed.

The European Data Protection Supervisor (EDPS) has warned that this structure creates legal uncertainty regarding the Union’s position on protecting personal data in connection with EU trade agreements. Nevertheless, the Parliament accepted it. By doing so, the EU risks letting future disputes over privacy protections be decided not under EU constitutional law, but under trade panels applying commercial criteria.

This is not a minor technicality. It sets a precedent for upcoming negotiations with other countries and may undermine adequacy decisions, the key mechanism ensuring that EU data protection standards accompany EU data.

Source code secrecy undermines transparency and accountability

The DTA’s Article 11 prohibits the EU and its Member States from requiring access to the source code of software or algorithms as a condition for trade. Although it lists several exceptions, these remain subordinated to trade proportionality tests and could make oversight difficult or even impossible.

This clause directly affects the enforcement of the AI Act, the Digital Services Act (DSA), and the Digital Markets Act (DMA), all of which rely on regulators’ ability to inspect how systems operate. It also risks impeding progress on the right to repair, environmental design rules, and workplace algorithmic accountability.

This provision is not limited to artificial-intelligence systems but covers all software-based mechanisms, from recommendation engines to predictive-policing tools. Without access to code or logic, regulators cannot verify compliance, researchers cannot audit bias, and people affected by automated decisions cannot effectively seek redress.

A dangerous precedent in a period of deregulation

The adoption of the Singapore DTA comes at a moment when the European Commission is advancing a ‘simplification’ or deregulation agenda that threatens to reopen the GDPR and endanger the EU digital rulebook. In this context, embedding weak safeguards in trade law creates a dual lock-in: not only do rights become harder to enforce, but revising the rules would risk violating trade obligations.

This shift turns trade policy into a tool that limits democratic choices. Once the DTA is in force, the EU will no longer be free to adapt its laws on AI accountability, consumer protection, or workers’ rights if those laws are deemed too restrictive for digital trade.

The DTA is therefore not merely a technical trade instrument. It is a structural constraint on future legislation, a way of fixing today’s standards in place even as technology and public needs evolve.

Beyond digital rights: the need to break institutional silos

The debate around the DTA has also revealed a deeper structural problem in how the EU handles complex, cross-cutting issues. Trade, digital policy, and labour rights are treated as separate silos, each handled by different committees and directorates, with little coordination. This institutional fragmentation makes it nearly impossible to address the full social and rights impact of agreements that cut across all these domains.

For instance, provisions on data flows directly affect workers’ rights when employers use algorithmic management or biometric tracking tools. Yet trade negotiators rarely consult labour experts, and parliamentary committees working on employment or fundamental rights are not systematically involved before a deal reaches ratification.

In an era of pervasive digitalisation, this approach no longer makes sense. Every trade rule that touches data, software, or AI has implications for privacy, non-discrimination, labour protection, and environmental sustainability. Treating these as separate policy silos weakens the Union’s ability to uphold its own legal and ethical standards.

What is at stake

The EU–Singapore DTA is likely to serve as a template for future agreements. Its flaws – vague exceptions, absence of adequacy safeguards, and barriers to transparency – could become the global norm if not challenged now. This would erode the EU’s credibility as a champion of fundamental rights and regulatory autonomy.

For people in the EU, it means fewer guarantees that their personal data will remain protected once transferred abroad. For communities in partner countries, it risks entrenching surveillance-friendly and deregulated digital economies. For regulators and civil society, it means diminished oversight over the algorithms and infrastructures shaping everyday life.

What should happen next

EDRi and its members, alongside other CSOs, will continue to monitor the implementation of the DTA and advocate for stronger safeguards in all future digital trade negotiations. The European Parliament and Council should demand that any new digital trade text:

  • reinstates the 2018 horizontal provisions on cross-border data flows and privacy;
  • removes or redrafts source-code restrictions to guarantee regulatory access and accountability;
  • and ensures that trade rules are consistent with the Charter of Fundamental Rights and the entire EU digital rulebook.

Ultimately, the EU must stop treating digital trade as a separate technocratic field. Trade policy is not neutral: it shapes the conditions under which fundamental rights, democratic oversight, and social justice can be realised.