How security policy hijacks the Digital Single Market
On 22 August, when Politico published a leaked internal European Commission document outlining policy plans for the upcoming mandate from all corners of the EU’s executive branch, Brussels was on high alert. Although the document is an internal note, not an official Commission position, it isn’t irrelevant: its purpose is to brief the incoming Commissioners on important policy dossiers and to make recommendations about what to do with them. The digital policy chapters it contains are definitely informative.
You won’t be surprised to see that the Directorate-General for Communications Networks, Content and Technology (DG CNECT), the Commission’s department for all things digital, focuses on completing the Digital Single Market and, perhaps most importantly, its own new pet project: the Digital Services Act (DSA).
The DSA will likely become the next major internet regulation battleground. It will regulate social media platforms and all kinds of online services, including the legal responsibilities they will have for content uploaded by users.
Ill-conceived rules threaten people’s freedoms
For a start, DG CNECT formulates a number of positive ideas, for instance on the use of digital technology to reduce Europe’s environmental footprint, and on an ambitious regulatory framework for artificial intelligence with a strong focus on protecting fundamental rights. We welcome both proposals and encourage EU policy makers to work with civil society to achieve those goals, allowing for the thorough debate these issues need and taking into consideration the tools we already have at hand, such as strong enforcement of the General Data Protection Regulation (GDPR) and adoption of the ePrivacy Regulation.
In addition, the document includes a chapter on the planned Digital Services Act in which it suggests imposing a “duty of care” on online services that deal with illegal and “harmful content” on their systems. The stated goal is to address issues such as effective content moderation, online political advertising, disinformation, and the protection of minors.
While EDRi would prefer the DSA to regulate only illegal content, rather than vaguely defined “harmful” material, we’re glad to see that DG CNECT explicitly recognises the risk that “heavy, ill-conceived rules” pose to media pluralism, freedom of expression and other fundamental rights. That recognition matters, because their colleagues over at the Directorate-General for Migration and Home Affairs (DG HOME) take a very different approach.
Burning the house to roast the pig
Technically, DG HOME is responsible for topics such as migration, human trafficking, cybercrime and terrorism. Yet the department dedicated more than a quarter of its policy ideas to DG CNECT’s Digital Services Act.
DG HOME calls its contribution “For a safe and responsible use of the internet”. According to its authors, today’s internet is mostly a lawless place used for “identity theft, ransomware, child pornography, incitement to terrorism, organising crime making use of more and more encrypted environment.”
To bring order to this online wild west, DG HOME’s internet specialists propose that, in the future, all platform companies should take “proactive measures”, also known as upload filters, to prevent the “criminal abuse of their services.” Sound familiar? That’s because DG HOME’s “responsible use of the internet” looks dangerously similar to the general monitoring obligation of the new EU Copyright Directive. Apparently it needs to be emphasised, once again, that the European Court of Justice has consistently ruled that such general monitoring obligations violate our fundamental rights and are therefore illegal. Experts frequently add that, taken alone, filters are also ineffective at tackling illegal online content.
DG HOME’s proposal also includes an obligation for online platforms to shut down accounts that display illicit or harmful (yet legal) content such as disinformation — an idea that would turn companies into arbiters of truth.
DG CNECT and DG HOME have to consult each other and cooperate on files of shared interest; that is normal procedure. But the apparent attempt to hijack the framing of the DSA before the legislative proposal is even written is staggering. A Digital Single Market file with major fundamental rights implications should not be pushed into the security policy sphere, where it risks being abused to curtail people’s fundamental rights.
European Commission internal document on proposed priorities
https://www.politico.eu/wp-content/uploads/2019/08/clean_definite2.pdf
More responsibility to online platforms – but at what cost? (19.07.2019)
https://edri.org/more-responsibility-to-online-platforms-but-at-what-cost/
E-Commerce review: Opening Pandora’s box? (20.06.2019)
https://edri.org/e-commerce-review-1-pandoras-box/
Filters Incorporated (09.04.2019)
https://edri.org/filters-inc/