At a glance: Does the EU Digital Services Act protect freedom of expression?
The Digital Services Act is in many ways an ambitious piece of legislation that seeks to make ‘Big Tech’ accountable to public authorities through significant new transparency and due diligence obligations. It also contains many provisions that could help protect users’ fundamental rights. Whether it will succeed in protecting freedom of expression from undue restrictions, or in reining in the power of Big Tech rather than cementing it, is, however, questionable. EDRi's member ARTICLE 19 shares its first thoughts on why.
On 15 December 2020, the European Commission published its long-awaited proposal for a Digital Services Act (DSA). It is the culmination of several years of grappling with the difficulties inherent in the dissemination of illegal content online and growing concerns about the amplification of ‘toxic’ content and disinformation. In particular, the DSA seeks to consolidate various separate pieces of EU legislation and self-regulatory practices that address online illegal or ‘harmful’ content. It also seeks to harmonise the rules applicable to the provision of digital services across the EU rather than having a patchwork of potentially conflicting legislation such as Germany’s NetzDG or a revived version of the Avia Law in France.
The DSA, however, goes further than consolidation and harmonisation. It is in many ways an ambitious piece of legislation that seeks to make ‘Big Tech’ accountable to public authorities through significant new transparency and due diligence obligations. On this count, it may well succeed. It also contains many provisions that could help protect users’ fundamental rights. Whether it will succeed in protecting freedom of expression from undue restrictions, or in reining in the power of Big Tech rather than cementing it, is, however, questionable. Here are ARTICLE 19's first thoughts on why; the organisation will explore specific aspects of the DSA in more detail in subsequent blog posts.
What’s good
The DSA contains many positive provisions that, if adopted, could significantly improve the transparency and accountability of internet intermediaries for the decisions they make to remove or otherwise restrict access to content. Indeed, the language used throughout the DSA pays much more heed to the protection of fundamental rights (11 mentions) than previous EU legislation or initiatives in this area. After many years of civil society struggling to be heard on the protection of human rights in files such as the 2016 EU Code of Conduct on Countering Illegal Hate Speech, the 2019 Copyright Directive or, more recently, the draft Terrorist Content Online Regulation, the Commission finally seems to be listening.
By far the most positive aspect of the draft DSA is that it retains the cornerstones of free speech protection, namely conditional immunity from liability for hosting providers and the prohibition on general monitoring in the E-Commerce Directive, now rebranded as Articles 5 and 7 of the draft DSA. This is consistent with the various European Parliament reports published in the run-up to the Commission’s proposal, but it was by no means a foregone conclusion. ARTICLE 19 will make sure that it stays that way.
The second positive aspect of the draft DSA is the wide-ranging transparency obligations it places on internet intermediaries, online platforms and very large online platforms (VLOPs) respectively:
- All intermediaries will have to produce transparency reports, including information about: (i) the number of orders received from national authorities, categorised by the type of illegal content concerned, and the time taken to act on them; (ii) the number of notices received under the notice-and-action procedure set out in Article 14 of the DSA, any action taken, whether that action was taken on the basis of the law or the provider's terms and conditions, and the average time needed to take action; (iii) content moderation engaged in at the provider's own initiative that affects the availability, visibility and accessibility of information, categorised by the type of reason and basis for those measures (Article 13).
- In addition, all online platforms must submit information about the number of disputes filed with out-of-court dispute settlement bodies, the outcome of those disputes and the time taken to resolve them. They must also provide information about the application of suspension measures in response to the posting of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints. Finally, they must provide information about any use of automated means for the purposes of content moderation, including a specification of the precise purposes, indicators of the accuracy of the filters used and the safeguards applied (Article 23). Platforms that display advertising on their interfaces must also ensure that the recipient of the service can clearly identify that the information displayed is an advertisement and the natural or legal person behind it. Recipients of the service must also be given meaningful information about the main parameters used to determine the recipients of such advertisements (Article 24).
- For their part, VLOPs are required to provide users with clear information about the main parameters used in their recommender systems (Article 29). They are also subject to additional online advertising transparency requirements, including the creation of publicly available repositories of the ads they display. They must publish their self-assessments of the systemic risks on their platforms, the mitigation measures they have adopted in response, the independent audit report on those measures, as well as an audit implementation report. Given the Commission’s role in ensuring standardisation, one can anticipate that transparency reporting will become more streamlined, making it easier to compare how social media companies handle content moderation on their platforms. The data access and scrutiny provision (Article 31) is also a significant improvement in the quest for real transparency from platforms and for assessing their response to the challenges thrown up by content moderation.
Thirdly, the draft DSA requires all hosting providers to provide a statement of reasons for their decisions. The Commission proposes to manage a database containing those statements of reasons, a potentially very significant step towards ensuring scrutiny of the decisions made by hosting providers. Online platforms are further required to put in place an internal complaints mechanism and to take part in out-of-court dispute settlement mechanisms, and their use of both is reflected in their transparency obligations. Again, this represents important progress on the protection of users’ due process rights.
Finally, the DSA adopts a Good Samaritan type of provision for the first time (Article 6), indicating that it recognises that content moderation is an ongoing, developing effort. ARTICLE 19 had been advocating for this, as it believes that content moderation by social media companies has benefits and is indeed desirable in many instances. In its view, companies should be encouraged to innovate in their content moderation practices, for example through the use of labels, the provision of contextual information in relation to disinformation, demonetisation, or the disabling of certain features in certain instances.
ARTICLE 19 recognises, however, that the flipside of this is that social media companies are likely to rely on automation to carry out content moderation at scale. For this reason, it believes that human rights impact assessments and greater transparency about the false negatives and false positives of filters are crucial. The DSA proposal seems to recognise this by, among other things, requiring online platforms to be transparent about their use of automated means to carry out content moderation, the indicators of accuracy of those means and any safeguards applied (Article 23 (1) (c)). Notwithstanding this, ARTICLE 19 notes that Article 6 remains ambiguous, since it only promises that internet intermediaries will not lose immunity from liability ‘solely’ because they carry out voluntary measures on their own initiative or because they seek to comply with the requirements of Union law. That raises the question of the circumstances in which adopting voluntary measures, combined with some other, undefined, measures, might lead internet intermediaries to lose immunity from liability.
What’s bad
Perhaps the most disappointing part of the DSA is the new proposed notice-and-action mechanism. It is disappointing because it stands in striking contrast with the repeated calls from European politicians that social media companies should not be making the types of decisions that Facebook and Twitter took when they suspended then-President Trump’s account indefinitely following the U.S. Capitol attack of 6 January. Article 14 of the DSA effectively empowers hosting providers to make decisions about the legality of content upon receipt of a substantiated notice of alleged illegality. Since substantiated notices constitute actual knowledge (Article 14.3) for the purposes of the hosting immunity under Article 5, hosting providers have a strong incentive to remove content upon notice.
It is true that the DSA’s proposals contain a number of safeguards for the protection of freedom of expression: for instance, users must explain why they believe the notified content is illegal (Article 14.2.a). In practice, this should mean that when complaining about a defamatory post, complainants should explain why the content is not justified by reference to certain defences (e.g. fair comment). Moreover, online hosts have to provide a statement of reasons for whatever decision they make (Article 15). Online platforms must also adopt measures against misuse, including manifestly ill-founded notices (Article 20.2). Nonetheless, it is inevitable that many hosting providers and platforms will simply not be able to hire teams of lawyers to scrutinise every notice they receive. In practice, it will be easier for them to remove content to avoid any liability risk.
Another concern with Article 14 is that it applies to hosting providers across the board. That raises the spectre of some infrastructure players, such as cloud computing services, being made subject to this new procedure. This is deeply disquieting since infrastructure services such as cloud hosting providers and others are essential for the exercise of freedom of expression online.
The second most controversial provision in the DSA is Article 26, which lays down new due diligence obligations for VLOPs. In particular, Article 26 requires VLOPs to conduct risk assessments, at least once a year, of their content moderation systems, recommender systems and systems for selecting and displaying advertisements in relation to:
- (a) the dissemination of illegal content;
- (b) any negative effects for the exercise of fundamental rights, particularly the rights to privacy and data protection, freedom of expression, the prohibition of discrimination and the rights of the child; and
- (c) intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security.
This is incredibly vague. Although the indirect reference to human rights impact assessments under Article 26 (b) is welcome, there is no guidance on how conflicts between Article 26 (a) and (b) might be resolved. In practice, it also leaves an enormous amount of discretion to both companies and, ultimately, the Commission to decide how those risks should be mitigated. This is borne out by Article 27, which identifies possible measures that might be adopted, such as “adapting content moderation or recommender-systems” or “initiating or adjusting cooperation with other platforms through Codes of Conduct and crisis protocols”. Although reference is made to ‘reasonable’ and ‘proportionate’ measures, the proposal says nothing about the kinds of measures that the Commission might ultimately consider proportionate and reasonable to address a given systemic risk.
And here lies another disquieting aspect of the DSA proposal. While Chapter IV provides that oversight would in principle be carried out by independent regulators called Digital Services Coordinators, very large online platforms are subject to supervision by the European Commission. Whilst this appears to reflect lessons learnt by the Commission from the lack of enforcement of the General Data Protection Regulation (‘GDPR’), and is also reminiscent of competition law enforcement, the fact remains that the European Commission is not an independent regulator. It is the EU’s executive arm. In other words, oversight of very large online platforms is ultimately not independent.
What’s missing
For many digital rights groups, the absence of any provisions addressing the business model of platforms based on behavioural advertising is a real missed opportunity. The DSA also fails to include provisions that would ensure the unbundling of hosting from content curation, as advocated by ARTICLE 19. This could yet be improved by amending Article 29 on recommender systems. To begin with, the requirement of greater transparency about the main parameters of recommender systems should apply to all online platforms, as should the obligation to offer users options to modify or influence those parameters. Moreover, recommender systems that are not based on profiling should be the default option. Finally, ARTICLE 19 believes that Article 29 could be significantly improved by allowing third parties to provide alternative recommender systems on VLOPs. According to ARTICLE 19, this would foster real exposure diversity and better choices for users.
What’s next
A key area of action for ARTICLE 19 will be to ensure that discussions about the Digital Services Act are not disconnected from those on the Digital Markets Act. A very real concern with the DSA proposal is that it could help cement the dominance of large players without sufficiently addressing their power or setting limits on a business model based on the massive collection of personal data, profiling and targeted advertising. The challenges thrown up by content moderation are considerably magnified by the fact that the vast majority of public discourse takes place on a very small number of platforms, which hold excessive power over information flows. Content moderation rules alone are insufficient to address this problem: rules are also needed to open markets to new platforms and to decentralise the channels of public discourse. This is why the DSA and the DMA need to be examined together, and why the EU needs to be far more ambitious and seek to promote a truly decentralised Internet through robust pro-competition tools in the DMA.
The DSA is a commendable effort to make the largest online platforms accountable for the way in which they carry out content moderation. As the debate moves to the European Parliament and Member States continue to develop their positions on the DSA, ARTICLE 19 will engage with MEPs and national ministries to raise its concerns and make recommendations for the better protection of freedom of expression.