How tech corporations like Google, Meta and Amazon should assess impacts on our rights
A new policy paper by EDRi members ECNL and Access Now outlines key recommendations on how to make fundamental rights impact assessments meaningful under the European Union's Digital Services Act (DSA).
The DSA, the EU regulation on illegal content, transparent advertising, and disinformation on online platforms, sets obligations for very large online platforms or search engines like Google, Meta, and Amazon to safeguard freedom of expression and access to information, and to stop the spread of illegal online content.
New obligations for tech corporations
It has been almost a year since the DSA was adopted, but companies and policymakers are still determining who must follow which rules laid down by the law. The European Commission has designated 19 tech companies as either “very large online platforms” (VLOPs) or “very large online search engines” (VLOSEs). These platforms and service providers must now fulfil additional obligations to safeguard freedom of expression and access to information, and to stop the spread of illegal online content.
One of these requirements is for companies to understand, assess, and mitigate any risks to fundamental rights stemming from their services. But ensuring and enforcing this work in a meaningful way will be challenging:
- The DSA does not set harmonised rules for fundamental rights impact assessments (FRIAs).
- There is still no consensus on what constitutes a high-quality assessment.
Without clear expectations towards big tech, there is a risk that FRIAs become merely a formality, rather than a meaningful mechanism.
Human rights are more than a “tick-box” exercise
This is where civil society groups have stepped in, providing key expertise for both legislators and the industry. ECNL and Access Now have worked together to prepare a new policy paper, “Towards meaningful fundamental rights impact assessments under the DSA”, to help ensure that impact assessments are more than a “tick-box” exercise.
To achieve this, policymakers and companies alike must implement safeguards and baselines, particularly when it comes to assessing the specific risks that automated content moderation poses to civic freedoms – a growing concern in the AI era. It is especially vital that the people and communities affected by these tools have the chance to provide input into the impact assessment process.
This article was first published here by ECNL.
Contribution by: EDRi affiliate ECNL & EDRi member Access Now