Freedom of expression

Freedom of expression is one of the key benefits of the digital era. The global information society permits interaction on a scale that was previously unheard of – promoting intercultural exchange and democracy. Consequently, the protection of this freedom is central to much of EDRi’s work.

25 Sep 2019

Portugal: Data retention complaint reaches the Constitutional Court

By Guest author

September 2019 brought us long-awaited developments regarding the situation of data retention in Portugal. The Justice Ombudsman decided to send the Portuguese data retention law to the Constitutional Court, following the Court of Justice of the European Union’s (CJEU’s) case law on blanket data retention that led to the invalidation of Directive 2006/24/EC. This decision comes after a complaint presented by EDRi observer Associação D3 – Defesa dos Direitos Digitais in December 2017.

The Ombudsman had first decided to issue an official recommendation to the government, urging it to propose a legislative solution for the problematic law that originated from the now invalidated Data Retention Directive. Faced with the Minister of Justice’s refusal to find a solution through legislative means, the Ombudsman has now decided to accede to D3’s original request and has referred the matter to the Constitutional Court, which will have to rule on the constitutionality of the Portuguese data retention scheme.

A few days later, the same Constitutional Court partially struck down, for the second time, a law granting the intelligence services access to retained data. In 2015, the Constitutional Court had already declared a similar law unconstitutional, after the president had requested a preventive ruling by the Court before signing it into law. However, in 2017, a new law that addressed some of the problems raised by the Constitutional Court was approved in the Parliament. As the new president opted not to request a preventive decision, the law came into force. Thirty-five Members of Parliament (MPs) from three parties then requested a Constitutional Court ruling on the law, which has now been issued.

The fundamental reasoning of this decision is that the Portuguese Constitution forbids public authorities from accessing citizens’ correspondence and telecommunications, except in the context of a criminal procedure. Given that the intelligence services have no criminal procedure competences, they cannot access such data within the existing constitutional framework. However, the Court did allow access to user location and identification data (in the context of the fight against terrorism and highly organised crime), as such data was not considered to be covered by the secrecy of communications.

This case has also led to the resignation of the original judge rapporteur, due to disagreements over the reasoning reflected in the final version of the decision.

Associação D3 – Defesa dos Direitos Digitais
https://www.direitosdigitais.pt/

Portugal: Data retention sent to the Constitutional Court (07.03.2018)
https://edri.org/portugal-data-retention-constitutional-court/

European Court overturns EU mass surveillance law (08.04.2014)
https://edri.org/european-court-overturns-eu-mass-surveillance-law/

(Contribution by Eduardo Santos, Associação D3 – Defesa dos Direitos Digitais, Portugal)


23 Sep 2019

Your mail, their ads. Your rights?

By Andreea Belu
  • In the digital space, “postal services” often snoop into your online conversations in order to market services or products according to what they find out from your chats.
  • A law meant to limit this exploitative practice is being stalled by the Council of the European Union.

We all expect our mail to be safe in the hands of a mailman. We trust that neither the post office nor the mailmen working there will sneak a peek into our written correspondence. Nor do we expect mailmen to act like door-to-door salespeople.

When we say “postal services” snoop, it is important to understand that this refers both to traditional email services such as Yahoo and to instant messaging apps like WhatsApp. While targeted ads are no longer popular among email providers, the practice is gaining momentum in the instant messaging zone, after Facebook’s CEO announced plans to introduce ads in WhatsApp’s Status feature.

Not just shoe ads

You might think: “Well, what’s the harm in having shoes advertised to me after they’ve read the shopping chats between my friend and me?” Short answer: it’s not just shoes.

Often, targeted ads are the result of you being profiled according to your age, location, gender, sexual orientation, political views or ethnicity. You will receive job ads based on your gender, or housing ads based on your ethnicity. Sometimes, you may be targeted because you feel anxious or worthless. Are you sure all of this will benefit you? What’s more, your online mailman might be required to read all of your mail, just in case you get in trouble with the law in the future. We call this mass data retention.


The need for encrypted mail in storage *and* in transit

The WhatsApp case is a good example. Currently, WhatsApp seals the message right after you press “send”. The message goes to WhatsApp’s servers, is stored encrypted, and is then sent to its recipient, also encrypted. This means that, technically, the mail is encrypted both in storage and in transit, and nobody can read its content. However, as Forbes points out, future ads plans might modify WhatsApp’s encryption so that apps “first identify key words in sentences, like ‘fishing’ or ‘birthday’, and send them to Facebook’s servers to be processed for advertising, while separately sending the encrypted message”.
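To make the difference concrete, here is a minimal sketch in Python (purely illustrative: the function names and keyword list are invented, and the symmetric Fernet scheme from the `cryptography` package stands in for WhatsApp’s actual Signal-protocol encryption). It contrasts a message that leaves the device only as ciphertext with one where keywords are extracted before encryption and shipped to the provider:

```python
# Illustrative sketch, not WhatsApp's actual code.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in a real E2E system, only the endpoints
cipher = Fernet(key)          # hold the key; the server never sees it

def send_e2e(message: str) -> bytes:
    """Today's model: the server only relays an opaque ciphertext."""
    return cipher.encrypt(message.encode())

def send_with_keyword_siphon(message: str) -> tuple[bytes, list[str]]:
    """The reported ads model: the same ciphertext is sent, but key
    words are extracted on the device and shipped to the provider."""
    ad_keywords = [w for w in message.lower().split()
                   if w in {"fishing", "birthday"}]  # toy keyword list
    return cipher.encrypt(message.encode()), ad_keywords

ciphertext, leaked = send_with_keyword_siphon("Gone fishing for my birthday")
print(leaked)  # ['fishing', 'birthday'] -- visible to the provider
```

The message itself stays encrypted in both cases; what changes is that plaintext-derived data now reaches the provider’s servers, which is exactly what stops the scheme from being confidential end to end.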

There’s a law for it, but it’s stalled by the EU Council

The ePrivacy Regulation, currently under negotiation, aims to ensure the privacy and confidentiality of our electronic communications by complementing and particularising the rules introduced by the General Data Protection Regulation (GDPR). The European Parliament adopted a strong position on ePrivacy that would ensure your online messages are protected both in storage and in transit (Art. 5), that would make “consent” the only legal basis for processing data (Art. 6), that would make privacy-by-design and privacy-by-default core principles of software design (Art. 10), and that would protect encryption from measures aimed at undermining it (Art. 17). However, the Council of the European Union is yielding to big tech lobby pressure and has drafted an opinion that threatens our rights and freedoms. What’s more, the text adopted by the European Parliament in October 2017 has been stuck in the Council, behind closed-door negotiations, for almost two years. We have sent several letters (here, here and here) calling for the safeguarding of our communications and for the adoption of this much-needed ePrivacy Regulation.

Will our voices be heard? If you are worried about being targeted based on your private conversations, join our efforts and stay tuned for more updates coming soon.


Read more:

Your family is none of their business (23.07.2019)
https://edri.org/your-family-is-none-of-their-business/

Real-time bidding: The auction for your attention (04.07.2019)
https://edri.org/real-time-bidding-the-auction-for-your-attention/

e-Privacy Directive: Frequently Asked Questions
https://edri.org/epd-faq/

e-Privacy: What happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

e-Privacy Mythbusting (25.10.2017)
https://edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

11 Sep 2019

Poland challenges copyright upload filters before the CJEU

By Centrum Cyfrowe Foundation

On 24 May 2019, Poland initiated a legal challenge (C-401/19) before the Court of Justice of the European Union (CJEU) against Article 17 of the Directive on copyright in the Digital Single Market. EDRi member Centrum Cyfrowe Foundation had previously tried to get access to the complaint using freedom of information (FOI) requests, without success. Now, the CJEU has finally published the application for this legal challenge.

Bringing the Directive to the Court of Justice is a positive step that can help resolve the controversies concerning its Article 17. An independent court will assess issues that, in the policy debate preceding the adoption of the Directive, were typically dismissed by representatives of rights holders as fear-mongering or disinformation.

The Republic of Poland seeks the annulment of Article 17(4)(b) and Article 17(4)(c) of the Copyright Directive. Alternatively, should the Court find that the contested provisions cannot be deleted from Article 17 without substantively changing the rules contained in the remaining provisions of that article, Poland claims that the Court should annul Article 17 in its entirety.

Poland claims that the Directive infringes the right to freedom of expression and information guaranteed by Article 11 of the Charter of Fundamental Rights of the European Union. The legal challenge argues that the obligations imposed on online content-sharing service providers to “make best efforts to ensure the unavailability of specific works and other subject matter for which the rights holders have provided the service providers with the relevant and necessary information” and to “make best efforts to prevent the future uploads of protected works or other subject-matter for which the rights holders have lodged a sufficiently substantiated notice” make it necessary for the service providers – in order to avoid liability – to carry out prior automatic verification (filtering) of content uploaded online by users, and therefore to introduce preventive control mechanisms. In other words, they oblige online platforms to filter all uploads by their users.

Unfortunately, the political context of the challenge has raised some questions. The complaint was submitted just two days before the European Parliament elections, and Poland’s ruling Law and Justice party (PiS) has been brandishing its opposition to upload filters as a weapon against the biggest opposition party, Civic Platform.

The EU Member States (and Iceland, Liechtenstein and Norway) have until 2 October 2019 to submit an application to the CJEU to intervene in this case, as defined by Chapter 4 of the CJEU’s Rules of Procedure (RoP). Member States can intervene to support, in whole or in part, either Poland’s position on Article 17 or the Council and Parliament’s position on Article 17.

Centrum Cyfrowe Foundation
https://centrumcyfrowe.pl/en/

The Copyright Directive challenged in the CJEU by Polish government (01.06.2019)
https://www.communia-association.org/2019/06/01/copyright-directive-challenged-cjeu-polish-government/

CJEU Case C-401/19 – Poland v Parliament and Council
http://curia.europa.eu/juris/liste.jsf?language=en&num=C-401/19

Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market, Article 17
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32019L0790#017

(Contribution by Natalia Mileszyk, EDRi member Centrum Cyfrowe Foundation, Poland)

11 Sep 2019

The Netherlands, aim for a more ambitious copyright implementation!

By Bits of Freedom

All EU Member States are obliged to implement the newly adopted EU Copyright Directive, including its controversial Article 17. But how to interpret it is up to them. In the Netherlands, a draft implementation bill is currently on the table, and it is unfortunately very disappointing. The government really needs to try much harder to protect the interests of internet users.

What was that again, Article 17 (or 13)?

Article 17 (formerly Article 13) includes a provision that makes platforms directly liable for copyright infringements in the content that users upload to their services. It does not solve the problem it is supposed to solve, but it does limit the freedom of internet users tremendously. It will oblige online companies such as Google and SoundCloud to scan and approve everything their users upload. This will likely lead those companies, in order to avoid legal liability, to refuse many uploads in advance.

The Netherlands found European rules harmful

The Dutch government was crystal clear in the debate that took place at the European level prior to the adoption of the Directive: these rules do more harm than good. In 2017, the Netherlands asked the European Commission’s lawyers critical questions about the legal sustainability of the proposal for the Directive. Much later in the process, the Netherlands voted against the text that was to serve as the basis for the Member States’ negotiations with the Commission and Parliament. Later again, the Dutch permanent representation stated that the adopted proposal “does not strike the right balance between the protection of right holders and the interests of EU citizens and companies”.

From European to Dutch rules

Since this is a Directive, all Member States must incorporate its rules into national legislation. Now that the Directive has been adopted, and introducing changes at the European level is no longer possible, the transposition into national law is the place to limit the damage. In other words: with a minimal transposition, the rights of internet users are protected to the maximum extent. Given how critical the Netherlands was of the Directive, you would expect it to do its utmost to limit the damage the Directive will cause in the transposition into national legislation. But, unfortunately…

On 2 July 2019, the Dutch Ministry of Justice and Security published a draft bill for this transposition. That proposal is disappointing in its lack of ambition to protect the interests of internet users. It does not limit the harm by providing an adequately limited implementation of Article 17, as it could, and it does not ensure that the guarantees the Directive provides for users are properly spelled out.

A more ambitious proposal is desperately needed

EDRi member Bits of Freedom strongly urges the Dutch government to come up with a more ambitious bill: a transposition in which the damage of the Directive is limited, and the rights of internet users protected, as much as possible. Because this is a particularly complex legal matter, Bits of Freedom also recommends that, prior to the drafting of the bill, an investigation be carried out into the room a Member State has to limit the damage of the Directive. This research could be carried out by academics with expertise in the field of copyright.

The Netherlands must do better

In short, the Netherlands must do better. The fact that the Directive has been adopted does not mean that the battle is lost. There’s still a lot that can be done to limit its potential negative impacts. Here, too, hard work pays off: you reap what you sow.

Bits of Freedom
https://www.bitsoffreedom.nl/

Come on government, stand up for us! (only in Dutch, 29.08.2019)
https://www.bitsoffreedom.nl/2019/08/29/kom-op-kabinet-kom-op-voor-ons/

Come on government, stand up for us! (11.09.2019)
https://www.bitsoffreedom.nl/2019/09/11/come-on-government-stand-up-for-us/

Bits of Freedom’s advice to the Dutch government (only in Dutch, 26.08.2019)
https://www.bitsoffreedom.nl/wp-content/uploads/2019/08/20190826-inbreng-bitsoffreedom-consultatie-implementatie-auteursrechtrichtlijn.pdf

Does #article13 protect users against unjustified content blockages? (only in Dutch, 25.03.2019)
https://www.bitsoffreedom.nl/2019/03/25/beschermt-artikel13-gebruikers-tegen-onterechte-content-blokkades/

Bill of Implementation Directive Copyright in the Digital Single Market (only in Dutch)
https://www.internetconsultatie.nl/auteursrecht

NGOs call to ensure fundamental rights in copyright implementation (20.05.2019)
https://edri.org/ngos-call-to-ensure-fundamental-rights-in-copyright-implementation/

(Contribution by Rejo Zenger, EDRi member Bits of Freedom; translation from Dutch to English by Bits of Freedom volunteers Celeste Vervoort and Martin van Veen)

11 Sep 2019

CJEU: Public documents could be censored because of copyright

By Diego Naranjo

On 29 July 2019, the Court of Justice of the European Union (CJEU) delivered a judgment that could have a serious impact on freedom of expression. The case (C‑469/17) concerns Funke Medien NRW GmbH, publisher of the German daily newspaper Westdeutsche Allgemeine Zeitung, and the Bundesrepublik Deutschland (Federal Republic of Germany). It follows a request in the proceedings between those two parties concerning the publication by the German publisher of classified documents (with the lowest degree of confidentiality) called “Parliament briefings”, or “UdPs”.

Copyright or freedom of the press?

The UdPs in question are military status reports sent regularly to Federal ministries about the deployment of the German military abroad. Funke Medien initially tried to obtain a series of these documents via a freedom of information request, which was denied. Later on, the publisher obtained the documents via an unknown source and published them as part of the “Afghanistan papers”. The German government took the view that Funke Medien had infringed its copyright by publishing the UdPs. The Regional Court in which the dispute was being settled referred the case to the CJEU for a preliminary ruling on the exception allowing the use of copyrighted works for journalistic purposes (Article 5(3) of Directive 2001/29, the “InfoSoc Directive”) and on the limits of freedom of information and freedom of the media in relation to copyrighted works. Exceptions and limitations allow the use of copyrighted works without specific authorisation from the rights holder (the author or the entity owning the reproduction rights).

CJEU replies: Copyright is king. Or freedom of the press. It depends.

In the preliminary ruling, the Court stated that military reports can be protected by copyright “only if those reports are an intellectual creation of their author which reflect the author’s personality and are expressed by free and creative choices”. Whether military reports (or any other piece of public information) are protected by copyright must, according to the CJEU, be determined case by case by the national courts of the Member States. This national interpretation also applies to the exceptions in Article 5(3) of the InfoSoc Directive, in this case the exception to reproduction rights for journalistic purposes. The CJEU said that it is also up to the Member States to decide how copyright exceptions and limitations apply on a case-by-case basis, but that, when implementing EU law, States need to “ensure that they rely on an interpretation of the directive which allows a fair balance to be struck between the various fundamental rights protected by the European Union legal order”.

The Court said that even if the UdPs were considered works protected by copyright, Funke Medien had the right to use them for journalistic purposes under the copyright exception, and that the publication of the papers was therefore legal. However, this does not set a precedent for similar pieces of content in other Member States (or even in Germany), since it is up to the national courts to decide how to balance the fundamental rights at stake (the protection of “intellectual property” versus freedom of expression and freedom of information, for example).

Towards nationally implemented upload filters?

This judgment does not act in a vacuum. It comes only a few months after the EU Copyright Directive was adopted. From the very beginning of the discussion, EDRi and other civil society groups raised the alarm about the risks of Article 13, and we published a thorough analysis of Article 17 (then Article 13). One of the main risks of the adopted text is that, in order to make their “best efforts to ensure the unavailability of specific works”, platforms will be obliged to use upload filters and scan every text, video, image (yes, including memes!) and audio file uploaded to their services to avoid being sued by rights holders. Despite widespread claims that the Copyright Directive would not lead to upload filters, it became pretty clear pretty quickly that it was all about implementing upload filters.

National parliaments are now deciding how to implement the Copyright Directive, and the way this happens could determine whether even publicly relevant documents from public authorities can become part of the public discourse. The EU-approved censorship machines could decide to block such content in order to avoid judicial disputes over “copyrighted” content of public relevance, such as the documents at stake in this court case.

Get in touch with your national EDRi members, Wikimedia chapter or consumer organisation, and make sure upload filters are not going to be mandatory in your country!

CJEU judgment Funke Medien NRW GmbH vs Bundesrepublik Deutschland, Case C‑469/17 (29.07.2019)
http://curia.europa.eu/juris/document/document.jsf?text=&docid=216545&pageIndex=0&doclang=en

Press Release: Censorship machine takes over EU’s internet (26.03.2019)
https://edri.org/censorship-machine-takes-over-eu-internet/

Re-Deconstructing upload filters proposal in the copyright Directive (28.06.2018)
https://edri.org/redeconstructing-article13/

(Contribution by Diego Naranjo, EDRi)

04 Sep 2019

E-Commerce review: Safeguarding human rights when moderating online content

By EDRi

This is the fourth and last blog post in our series on Europe’s future rules for intermediary liability and content moderation. You can read the introduction here.

In our previous blog posts on the upcoming E-Commerce review, we discussed examples of what can go wrong with online content regulation. But let’s imagine for a moment that all goes well: the new legislation will be based on actual evidence, a workable liability exemption for hosting companies will be well maintained, and potential negative side effects will be mitigated.

Even then, policymakers will need to put in place sufficient safeguards in order to avoid the wrongful removal of legitimate speech, art and knowledge.

A workable notice-and-action system

The current E-Commerce Directive and the absence of workable notice-and-action rules have created a Wild West of opaque content moderation and removal practices. Notice-and-action rules would establish a coherent system for people to flag illegal content or activities on platforms like Facebook, YouTube and Twitter (the “notice”) and to which the platform companies would be obliged to respond (the “action”). Which action to take should of course depend on the type of allegedly illegal content or activity concerned. Implementing those rules would need to be mandatory for platform companies.
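As a thought experiment, the skeleton of such a system can be sketched in a few lines of Python (purely illustrative: the field names, statuses and the seven-day deadline are our own assumptions, not taken from any draft legislation):

```python
# A minimal sketch of the data a notice-and-action system could track.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum, auto

class NoticeStatus(Enum):
    RECEIVED = auto()               # the "notice": someone flagged content
    UPLOADER_NOTIFIED = auto()      # uploader informed before action is taken
    CONTENT_SUSPENDED = auto()      # temporary measure, not a final removal
    COUNTER_NOTICE_FILED = auto()   # uploader contests the notice
    RESOLVED = auto()               # the platform (or a court) has decided

@dataclass
class Notice:
    content_url: str
    alleged_violation: str          # which "action" fits depends on this type
    notifier_contact: str           # accountability deters abusive notices
    received_at: datetime = field(default_factory=datetime.utcnow)
    status: NoticeStatus = NoticeStatus.RECEIVED

    def action_deadline(self) -> datetime:
        """A clearly defined time frame for the platform to react."""
        return self.received_at + timedelta(days=7)
```

Even this toy model shows where safeguards have to live: in the statuses (notification, suspension, counter-notice) and in the deadlines, not merely in a removal button.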

Although many academics and NGOs have written extensively about safeguards that should be introduced in any notice-and-action system, there are currently no minimum procedural requirements in place in the EU that hosting companies would be obliged to follow. This is not good for reputable businesses and certainly not good for people.

Some examples of human rights safeguards

  • There should be an obligation for hosting companies to notify users when their content has been flagged by someone, ideally before any action against the content is taken, with exceptions where law enforcement needs to investigate criminal offences.
  • Hosting companies should be obliged to report certain very serious criminal offences (such as the distribution of child abuse material) to law enforcement.
  • Users whose content has been flagged should be able to send a counter-notice to defend themselves against wrongful removals.
  • Users need a general right to appeal content moderation decisions taken by hosting companies. They should also be informed about their right to an effective judicial redress if their appeal has been unsuccessful.
  • Hosting companies should be bound to clearly defined procedural time frames for reacting to notices and counter-notices.
  • There should be a minimum standard defining what a valid notice must look like and what it needs to contain; abusive notices should be discouraged by effective administrative fines.
  • Wherever possible, temporary suspension of allegedly illegal online content or activities should take precedence over definitive removals.
  • Transparency reports about removals, wrongful take-downs, hosting companies’ policies, processes and other practices that impact user rights should be required.

While this list is not exhaustive, it provides the baseline for a human rights-respecting notice-and-action system that should be implemented as part of an E-Commerce Directive review. Designing such a system is far from simple, and there are opposing commercial and political interests involved that will push hard to have it their way.

However, similarly to the General Data Protection Regulation (GDPR) and data protection, getting the content moderation conundrum right provides Europe with a unique opportunity to set global human rights standards for a thriving online space – a space where everybody can feel safe, express themselves freely, and benefit from unhindered open access to the vast amount of knowledge and opportunity that is the internet.

E-Commerce Directive
https://ec.europa.eu/digital-single-market/en/e-commerce-directive

Safeguards for Freedom of Expression in the Era of Online Gatekeeping (11.09.2018)
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247682

Notice-and-Fair-Balance: How to Reach a Compromise between Fundamental Rights in European Intermediary Liability
https://www.ivir.nl/publicaties/download/Notice_and_Fair_Balance.pdf

Manila Principles on intermediary liability
https://www.manilaprinciples.org/

The Santa Clara Principles On Transparency and Accountability in Content Moderation
https://santaclaraprinciples.org/

This article is the fourth and last in our blogpost series on Europe’s future rules for intermediary liability and content moderation. The series presents the three main points that should be taken into account in an update of the E-Commerce Directive:

  1. E-Commerce review: Opening Pandora’s box?
  2. Technology is the solution. What is the problem?
  3. Mitigating collateral damage and counter-productive effects
  4. Safeguarding human rights when moderating online content
03 Sep 2019

Wanted: Policy Advisor!

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 digital human rights organisations from across Europe and beyond. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, personal data protection, freedom of expression, and access to information.

European Digital Rights’ Brussels office is looking for a talented, dedicated Policy Advisor who will be supporting the work of the EDRi Policy Team on issues such as platform regulation, cross-border access to data, privacy, data retention, Artificial Intelligence (AI), copyright and data protection.

This is an opportunity to be part of a growing and well-respected NGO that is making a real difference in the defence and promotion of online rights and freedoms in Europe and beyond. The deadline to apply is 27 September 2019. This is a full-time, permanent position starting on 15 November 2019 at the latest.

As a Policy Advisor, your main tasks will be to:

  • Monitor, analyse and report about human rights implications of EU digital policy developments;
  • Support the work of the policy team in their advocacy work for the protection of digital rights, particularly but not exclusively in the areas of platform regulation, artificial intelligence, data protection, privacy, net neutrality and copyright;
  • Provide policy-makers with expert, timely and accurate input;
  • Draft policy documents, such as briefings, position papers, amendment proposals, advocacy one-pagers, letters, blog posts and EDRi-gram articles, with the support and supervision of senior staff within the team;
  • Support the policy team in providing EDRi members with information about relevant legislative processes as well as in coordinating EDRi’s working groups;
  • Help develop campaign messages and provide the public with information about relevant legislative processes and EDRi’s activities;
  • Organise and participate in expert meetings along with senior staff;
  • Maintain good relationships with policy-makers, stakeholders and the press;
  • Support and work closely with other staff members including policy, communications and campaigns colleagues.

Desired qualifications and experience:

  • Minimum 1 year of relevant experience in a similar role or EU institution;
  • A university degree in law, EU affairs, policy, human rights or related field or equivalent experience;
  • Demonstrable basic knowledge of and interest in platform regulation, data protection, privacy and copyright, as well as other internet policy issues;
  • Knowledge and understanding of the EU, its institutions and its role in digital rights policies;
  • Exceptional written and oral communications skills;
  • Basic knowledge of the functioning of the internet as well as basic technology skills are required; experience using free software such as LibreOffice, WordPress and Nextcloud is an asset;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Experience of organising events and/or workshops is an asset;
  • Ability to work in English; other European languages, especially French, are an advantage.

What EDRi offers:
  • A permanent, full-time contract;
  • A friendly, multicultural and enthusiastic team of experts based in Brussels;
  • A competitive NGO salary with benefits;
  • The opportunity to promote the protection of fundamental rights in important legislative proposals;
  • Access to an international and diverse network of NGOs;
  • Internal career growth;
  • Networking opportunities.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in a single PDF file to applications <at> edri <dot> org with “Policy Advisor” and your full name in the subject line by 27 September 2019 (11.59 pm). Candidates will be expected to be available for interviews throughout October. The successful candidate is expected to start working with us on 15 November 2019.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive for a diverse and inclusive working environment and for a gender balance in the policy team. Therefore, we particularly encourage applications from individuals who identify as women. We also encourage members of groups at risk of racism or other forms of discrimination to apply for this post.

Please note that only shortlisted candidates will be contacted.

02 Sep 2019

How security policy hijacks the Digital Single Market

By Jan Penfrat

On 22 August, when Politico leaked an internal European Commission document outlining policy plans for the upcoming mandate from all corners of the EU’s executive branch, Brussels was on high alert. Although the document is an internal note, not an official Commission position, it isn’t irrelevant: its purpose is to brief the incoming Commissioners on important policy dossiers and to make recommendations about what to do with them. The digital policy chapters it contains are definitely informative.

You won’t be surprised to see that the Directorate‑General for Communications Networks, Content and Technology (DG CNECT), the Commission’s department for all things digital, focusses on completing the Digital Single Market and – perhaps most importantly – its own new pet project: the Digital Services Act (DSA).

The DSA will likely become the next major internet regulation battleground. It will regulate social media platforms and all kinds of online services, including the legal responsibilities they have for content uploaded by users.

Ill-conceived rules threaten people’s freedoms

For a start, DG CNECT formulates a number of positive ideas, for instance on the use of digital tech to improve Europe’s environmental footprint, or on an ambitious regulatory framework for artificial intelligence with a strong focus on protecting fundamental rights. We welcome both proposals and encourage EU policymakers to work together with civil society to achieve those goals, with the thorough debate needed and taking into consideration the tools we already have at hand, such as strongly enforcing the General Data Protection Regulation (GDPR) and adopting the ePrivacy Regulation.

In addition, the document includes a chapter on the planned Digital Services Act in which it suggests imposing a “duty of care” on online services that deal with illegal and “harmful” content on their systems. The stated goal is to tackle issues like effective content moderation, online political advertising, disinformation, and the protection of minors.

While EDRi would prefer the DSA to regulate only illegal content, rather than vaguely defined “harmful” material, we’re glad to see that DG CNECT explicitly recognises the risk that “heavy, ill-conceived rules” pose for media pluralism, freedom of expression and other fundamental rights. This matters, because their colleagues over at the Directorate-General for Migration and Home Affairs (DG HOME) have a very different approach.

Burning the house to roast the pig

Technically, DG HOME is responsible for topics such as migration, human trafficking, cyber crime and terrorism. Yet the department dedicated more than a quarter of its policy idea space to DG CNECT’s Digital Services Act.

DG HOME calls its contribution “For a safe and responsible use of the internet”. According to its authors, today’s internet is mostly a lawless place used for “identity theft, ransomware, child pornography, incitement to terrorism, organising crime making use of more and more encrypted environment”.

To bring order to this online Wild West, DG HOME’s internet specialists propose that, in the future, all platform companies should take “proactive measures”, also known as upload filters, to prevent the “criminal abuse of their services”. Sounds familiar? That’s because DG HOME’s “responsible use of the internet” looks dangerously similar to the general monitoring obligation of the new EU Copyright Directive. Apparently it needs to be emphasised, once again, that the European Court of Justice has consistently ruled that such general monitoring obligations violate our fundamental rights and are therefore illegal. Experts frequently add that, taken alone, filters are also ineffective in tackling illegal online content.

DG HOME’s proposal also includes an obligation for online platforms to shut down accounts that display illicit or harmful (yet legal) content such as disinformation — an idea that would turn companies into arbiters of truth.

Both DG CNECT and DG HOME have to consult each other and cooperate on files of shared interest; that is normal procedure. But the apparent attempt to hijack the framing around the DSA before the legislative proposal is even written is staggering. A Digital Single Market file with major fundamental rights implications should not be pushed into the security policy sphere, where it risks being abused to curtail people’s fundamental rights.

European Commission internal document on proposed priorities
https://www.politico.eu/wp-content/uploads/2019/08/clean_definite2.pdf

More responsibility to online platforms – but at what cost? (19.07.2019)
https://edri.org/more-responsibility-to-online-platforms-but-at-what-cost/

E-Commerce review: Opening Pandora’s box? (20.06.2019)
https://edri.org/e-commerce-review-1-pandoras-box/

Filters Incorporated (09.04.2019)
https://edri.org/filters-inc/

27 Aug 2019

E-Commerce review: Mitigating collateral damage

By EDRi

This is the third article in our series on Europe’s future rules for intermediary liability and content moderation. You can read the introduction here.

Asking social media and other platform companies to solve problems around illegal online content can have serious unintended consequences. It’s therefore crucial that new EU legislation in this field considers such consequences and mitigates any collateral damage.

With the adoption of the EU’s E-Commerce Directive in 2000, policymakers put the decision about what kind of content should be removed into the hands of hosting companies. According to this old law, hosting companies are obliged to remove illegal content as soon as they gain knowledge of it. In practice, this means that companies are forced to take a huge number of daily decisions about the legality of user-uploaded content. And, being commercial entities, they try to do it fast and cheaply. This has serious implications for our fundamental rights to freedom of expression and access to information.

What’s the problem?

So, what’s the problem? Shouldn’t platform companies be able to decide what can and cannot be posted on their systems? In principle, yes. The problem is that in many cases, the decision about the legality of a given piece of content is not straightforward. It often requires a complex legal analysis that takes into account local laws, customs, and context. Platform companies, however, have no interest in dealing with such complex (and therefore expensive) judgements – quite the opposite! As soon as businesses run the risk of being held liable for user-uploaded content, they have a strong commercial incentive to remove anything that could remotely be considered illegal – anywhere they operate.

To make things worse, many platforms use broad and often vaguely worded terms of service. The opaque application of those terms has led to overly eager take-down practices at the expense of human rights defenders, artists, and marginalised communities. This was pointed out, for example, in a recent report by the Electronic Frontier Foundation (EFF), one of EDRi’s US-based members.

Human rights organisations, and especially those fighting for lesbian, gay, bisexual, transgender, queer and intersex (LGBTQI) rights, often face two problems on social media: On the one hand, their content is regularly taken down because of alleged breaches of terms of service – despite being completely legal in their country. On the other hand, they are faced with hateful comments and violent threats by other users that platforms often do not remove. As the EFF report states: “Content moderation does not affect all groups evenly, and has the potential to further disenfranchise already marginalised communities.”

Wrongful take-downs are common

Because none of the big social media companies today include statistical information about wrongful take-downs and removals in their transparency reports, we can only rely on publicly available evidence to understand the scale of this problem. The examples we know of, however, indicate that it is big. Here are some of them:

  • YouTube removed educational videos about the Holocaust, falsely classifying them as hate speech (Newsweek).
  • Facebook removed posts from Black Lives Matter activists, falsely claiming they amounted to hate speech (USA Today).
  • Twitter temporarily blocked the account of a Jewish newspaper for quoting Israel’s ambassador to Germany as saying he avoids contact with Germany’s right-wing AfD party. Twitter claimed the tweet qualified as “election interference” (Frankfurter Allgemeine).
  • Facebook removed posts and blocked accounts of Kurdish activists criticising the Turkish government, classifying them as hate speech and “references to terrorist organisations” (Buzzfeed).

Despite the numerous examples of failing filters, there seems to be an increasing belief among policymakers that algorithms and automatic filtering technologies can solve illegal content problems – without enough thought given to the harmful side effects.

Content filters are not the solution…

As EDRi has argued before, filters do fail: we’ve seen automated take-downs and blocking of public domain works, of satire and nuance, and even of works uploaded by the legitimate rights holders themselves. Filters are also not as efficient as one might think. The Christchurch video streaming incident, for example, showed that content filters based on hash databases can easily be circumvented by applying only minimal changes to the content.
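A toy demonstration in Python shows why (purely illustrative: real hash databases often use perceptual rather than plain cryptographic hashes, which tolerate some changes but can still be evaded by re-encoding or cropping):

```python
# Flipping a single bit of a file yields a completely different SHA-256
# digest, so a slightly edited copy slips past a hash blocklist.
import hashlib

original = b"...bytes of a blocked video..."
altered = bytearray(original)
altered[0] ^= 0x01                  # a one-bit change

blocklist = {hashlib.sha256(original).hexdigest()}

def is_blocked(content: bytes) -> bool:
    """Check an upload against the blocklist of known hashes."""
    return hashlib.sha256(content).hexdigest() in blocklist

print(is_blocked(original))         # True: the exact copy is caught
print(is_blocked(bytes(altered)))   # False: a minimal change evades the filter
```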

The belief that big tech companies and their filter technologies can somehow magically solve all problems in society is not only misguided, it’s a threat to people’s fundamental rights. The lack of transparency and ineffectiveness of filters also means that the number of take-downs by the big platforms alone is a poor measure of success in the fight against illegal online content.

…but changing business incentives is.

Instead of mandating even more failing filter technology, EU legislation that aims at tackling illegal online content should focus on the root causes of the problem. In reality, many platform companies benefit from controversial content. Hateful tweets, YouTube videos featuring conspiracy theories and outright lies, and scandalous defamatory posts on Facebook are all a great way for platforms to drive “user engagement” and maximise screen time, which in turn increases advertisement profits. These commercial incentives need to be changed.

The fourth and last blogpost of this series will be published shortly. It will focus on what platform companies should be doing, and on what new EU legislation on illegal online content that respects people’s fundamental rights should look like.

Other articles of this series on the E-Commerce Directive review:
1. Opening Pandora’s box?
2. Technology is the solution. What is the problem?

26 Jul 2019

Job alert: EDRi is looking for a Senior Policy Advisor

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 digital human rights organisations from across Europe and beyond. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, personal data protection, freedom of expression, and access to information.

EDRi is looking for a talented and dedicated Senior Policy Advisor to join EDRi’s team in Brussels. This is a unique opportunity to be part of a growing and well-respected NGO that is making a real difference in the defence and promotion of online rights and freedoms in Europe and beyond. The deadline to apply is 15 September 2019. This full-time, permanent position offers a competitive salary and must be filled as soon as possible; the latest starting date is 15 October 2019.

Key responsibilities:

As a Senior Policy Advisor, your main tasks will be to:

  • Monitor, analyse and report about human rights implications of EU digital policy developments;
  • Advocate for the protection of digital rights, particularly but not exclusively in the areas of artificial intelligence, data protection, privacy, net neutrality and copyright;
  • Provide policy-makers with expert, timely and accurate input;
  • Draft policy documents, such as briefings, position papers, amendments, advocacy one-pagers, letters, blogposts and EDRi-gram articles;
  • Provide EDRi members with information about relevant EU legislative processes, coordinate working groups, help develop campaign messages, and provide the public with information about relevant EU legislative processes and EDRi’s activities;
  • Represent EDRi at European and global events;
  • Organise and participate in expert meetings;
  • Maintain good relationships with policy-makers, stakeholders and the press;
  • Support and work closely with other staff members including policy, communications and campaigns colleagues and report to the Head of Policy and to the Executive Director;
  • Contribute to the policy strategy of the organisation.

Desired qualifications and experience:

  • Minimum 3 years of relevant experience in a similar role or EU institution;
  • A university degree in law, EU affairs, policy, human rights or related field or equivalent experience;
  • Demonstrable knowledge of, and interest in data protection, privacy and copyright, as well as other internet policy issues;
  • Knowledge and understanding of the EU, its institutions and its role in digital rights policies;
  • Experience in leading advocacy efforts and creating networks of influence;
  • Exceptional written and oral communications skills;
  • IT skills; experience using free software and free/open operating systems, WordPress and Nextcloud is an asset;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Experience of organising events and/or workshops;
  • Ability to work in English; other European languages, especially French, are an advantage.

What EDRi offers:

  • A permanent, full-time contract;
  • A dynamic, multicultural and enthusiastic team of experts based in Brussels;
  • A competitive NGO salary with benefits;
  • The opportunity to foster the protection of fundamental rights in important legislative proposals;
  • A high degree of autonomy and flexibility;
  • An international and diverse network;
  • Internal career growth;
  • Networking opportunities.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in .pdf format to applications <at> edri <dot> org with “Senior Policy Advisor” in the subject line by 15 September 2019 (11.59 pm). Candidates will be expected to be available for interviews throughout September.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive for a diverse and inclusive working environment and, ideally, for a gender balance in the policy team. Therefore, we particularly encourage applications from individuals who identify as women. We also encourage members of groups at risk of racism or other forms of discrimination to apply for this post.

Please note that only shortlisted candidates will be contacted.
