Privatised law enforcement

A measure which would be illegal if implemented by a government should also be illegal if implemented by industry as a “voluntary” measure, whether as a result of government pressure, for public relations, or for anti-competitive reasons. However, key international legal instruments, such as the EU Charter of Fundamental Rights and the European Convention on Human Rights, as well as national constitutions, are binding on states and governments; they are not directly applicable to other entities, such as private companies. As a result, there is a major trend towards governments persuading or coercing companies to impose restrictions on fundamental freedoms under the guise of “self-regulation”, thereby circumventing legal protections.

11 Sep 2019

Poland challenges copyright upload filters before the CJEU

By Centrum Cyfrowe Foundation

On 24 May 2019, Poland initiated a legal challenge (C-401/19) before the Court of Justice of the European Union (CJEU) against Article 17 of the Directive on copyright in the Digital Single Market. EDRi member Centrum Cyfrowe Foundation has previously tried to get access to the complaint using freedom of information (FOI) requests, without success. Now, the CJEU has finally published the application for this legal challenge.

Bringing the Directive to the Court of Justice is a positive step that can help resolve controversies concerning its Article 17. An independent court will assess issues that, in the policy debate preceding the adoption of the Directive, were typically dismissed by representatives of rights holders as fear-mongering or disinformation.

The Republic of Poland seeks the annulment of Article 17(4)(b) and Article 17(4)(c) of the copyright Directive. Alternatively, should the Court find that the contested provisions cannot be deleted from Article 17 of the Directive without substantively changing the rules contained in the remaining provisions of that article, Poland claims that the Court should annul Article 17 of the Directive in its entirety.

Poland claims that the Directive infringes the right to freedom of expression and information guaranteed by Article 11 of the Charter of Fundamental Rights of the European Union. The legal challenge argues that the obligations on online content-sharing service providers to “make best efforts to ensure the unavailability of specific works and other subject matter for which the rights holders have provided the service providers with the relevant and necessary information” and to “make best efforts to prevent the future uploads of protected works or other subject-matter for which the rights holders have lodged a sufficiently substantiated notice” make it necessary for service providers, in order to avoid liability, to carry out prior automatic verification (filtering) of content uploaded online by users, and therefore to introduce preventive control mechanisms. In other words, they oblige online platforms to filter all uploads by their users.

Unfortunately, the political context of the challenge has raised some questions. The complaint was submitted just two days before the European Parliament elections, and Poland’s ruling Law and Justice party (PiS) had been brandishing its opposition to upload filters against the biggest opposition party, Civic Platform.

The EU Member States (and Iceland, Liechtenstein and Norway) have until 2 October 2019 to submit an application to the CJEU to intervene in this case, as defined by Chapter 4 of the CJEU’s Rules of Procedure (RoP). Member States can intervene to support, in whole or in part, either Poland’s position on Article 17 or the Council and Parliament’s position on Article 17.

Centrum Cyfrowe Foundation
https://centrumcyfrowe.pl/en/

The Copyright Directive challenged in the CJEU by Polish government (01.06.2019)
https://www.communia-association.org/2019/06/01/copyright-directive-challenged-cjeu-polish-government/

CJEU Case C-401/19 – Poland v Parliament and Council
http://curia.europa.eu/juris/liste.jsf?language=en&num=C-401/19

Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market, Article 17
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32019L0790#017

(Contribution by Natalia Mileszyk, EDRi member Centrum Cyfrowe Foundation, Poland)

11 Sep 2019

The Netherlands, aim for a more ambitious copyright implementation!

By Bits of Freedom

All EU Member States are obliged to implement the newly adopted EU Copyright Directive, including its controversial Article 17. But how to interpret it is up to them. In the Netherlands, there is currently a draft bill, which is unfortunately very disappointing. The government really needs to try much harder to protect the interests of internet users.

What was that again, Article 17 (or 13)?

Article 17 (formerly Article 13) includes a provision that makes platforms directly liable for copyright infringements in the content that users upload to their services. It does not solve the problem it is supposed to solve, but it does limit the freedom of internet users tremendously. It will oblige online companies such as Google and SoundCloud to scan and approve everything their users upload. This will likely lead those companies, in order to avoid legal liability, to refuse many uploads in advance.

The Netherlands found European rules harmful

The Dutch government was crystal clear in the debate that took place at the European level prior to the adoption of the Directive: these rules do more harm than good. In 2017, the Netherlands asked the lawyers of the European Commission critical questions about the legal sustainability of the proposal for the Directive. Much later in the process, the Netherlands voted against the text that was to serve as the basis for the negotiations of the Member States with the Commission and Parliament. Later again, the Dutch permanent representation stated that the adopted proposal “does not strike the right balance between the protection of right holders and the interests of EU citizens and companies”.

From European to Dutch rules

Since this is a Directive, all Member States must incorporate its rules into national legislation. Now that the Directive has been adopted and introducing changes at the European level is no longer possible, transposition into national laws is the place to limit the damage. In other words: with a minimal transposition, the rights of the internet user are protected to the maximum extent. If the Netherlands was so critical of the Directive, you would expect it to also do its utmost to limit as much as possible the damage the Directive will cause when transposing it into national legislation. But, unfortunately…

On 2 July 2019, the Dutch Ministry of Justice and Security published a draft bill for this transposition. That proposal is disappointing in its lack of ambition to protect the interests of internet users. It does not limit the harm by providing an adequately limited implementation of Article 17, as it could, and it does not ensure that the guarantees for users provided in the Directive are properly spelled out.

A more ambitious proposal is desperately needed

EDRi member Bits of Freedom strongly urges the Dutch government to come up with a more ambitious bill: a transposition in which the damage of the Directive is limited, and the rights of internet users are protected, as much as possible. Because this is a particularly complex legal matter, Bits of Freedom also recommends that, prior to the drafting of the bill, an investigation be carried out into the scope a Member State has to limit the damage of the Directive. This research could be carried out by academics with expertise in the field of copyright.

The Netherlands must do better

In short, the Netherlands must do better. The fact that the Directive has been adopted does not mean that the battle is lost. There’s still a lot that can be done to limit its potential negative impacts. Here, too, hard work pays off: you reap what you sow.

Bits of Freedom
https://www.bitsoffreedom.nl/

Come on government, stand up for us! (only in Dutch, 29.08.2019)
https://www.bitsoffreedom.nl/2019/08/29/kom-op-kabinet-kom-op-voor-ons/

Come on government, stand up for us! (11.09.2019)
https://www.bitsoffreedom.nl/2019/09/11/come-on-government-stand-up-for-us/

Bits of Freedom’s advice to the Dutch government (only in Dutch, 26.08.2019)
https://www.bitsoffreedom.nl/wp-content/uploads/2019/08/20190826-inbreng-bitsoffreedom-consultatie-implementatie-auteursrechtrichtlijn.pdf

Does #article13 protect users against unjustified content blockages? (only in Dutch, 25.03.2019)
https://www.bitsoffreedom.nl/2019/03/25/beschermt-artikel13-gebruikers-tegen-onterechte-content-blokkades/

Bill of Implementation Directive Copyright in the Digital Single Market (only in Dutch)
https://www.internetconsultatie.nl/auteursrecht

NGOs call to ensure fundamental rights in copyright implementation (20.05.2019)
https://edri.org/ngos-call-to-ensure-fundamental-rights-in-copyright-implementation/

(Contribution by Rejo Zenger, EDRi member Bits of Freedom; translation from Dutch to English by Bits of Freedom volunteers Celeste Vervoort and Martin van Veen)

11 Sep 2019

CJEU: Public documents could be censored because of copyright

By Diego Naranjo

On 29 July 2019, the Court of Justice of the European Union (CJEU) delivered a judgment that could have a serious impact on freedom of expression. The case (C‑469/17) concerns Funke Medien NRW GmbH, the editor of the German daily newspaper Westdeutsche Allgemeine Zeitung, and the Bundesrepublik Deutschland (Federal Republic of Germany). It follows a request for a preliminary ruling in the proceedings between those two parties concerning the German publisher’s publication of classified documents (with the lowest degree of confidentiality) called “Parliament briefings”, or “UdPs”.

Copyright or freedom of the press?

The UdPs in question are military status reports regularly sent to Federal ministries about the deployment of the German military abroad. Funke Medien initially tried to obtain a series of these documents via a freedom of information request, which was denied. Later on, the publisher obtained the documents via an unknown source and published them as part of the “Afghanistan papers”. The German government took the view that Funke Medien had infringed its copyright by publishing the UdPs. The Regional Court in which the dispute was being settled referred the case to the CJEU for a preliminary ruling regarding the exception allowing the use of copyrighted works for journalistic purposes (Article 5(3) of Directive 2001/29, the “Infosoc Directive”) and the limits of freedom of information and freedom of the media in relation to copyrighted works. Exceptions and limitations allow the use of copyrighted works without specific authorisation from the rights holder (the author or the entity owning the rights of reproduction).

CJEU replies: Copyright is king. Or freedom of the press. It depends.

In the preliminary ruling, the Court stated that military reports can be protected by copyright “only if those reports are an intellectual creation of their author which reflect the author’s personality and are expressed by free and creative choices”. Whether military reports (or any other piece of public information) are protected by copyright must, according to the CJEU, be determined case by case by the national courts of the Member States. This national interpretation also applies to the exceptions in Article 5(3) of the Infosoc Directive, in this case the exception to reproduction rights for journalistic purposes. The CJEU said that it is also up to the Member States to decide how copyright exceptions and limitations apply on a case-by-case basis, but that, when implementing EU law, states need to “ensure that they rely on an interpretation of the directive which allows a fair balance to be struck between the various fundamental rights protected by the European Union legal order”.

The Court said that even if the UdPs were considered works protected by copyright, Funke Medien had the right to use them for journalistic purposes under the copyright exception, and that the publication of the papers was therefore legal. However, this does not set a precedent for similar pieces of content in other Member States (or even in Germany), since it is up to the national courts to decide how to balance the fundamental rights at stake (the protection of “intellectual property” versus freedom of expression and freedom of information, for example).

Towards nationally implemented upload filters?

This judgment does not act in a vacuum. It comes only a few months after the EU copyright Directive was adopted. From the very beginning of the discussion, EDRi and other civil society groups raised the alarm about the risks of Article 17 (then Article 13), and we published a thorough analysis of it. One of the main risks of the adopted text is that, in order to make their “best efforts to ensure the unavailability of specific works”, platforms will be obliged to use upload filters and scan every text, video, image (yes, including memes!) and audio file uploaded to their services to avoid being sued by rights holders. Despite widespread claims that the copyright Directive would not lead to upload filters, it became clear very quickly that it was all about implementing upload filters.

National parliaments are now deciding how to implement the copyright Directive, and the way this happens could leave upload filters in control of whether even publicly relevant documents from public authorities become part of the public discourse. The EU-approved censorship machines could decide to block such content in order to avoid judicial disputes over “copyrighted” content of public relevance, such as the documents at stake in this court case.

Get in touch with your national EDRi members, Wikimedia chapter or consumer organisation, and make sure upload filters are not going to be mandatory in your country!

CJEU judgment Funke Medien NRW GmbH vs Bundesrepublik Deutschland, Case C‑469/17 (29.07.2019)
http://curia.europa.eu/juris/document/document.jsf?text=&docid=216545&pageIndex=0&doclang=en

Press Release: Censorship machine takes over EU’s internet (26.03.2019)
https://edri.org/censorship-machine-takes-over-eu-internet/

Re-Deconstructing upload filters proposal in the copyright Directive (28.06.2018)
https://edri.org/redeconstructing-article13/

(Contribution by Diego Naranjo, EDRi)

04 Sep 2019

E-Commerce review: Safeguarding human rights when moderating online content

By EDRi

This is the fourth and last blog post in our series on Europe’s future rules for intermediary liability and content moderation. You can read the introduction here.

In our previous blog posts on the upcoming E-Commerce review, we discussed examples of what can go wrong with online content regulation. But let’s imagine for a moment that all goes well: the new legislation will be based on actual evidence, a workable liability exemption for hosting companies will be well maintained, and potential negative side effects will be mitigated.

Even then, policymakers will need to put in place sufficient safeguards in order to avoid the wrongful removal of legitimate speech, art and knowledge.

A workable notice-and-action system

The current E-Commerce Directive and the absence of workable notice-and-action rules have created a Wild West of opaque content moderation and removal practices. Notice-and-action rules would establish a coherent system for people to flag illegal content or activities on platforms like Facebook, YouTube and Twitter (the “notice”), to which the platform companies would be obliged to respond (the “action”). Which action to take should of course depend on the type of allegedly illegal content or activity concerned. Implementing those rules would need to be mandatory for platform companies.

Although many academics and NGOs have written extensively about safeguards that should be introduced in any notice-and-action system, there are currently no minimum procedural requirements in place in the EU that hosting companies would be obliged to follow. This is not good for reputable businesses and certainly not good for people.

Some examples of human rights safeguards

  • There should be an obligation for hosting companies to notify users when their content has been flagged by someone, ideally before any action against the content is taken, with exceptions where law enforcement needs to investigate criminal offences.
  • Hosting companies should be obliged to report certain very serious criminal offences (such as the distribution of child abuse material) to law enforcement.
  • Users whose content has been flagged should be able to send a counter-notice to defend themselves against wrongful removals.
  • Users need a general right to appeal content moderation decisions taken by hosting companies. They should also be informed about their right to effective judicial redress if their appeal has been unsuccessful.
  • Hosting companies should be bound to clearly defined procedural time frames for reacting to notices and counter-notices.
  • There should be a minimum standard defining what a valid notice must look like and what it needs to contain; abusive notices should be discouraged by effective administrative fines.
  • Wherever possible, temporary suspension of allegedly illegal online content or activities should take precedence over definitive removals.
  • Transparency reports about removals, wrongful take-downs, hosting companies’ policies, processes and other practices that impact user rights should be required.
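
To make these procedural safeguards more concrete, here is a minimal sketch, in Python, of how a notice-and-action data model could encode some of them. All field names, statuses and deadlines are illustrative assumptions made for this article; they are not drawn from any existing or proposed legal text.

    # Illustrative sketch only: names, deadlines and statuses are
    # hypothetical, not requirements from any EU law.
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta
    from enum import Enum

    class Status(Enum):
        RECEIVED = "received"                # notice logged, uploader informed
        SUSPENDED = "temporarily_suspended"  # content hidden, not deleted
        REINSTATED = "reinstated"            # counter-notice accepted
        REMOVED = "removed"                  # final decision, appeal still open

    @dataclass
    class Notice:
        content_id: str
        notifier: str
        alleged_offence: str  # e.g. "copyright", "defamation"
        justification: str    # why the content is allegedly illegal
        received_at: datetime = field(default_factory=datetime.utcnow)
        status: Status = Status.RECEIVED

        def is_valid(self) -> bool:
            # Minimum standard: an unjustified notice is invalid
            # (and potentially abusive).
            return bool(self.notifier and self.justification.strip())

        def response_deadline(self, days: int = 7) -> datetime:
            # Clearly defined procedural time frame for the host to act.
            return self.received_at + timedelta(days=days)

    def handle_notice(notice: Notice) -> Status:
        # Safeguard: invalid notices are rejected, not silently actioned.
        if not notice.is_valid():
            return notice.status
        # Safeguard: temporary suspension takes precedence over removal,
        # leaving room for a counter-notice before anything is deleted.
        notice.status = Status.SUSPENDED
        return notice.status

Even in this toy model the order of operations matters: validate the notice first, suspend rather than delete, and leave room for a counter-notice and an appeal before any definitive removal.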

While this list is not exhaustive, it provides the baseline for a human rights-respecting notice-and-action system that should be implemented as part of an E-Commerce Directive review. Designing such a system is far from simple, and there are opposing commercial and political interests involved that will push hard to have it their way.

However, similarly to the General Data Protection Regulation (GDPR) and data protection, getting the content moderation conundrum right provides Europe with a unique opportunity to set global human rights standards for a thriving online space – a space where everybody can feel safe, express themselves freely, and benefit from unhindered open access to the vast amount of knowledge and opportunity that is the internet.

E-Commerce Directive
https://ec.europa.eu/digital-single-market/en/e-commerce-directive

Safeguards for Freedom of Expression in the Era of Online Gatekeeping (11.09.2018)
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3247682

Notice-and-Fair-Balance: How to Reach a Compromise between Fundamental Rights in European Intermediary Liability
https://www.ivir.nl/publicaties/download/Notice_and_Fair_Balance.pdf

Manila Principles on intermediary liability
https://www.manilaprinciples.org/

The Santa Clara Principles On Transparency and Accountability in Content Moderation
https://santaclaraprinciples.org/

Read the other articles in this series on the E-Commerce review:
1) Opening Pandora’s box?
2) Technology is the solution. What is the problem?
3) Mitigating collateral damage

03 Sep 2019

Wanted: Policy Advisor!

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 digital human rights organisations from across Europe and beyond. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, personal data protection, freedom of expression, and access to information.

European Digital Rights’ Brussels office is looking for a talented, dedicated Policy Advisor who will be supporting the work of the EDRi Policy Team on issues such as platform regulation, cross-border access to data, privacy, data retention, Artificial Intelligence (AI), copyright and data protection.

This is an opportunity to be part of a growing and well-respected NGO that is making a real difference in the defence and promotion of online rights and freedoms in Europe and beyond. The deadline to apply is 27 September 2019. This is a full-time, permanent position starting on 15 November 2019 at the latest.

As a Policy Advisor, your main tasks will be to:

  • Monitor, analyse and report about human rights implications of EU digital policy developments;
  • Support the work of the policy team in their advocacy work for the protection of digital rights, particularly but not exclusively in the areas of platform regulation, artificial intelligence, data protection, privacy, net neutrality and copyright;
  • Provide policy-makers with expert, timely and accurate input;
  • Draft policy documents, such as briefings, position papers, amendment proposals, advocacy one-pagers, letters, blog posts and EDRi-gram articles, with the support and supervision of senior staff within the team;
  • Support the policy team in providing EDRi members with information about relevant legislative processes as well as in coordinating EDRi’s working groups;
  • Help develop campaign messages and provide the public with information about relevant legislative processes and EDRi’s activities;
  • Organise and participate in expert meetings along with senior staff;
  • Maintain good relationships with policy-makers, stakeholders and the press;
  • Support and work closely with other staff members including policy, communications and campaigns colleagues.

Desired qualifications and experience:

  • Minimum 1 year of relevant experience in a similar role or EU institution;
  • A university degree in law, EU affairs, policy, human rights or related field or equivalent experience;
  • Demonstrable basic knowledge of and interest in platform regulation, data protection, privacy and copyright, as well as other internet policy issues;
  • Knowledge and understanding of the EU, its institutions and its role in digital rights policies;
  • Exceptional written and oral communications skills;
  • Basic knowledge of the functioning of the internet as well as basic technology skills are required; experience using free software such as LibreOffice, WordPress and Nextcloud is an asset;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Experience of organising events and/or workshops is an asset;
  • Ability to work in English. Other European languages, especially French, are an advantage.

What EDRi offers:
  • A permanent, full-time contract;
  • A friendly, multicultural and enthusiastic team of experts based in Brussels;
  • A competitive NGO salary with benefits;
  • The opportunity to promote the protection of fundamental rights in important legislative proposals;
  • Access to an international and diverse network of NGOs;
  • Internal career growth;
  • Networking opportunities.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in a single PDF file to applications <at> edri <dot> org with “Policy Advisor” and your full name in the subject line by 27 September 2019 (11.59 pm). Candidates will be expected to be available for interviews throughout October. The successful candidate is expected to start working with us on 15 November 2019.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive for a diverse and inclusive working environment and for a gender balance in the policy team. Therefore, we particularly encourage applications from individuals who identify as women. We also encourage individual members of groups at risk of racism or other forms of discrimination to apply for this post.

Please note that only shortlisted candidates will be contacted.

02 Sep 2019

How security policy hijacks the Digital Single Market

By Jan Penfrat

On 22 August, when Politico leaked an internal European Commission document that outlined policy plans for the upcoming mandate from all corners of the EU’s executive branch, Brussels was on high alert. Although the document is an internal note, not an official Commission position, it is not irrelevant: its purpose is to inform the incoming Commissioners about important policy dossiers and make recommendations about what to do with them. The digital policy chapters it contains are definitely informative.

You won’t be surprised to see that the Directorate‑General for Communications Networks, Content and Technology (DG CNECT), the Commission’s department for all things digital, focusses on completing the Digital Single Market and – perhaps most importantly – its own new pet project: the Digital Services Act (DSA).

The DSA will likely become the next major internet regulation battleground. It is set to regulate social media platforms and all kinds of other online services, including the legal responsibilities they will have for content uploaded by users.

Ill-conceived rules threaten people’s freedoms

For a start, DG CNECT formulates a number of positive ideas, for instance on the use of digital tech to improve Europe’s environmental footprint, or an ambitious regulatory framework for artificial intelligence with a strong focus on protecting fundamental rights. We welcome both proposals and encourage EU policy makers to work with civil society to achieve those goals, with the thorough debate needed and taking into consideration the tools we already have at hand, such as strongly enforcing the General Data Protection Regulation (GDPR) and adopting the ePrivacy Regulation.

In addition to this, the document includes a chapter on the planned Digital Services Act in which it suggests imposing a “duty of care” on online services that deal with illegal and “harmful content” on their systems. The stated goal is to tackle issues such as effective content moderation, online political advertising, disinformation, and the protection of minors.

While EDRi would prefer that the DSA regulates only illegal content, rather than vaguely defined “harmful” material, we’re glad to see that DG CNECT explicitly recognises the risk that “heavy, ill-conceived rules” pose for media pluralism, freedom of expression and other fundamental rights. This matters, because their colleagues over at the Directorate-General for Migration and Home Affairs (DG HOME) have a very different approach.

Burning the house to roast the pig

Technically, DG HOME is responsible for topics such as migration, human trafficking, cyber crime and terrorism. Yet the department dedicated more than a quarter of its policy idea space to DG CNECT’s Digital Services Act.

DG HOME calls its contribution “For a safe and responsible use of the internet”. According to its authors, today’s internet is mostly a lawless place used for “identity theft, ransomware, child pornography, incitement to terrorism, organising crime making use of more and more encrypted environment”.

To bring order to this online wild west, DG HOME’s internet specialists propose that, in the future, all platform companies should take “proactive measures”, also known as upload filters, to prevent the “criminal abuse of their services”. Sounds familiar? That’s because DG HOME’s “responsible use of the internet” looks dangerously similar to the general monitoring obligation of the new EU Copyright Directive. Apparently it needs to be emphasised, once again, that the European Court of Justice has consistently ruled that such general monitoring obligations violate our fundamental rights and are therefore illegal. Experts frequently add that, taken alone, filters are also ineffective in tackling illegal online content.

DG HOME’s proposal also includes an obligation for online platforms to shut down accounts that display illicit or harmful (yet legal) content such as disinformation — an idea that would turn companies into arbiters of truth.

Both DG CNECT and DG HOME have to consult each other and cooperate on files of shared interest; that is normal procedure. But the apparent attempt to hijack the framing around the DSA, before the legislative proposal is even written, is staggering. A Digital Single Market file with major fundamental rights implications should not be pushed into the security policy sphere, where it risks being abused to curtail people’s fundamental rights.

European Commission internal document on proposed priorities
https://www.politico.eu/wp-content/uploads/2019/08/clean_definite2.pdf

More responsibility to online platforms – but at what cost? (19.07.2019)
https://edri.org/more-responsibility-to-online-platforms-but-at-what-cost/

E-Commerce review: Opening Pandora’s box? (20.06.2019)
https://edri.org/e-commerce-review-1-pandoras-box/

Filters Incorporated (09.04.2019)
https://edri.org/filters-inc/

27 Aug 2019

E-Commerce review: Mitigating collateral damage

By EDRi

This is the third article in our series on Europe’s future rules for intermediary liability and content moderation. You can read the introduction here.

Asking social media and other platform companies to solve problems around illegal online content can have serious unintended consequences. It’s therefore crucial that new EU legislation in this field considers such consequences and mitigates any collateral damage.

With the adoption of the EU’s E-Commerce Directive in 2000, policymakers put the decision about what kind of content should be removed into the hands of hosting companies. According to this old law, hosting companies are obliged to remove illegal content as soon as they gain knowledge of it. In practice, this means that companies are forced to take a huge number of decisions on a daily basis about the legality of user-uploaded content. And, being commercial entities, they try to do it fast and cheaply. This has serious implications for our fundamental rights to freedom of expression and access to information.

What’s the problem?

So, what’s the problem? Shouldn’t platform companies be able to decide what can and cannot be posted on their systems? In principle, yes. The problem is that in many cases, the decision about the legality of a given piece of content is not straightforward. It often requires a complex legal analysis that takes into account local laws, customs, and context. Platform companies, however, have no interest in dealing with such complex (and therefore expensive) judgements – quite the opposite! As soon as businesses run the risk of being held liable for user-uploaded content, they have a strong commercial incentive to remove anything that could remotely be considered illegal – anywhere they operate.
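
That incentive can be captured in a toy calculation. The numbers below are invented for illustration only – no platform publishes such figures – but they show the asymmetry: removal is free and risk-free for the host, while keeping content carries a small chance of a large liability cost.

    # Toy model of a host's removal incentive; all numbers are invented
    # assumptions for illustration.
    def moderate(p_illegal: float,
                 fine: float = 100_000.0,   # hypothetical liability if kept
                 ad_value: float = 0.01) -> str:
        # Keep content only if its tiny ad revenue outweighs the
        # expected cost of being held liable for it.
        expected_cost_of_keeping = p_illegal * fine
        return "keep" if ad_value > expected_cost_of_keeping else "remove"

    # Even a one-in-a-million chance of illegality tips the scale:
    print(moderate(p_illegal=1e-6))  # -> "remove"

Under these made-up numbers, any upload with more than a one-in-ten-million chance of being illegal is rationally removed – exactly the over-removal dynamic described above.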

To make things worse, many platforms use broad and often vaguely worded terms of service. The opaque application of those terms has led to overly eager take-down practices at the expense of human rights defenders, artists, and marginalised communities. This was pointed out, for example, in a recent report by the Electronic Frontier Foundation (EFF), one of EDRi’s US-based members.

Human rights organisations, and especially those fighting for lesbian, gay, bisexual, transgender, queer and intersex (LGBTQI) rights, often face two problems on social media. On the one hand, their content is regularly taken down because of alleged breaches of terms of service – despite being completely legal in their country. On the other hand, they are faced with hateful comments and violent threats by other users that are often not removed by platforms. As the EFF report states: “Content moderation does not affect all groups evenly, and has the potential to further disenfranchise already marginalised communities.”

Wrongful take-downs are common

Because none of the big social media companies today include any statistical information about wrongful take-downs and removals in their transparency reports, we can only rely on publicly available evidence to understand the scale of this problem. The examples we know of, however, indicate that it is big. Here are some of them:

  • YouTube removed educational videos about the Holocaust, falsely classifying them as hate speech (Newsweek).
  • Facebook removed posts from Black Lives Matter activists, falsely claiming they amounted to hate speech (USA Today).
  • Twitter temporarily blocked the account of a Jewish newspaper for quoting Israel’s ambassador to Germany as saying he avoids contact with Germany’s right-wing AfD party. Twitter claimed the tweet qualified as “election interference” (Frankfurter Allgemeine).
  • Facebook removed posts and blocked accounts of Kurdish activists criticising the Turkish government, classifying them as hate speech and “references to terrorist organisations” (Buzzfeed).

Despite the numerous examples of failing filters, there seems to be an increasing belief among policymakers that algorithms and automatic filtering technologies can solve illegal content problems – without enough thought given to the harmful side effects.

Content filters are not the solution…

As EDRi has argued before, filters do fail: We’ve seen automated take-downs and blocking of public domain works, of satire and nuance, and even of works uploaded by legitimate rightsholders themselves. Also, filters are not as efficient as one might think. The Christchurch video streaming incident, for example, has shown that content filters based on hash databases can be easily circumvented by applying only minimal changes to the content.
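
Why minimal changes defeat such filters is easy to demonstrate. The sketch below assumes a filter that matches uploads against a database of exact cryptographic hashes; this is a simplification, since deployed systems often use perceptual hashes that tolerate some alterations, but those too can be defeated by cropping, mirroring or re-encoding. With exact hashes, changing a single byte produces a completely different fingerprint:

    # Minimal demonstration of exact-hash fragility. The byte strings are
    # placeholders standing in for real file contents.
    import hashlib

    flagged_upload = b"...frame data of a video flagged for removal..."
    # The same content after a trivial alteration (one byte changed),
    # e.g. the effect of re-encoding or a single-pixel edit:
    altered_upload = b"...frame data of a video flagged for removal..!"

    database = {hashlib.sha256(flagged_upload).hexdigest()}

    def filter_blocks(upload: bytes) -> bool:
        # True if the upload matches a known fingerprint in the database.
        return hashlib.sha256(upload).hexdigest() in database

    print(filter_blocks(flagged_upload))  # True: the exact copy is caught
    print(filter_blocks(altered_upload))  # False: one changed byte evades it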

The belief that big tech companies and their filter technologies can somehow magically solve all problems in society is not only misguided, it’s a threat to people’s fundamental rights. The lack of transparency and ineffectiveness of filters also means that the number of take-downs by the big platforms alone is a poor measure of success in the fight against illegal online content.

…but changing business incentives is.

Instead of mandating even more failing filter technology, EU legislation that aims at tackling illegal online content should focus on the root causes of the problem. In reality, many platform companies benefit from controversial content. Hateful tweets, YouTube videos featuring conspiracy theories and outright lies, and scandalous defamatory posts on Facebook are all a great way for platforms to drive “user engagement” and maximise screen time, which in turn increases advertisement profits. These commercial incentives need to be changed.

The fourth and last blog post of this series will be published shortly. It will focus on what platform companies should be doing, and what new EU legislation on illegal online content that respects people’s fundamental rights should look like.

Other articles of this series on the E-Commerce Directive review:
1. Opening Pandora’s box?
2. Technology is the solution. What is the problem?

26 Jul 2019

Job alert: EDRi is looking for a Senior Policy Advisor

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 digital human rights organisations from across Europe and beyond. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, personal data protection, freedom of expression, and access to information.

EDRi is looking for a talented and dedicated Senior Policy Advisor to join EDRi’s team in Brussels. This is a unique opportunity to be part of a growing and well-respected NGO that is making a real difference in the defence and promotion of online rights and freedoms in Europe and beyond. The deadline to apply is 15 September 2019. This full-time, permanent position offers a competitive salary and should be filled as soon as possible; the latest starting date is 15 October 2019.

Key responsibilities:

As a Senior Policy Advisor, your main tasks will be to:

  • Monitor, analyse and report about human rights implications of EU digital policy developments;
  • Advocate for the protection of digital rights, particularly but not exclusively in the areas of artificial intelligence, data protection, privacy, net neutrality and copyright;
  • Provide policy-makers with expert, timely and accurate input;
  • Draft policy documents, such as briefings, position papers, amendments, advocacy one-pagers, letters, blogposts and EDRi-gram articles;
  • Provide EDRi members with information about relevant EU legislative processes, coordinate working groups, help develop campaign messages, and provide the public with information about relevant legislative processes and EDRi’s activities;
  • Represent EDRi at European and global events;
  • Organise and participate in expert meetings;
  • Maintain good relationships with policy-makers, stakeholders and the press;
  • Support and work closely with other staff members including policy, communications and campaigns colleagues and report to the Head of Policy and to the Executive Director;
  • Contribute to the policy strategy of the organisation.

Desired qualifications and experience:

  • Minimum 3 years of relevant experience in a similar role or EU institution;
  • A university degree in law, EU affairs, policy, human rights or related field or equivalent experience;
  • Demonstrable knowledge of, and interest in data protection, privacy and copyright, as well as other internet policy issues;
  • Knowledge and understanding of the EU, its institutions and its role in digital rights policies;
  • Experience in leading advocacy efforts and creating networks of influence;
  • Exceptional written and oral communications skills;
  • IT skills; experience using free software and free/open operating systems, WordPress and Nextcloud is an asset;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Experience of organising events and/or workshops;
  • Ability to work in English. Other European languages, especially French, are an advantage.

What EDRi offers:

  • A permanent, full-time contract;
  • A dynamic, multicultural and enthusiastic team of experts based in Brussels;
  • A competitive NGO salary with benefits;
  • The opportunity to foster the protection of fundamental rights in important legislative proposals;
  • A high degree of autonomy and flexibility;
  • An international and diverse network;
  • Internal career growth;
  • Networking opportunities.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in .pdf format to applications <at> edri <dot> org with “Senior Policy Advisor” in the subject line by 15 September 2019 (11.59 pm). Candidates will be expected to be available for interviews throughout September.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive for a diverse and inclusive working environment and, ideally, for a gender balance in the policy team. Therefore, we particularly encourage applications from individuals who identify as women. We also encourage individual members of groups at risk of racism or other forms of discrimination to apply for this post.

Please note that only shortlisted candidates will be contacted.

26 Jul 2019

Diego Naranjo becomes EDRi’s new Head of Policy

By EDRi

European Digital Rights is happy to announce that – following an open recruitment process – Diego Naranjo will step up from his role as Senior Policy Advisor, and start his work as EDRi’s Head of Policy in September 2019.

In his new position, Diego will occupy a central role in our advocacy strategies. He will coordinate, design, and execute our action plan to reach policy goals, and ensure that the policy team’s workplan is in line with EDRi’s overall objectives. Diego will also provide support to the EDRi office team and EDRi members.

Diego joined EDRi in October 2014 and has been covering data protection, privacy and copyright-related work for EDRi. Previously, he gained international experience at the International Criminal Tribunal for the former Yugoslavia, the EU Fundamental Rights Agency (FRA) and the Free Software Foundation Europe (FSFE). At the national level, he worked as a lawyer in Spain, co-founded the Andalusian human rights organisation Grupo 17 de Marzo, and was appointed one of the seven members of the expert group on digital rights of the Spanish Ministry of Energy, Tourism and Digital Agenda between 2017 and 2018.

In his free time, Diego plays drums in a jazz trio and goes rock climbing.


23 Jul 2019

Your family is none of their business

By Andreea Belu
  • Today’s children have the most complex digital footprint in human history, with their data being collected by private companies and governments alike.
  • The consequences for a child’s future revolve around the freedom to learn from one’s mistakes, the reputational damage caused by past mistakes, and the traumatic effects of discriminatory algorithms.

Summer is that time of the year when parents get to spend more time with their children. Often enough, this also means children get to spend more time with electronic devices, their own or their parents’. Taking a selfie with the little one, or keeping them busy with a Facebook game or a YouTube animations playlist – these are examples of what makes the digital footprint of today’s child the largest in human history.

Who wants your child’s data?

Mobile phones, tablets and other electronic devices can open the door to the exploitation of data about the person using the device – how old they are, what race they are, where they are located, what websites they visit, and so on. Often enough, that person is a child. But who would want a child’s data?

Companies that develop “smart” toys are the first example. In the past year, they have been in the spotlight for excessively collecting, storing and mishandling minors’ data. Perhaps you still remember the notorious case of “My Friend Cayla”, the “smart” doll that was shown to record children’s conversations with it and share them with advertisers. In fact, the doll was banned in Germany as an illegal “hidden espionage device”. However, the list of “smart” technologies collecting children’s data is long. Another example of a private company mistreating children’s data was the case of Google offering its school products to young American students and tracking them across their different (home) devices to train other Google products. A German Data Protection Authority (DPA) decided to ban Microsoft Office 365 from schools over privacy concerns.

Besides private companies, state authorities also have an interest in recording, storing and using children’s online activity. For example, a 2018 Big Brother Watch report points out that in the United Kingdom the “Department for Education (DfE) demands a huge volume of data about individual children from state funded schools and nurseries, three times every year in the School Census, and other annual surveys.” Data collected by schools (a child’s name, birth date, ethnicity, school performance, special educational needs and so on) is combined with social media profiles or other data (e.g. household data) bought from data brokers. Why link all these records? Local authorities wish to focus on training algorithms that predict children’s behaviour in order to identify “certain” children prone to gang affiliation or political radicalisation.

Consequences for a child’s future

Today’s children have the biggest digital footprint in human history. Sometimes, the collection of a child’s data starts even before they are born, and this data will increasingly determine their future. What does this mean for kids’ development and their life choices?

The extensive data collection about today’s children aims at neutralising behavioural “errors” and optimising their performance. But mistakes are valuable during a child’s self-development – committing errors and learning lessons is an important complement to receiving knowledge from adults. In fact, a recent psychology study shows that failing to provide an answer to a test benefits the learning process. Constantly using algorithms to optimise performance based on a child’s digital footprint will damage the child’s right to make and learn from mistakes.


A child’s mistakes are not only a source of important lessons. With a rising number of attacks targeting schools’ IT systems, children’s data can end up in the wrong hands. Silly mistakes could then be used to damage the reputation of the future adult a child grows into. Some mistakes must be forgotten. Logging every step in a child’s development, however, increases the risk that past mistakes are later used against them.

Moreover, children’s data can contribute to them being discriminated against. As mentioned above, data is used to predict child behaviour, with authorities aiming to intervene where they consider it necessary. But algorithms reflect human biases, for example against people of colour. What happens when a child of colour is predicted to be at risk of gang affiliation? Reports show that authorities treat children in danger of being recruited by a gang as if they were already part of the gang. Racial profiling by algorithms can therefore turn into a traumatic experience for a child.

EDRi is actively trying to protect you and your loved ones

European Digital Rights is a network of 42 organisations that promote the respect of privacy and other human rights online.

Our free “Digital Defenders” booklet for children (available in many languages) teaches, in a fun and practical way, why and how to protect our privacy online. EDRi is also working on the ongoing reform of the online privacy (ePrivacy) rules. This reform has great potential to curb data exploitation practices online.

Read more:

Privacy for Kids: Your guide to Digital Defenders vs. Data Intruders (free download)
https://edri.org/papers

DefendDigitalMe: a call to action to protect children’s rights to privacy and family life.
https://defenddigitalme.com/

Blogpost series: Your privacy, security and freedom online are in danger (14.09.2016)
https://edri.org/privacy-security-freedom/

e-Privacy revision: Document pool (10.01.2017)
https://edri.org/eprivacy-directive-document-pool/

close