Security & Surveillance

While offering vast opportunities for exercising and enhancing fundamental rights, the digital environment also creates new opportunities to commit offences and to impose new restrictions on our rights online. Measures such as filtering, blocking and untargeted surveillance are often easy to implement and extremely difficult to rectify. EDRi therefore works to ensure that all security and surveillance measures are necessary, proportionate and based on solid evidence.

10 Apr 2019

Public campaigns on digital rights: Mapping the needs

By Claire Fernandez

In February 2019, the Digital Freedom Fund (DFF) strategy meeting took place in Berlin. The meeting was the perfect occasion for experts, activists, and litigators from the broad digital and human rights movement to explore ways of working together and of levelling up the field.

The group held discussions on several methods and avenues for social change in our field, such as advocacy and litigation. Public campaigning came up as an interesting option – many organisations want to achieve massive mobilisation, while few have managed to develop the tools and means needed for fulfilling this goal. One of the breakout group discussions therefore focused on mapping the needs for pan-European campaigns on digital rights.

First, we need to define our own way of doing campaigns, which might differ from that of other movements. A value-based campaigning method should look into questions such as: Who funds us? Do we take money from the big tech companies and, if so, under what conditions and up to what amount? Who do we partner with: a large, friendly civil society and industry coalition, or a restricted core group of digital rights experts? Do we pay for advertising campaigns on social media or do we rely on privacy-friendly mobilising techniques? It was agreed that being clear on how we campaign and what our joint message is are crucial elements for the success of a campaign. A risk-management system should also be put in place to anticipate criticism and attacks.

Second, proper field mapping is important. Pre- and post-campaign public opinion polls and focus groups are useful. Too often, we tend to go ahead with our own plans without consulting the affected groups, such as those affected by online hate speech, child abuse, and so on.

Third, unsurprisingly, the need for staff and resources was ranked as a priority. These include professional campaigners, support staff, graphic designers, project managers and coordinators, communication consultants and a central hub for a pan-European campaign.

Finally, we need to build and share campaigning tools that include visuals, software, websites, videos, celebrities and media contacts. Participants also mentioned the need for a safe communication infrastructure to exchange tools and coordinate actions.

At EDRi, all of the above resonates as we embark on the journey of building our campaigning capacity to lead multiple pan-European campaigns. For instance, one of the current campaigns we have been involved in − the SaveYourInternet.eu campaign on the European Union Copyright Directive − has revealed the importance of fulfilling these needs. Throughout this particular campaign, human rights activists have faced unprecedented accusations of being paid by Google and similar actors, and of being against the principle of fair remuneration for artists. Despite disinformation waves, distraction tactics and our small resources, the wide mobilisation of the public against problematic parts of the Directive, such as upload filters, has been truly impressive. We witnessed over five million petition signatures, over 170 000 protesters across Europe, dozens of activists meeting Members of the European Parliament, and impressive engagement rates on social media. The European Parliament vote in favour of the Copyright Directive, including its controversial articles, was won only narrowly (the procedural vote that could have opened the text to amendments was lost by just five votes), which shows the impact of the campaign.

The EDRi network and the broader movement need to learn lessons from the Copyright campaign and properly build our campaign capacity. EDRi started this process during its General Assembly on 7-8 April in London. The DFF strategy workshop held in Berlin gave us a lot of food for thought for this process.

This article was first published by Digital Freedom Fund (DFF): https://digitalfreedomfund.org/public-campaigns-on-digital-rights-mapping-the-needs/

(Contribution by Claire Fernandez, EDRi)

09 Apr 2019

Filters Incorporated

By Diego Naranjo

On 26 March 2019, the European Parliament (EP) adopted the new copyright Directive. The music industry and collecting societies celebrated it as a victory for authors and creators, despite actual authors (along with civil society groups) being worried about the outcome.

Article 17 of the Directive (referred to as Article 13 in the previous draft text) includes a change to platforms’ responsibility that will lead to the implementation of upload filters on a vast number of internet platforms. In effect, Article 17 represents a threat to our fundamental right to freedom of expression.

We tried hard to stop the legalisation of the first EU internet filter. Below is a summary of what happened.

It all started in 2002

EDRi has been involved in copyright discussions since the beginning of our network’s existence. We have promoted a positive agenda aimed at fixing the main problems within the existing framework, and supported a copyright reform that included the demand for authors and artists to receive fair remuneration for their work. We published handbooks and series of blog posts, responded to public consultations, spoke at numerous public events, and met with all key policy makers in Brussels and at national level. We participated in various joint actions and were involved in the inception and development of SaveYourInternet.eu along with Copyright for Creativity (C4C).

Civic engagement vs industry lobby

During the debates, the participation of individuals and civil society groups was crucial to counterbalance the massive lobbying efforts of industry. In July 2018, thanks to the pressure of thousands of people calling their Members of the European Parliament (MEPs), the European Parliament rejected the mandate to proceed with a flawed proposal. This gave us hope that citizens’ voices can be heard, if we shout loud enough.

During the Copyright Action Week in March 2019, ahead of the final vote on the Directive in the European Parliament, a team of 17 people from all across Europe made it all the way to Brussels and Strasbourg. They put their studies or jobs on hold for a few days to meet their elected representatives and make a final push to delete upload filters from the copyright Directive. We were impressed by their dedication and their thorough knowledge of the consequences Article 13 could have for the internet. In addition, hundreds of thousands of people took to the streets across Europe to protest against upload filters.

The latest actions taken by all of those opposing internet filters were not in vain. In the vote adopting the Directive on 26 March, 55 MEPs who had supported Article 17 (former Article 13) in September 2018 changed their position and were willing to delete it from the final text of the Directive. The deletion could have happened through an amendment proposed by several MEPs. For this amendment to be adopted and Article 17 deleted, a vote on whether the text should first be opened to amendments took place during the March 2019 plenary.

The vote: Blue pill, or red pill?

On 26 March, the possibility to have a discussion on the amendments to remove Articles 11 and 13 (15 and 17 in the final text) was voted down by a margin of five votes. Thirteen MEPs later stated that they had intended to open the debate on removing both Articles, but were confused by a last-minute change in the order of the votes and the obvious lack of clarity with which this procedural vote was introduced, and so failed to vote “yes”. Their votes have been corrected in the records only; the correction does not affect the actual result. After this “mistake”, which made it impossible for MEPs to vote on deleting Article 13/17, the text of the Directive (including Article 13/17) was adopted with 338 votes in favour, 283 against, 36 abstentions and 93 MEPs not attending the session.

Despite some policy-makers repeatedly stating that the Directive would not lead to upload filters, it turned out it was all about filters. The day after the Directive was adopted, France hurried to declare that it would ensure that “content recognition technologies” would be a key aspect of the upcoming laws implementing the Directive.

With the adoption of Article 17 as part of the Copyright Directive, the European Union is setting a terrible precedent for the rest of the world by encouraging the implementation of upload filters. Introduced under the pretext of fighting copyright infringement, filters are now also being discussed in the framework of online “terrorist content”.

Next steps: EU Council and implementation

The final vote in the Council of the European Union, where EU Member States are represented, is scheduled for 15 April. This is traditionally a merely procedural vote – after all, the Council already agreed, before the European Parliament’s final vote, to the text on which it will be voting. However, this is technically the last chance to get rid of the upload filters. Blocking the text requires a minority of at least four Member States representing more than 35% of the EU population. If the Member States currently opposing the “censorship machine” (Finland, Luxembourg, Poland, the Netherlands, Italy and perhaps Sweden) remain on the side of their citizens, the only glimmer of hope is that one more country, representing around 9,5% of the population of the whole EU, also rejects the text. Of the countries large enough to make the difference (Germany, France, Spain), the only realistic candidate is Germany. Will the German government respect the coalition agreement that rules out upload filters? Will other EU countries stand up for their citizens, taking into consideration the upcoming European Parliament (and some national) elections? We’ll find out soon.
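As a rough illustration of that arithmetic, the sketch below checks how far the opposing bloc falls short of the 35% blocking-minority threshold set by the Council’s voting rules. The population figures are approximate 2019 values and are used purely for the sake of the example, not as an official calculation.

```python
# Back-of-the-envelope check of the Council blocking-minority arithmetic described above.
# Assumption: approximate 2019 populations (EU-28 total of roughly 513 million people);
# the 35% threshold comes from the Council's qualified-majority voting rules.

EU_POPULATION_M = 513.0  # approximate EU-28 population, in millions

# Member States reported as opposing the text, with approximate populations (millions)
opposing = {
    "Finland": 5.5,
    "Luxembourg": 0.6,
    "Poland": 38.0,
    "Netherlands": 17.3,
    "Italy": 60.4,
    "Sweden": 10.2,
}

# The three Member States mentioned above as large enough to tip the balance
large_states = {"Germany": 83.0, "France": 67.0, "Spain": 46.9}

BLOCKING_SHARE = 35.0  # % of EU population a blocking minority must exceed

opposing_share = sum(opposing.values()) / EU_POPULATION_M * 100
gap = BLOCKING_SHARE - opposing_share

print(f"Opposing bloc: ~{opposing_share:.1f}% of the EU population")
print(f"Additional share needed to block: ~{gap:.1f}%")
for country, pop in large_states.items():
    print(f"{country}: ~{pop / EU_POPULATION_M * 100:.1f}%")
```

With these approximate figures, the opposing bloc falls roughly nine to ten percentage points short of the threshold, which is why only the largest remaining Member States could realistically tip the balance.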

Should the copyright Directive become law, civil rights groups are set to challenge upload filters during the national implementation phase. Planned actions include potential referrals to the Court of Justice of the European Union (CJEU).

Read more:

Censorship machine takes over EU’s internet (26.03.2019) https://edri.org/censorship-machine-takes-over-eu-internet/

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

(Contribution by Diego Naranjo, EDRi)

08 Apr 2019

Terrorist Content Regulation: Successful “damage control” by LIBE Committee

By Chloé Berthélémy

Today, on 8 April 2019, the European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE) adopted its Report on the proposed Regulation for moderation of terrorist content online.

Released by the European Commission in September 2018, the proposal was warmly welcomed in the Council of Member States, which rapidly concluded a political agreement a few months later. Stronger reservations were, however, expressed in the different Committees in charge of the file in the European Parliament, which led to substantial changes to the Commission’s original proposal.

The most critical points for the protection of fundamental rights concerning the proposed Regulation were taken on board by the LIBE Committee in its Report:

  • The definitions of “terrorist content” and “hosting service providers” are clarified and brought in line with the counter-terrorism acquis. Exceptions are provided for educational, journalistic or research material, and LIBE has limited the scope of the Regulation to cover only hosting service providers that make content available to the public at the application layer, leaving out infrastructure providers as well as cloud and messaging services.
  • Amendments to the first instrument, removal orders, require that a single judicial or functionally independent administrative competent authority should be appointed. Unfortunately, the one-hour time frame to respond to removal orders, which is simply not feasible for smaller service providers with limited capacities, was not changed by LIBE, despite the blatant lack of evidence supporting this deadline.
  • The possibility for national authorities to refer content to service providers for deletion on the basis of their terms and conditions is now removed from the text. This is a major step forward because this instrument would amount to increased online policing by platforms and a circumvention of legal safeguards attached to removal orders in order to tackle content that is not illegal.
  • The LIBE Committee also deleted the obligation to take proactive measures, which would have involved the use of automated tools such as upload filters. The Parliament thus clearly reasserts the prohibition on obliging platforms to generally monitor the user-generated content they host on their services (Article 15 of the e-Commerce Directive).
  • Lastly, the principles of the rule of law and the protection of fundamental rights are substantiated with additional transparency requirements falling on competent authorities and stronger redress mechanisms for both hosting service providers and content providers.

Overall, EDRi appreciates that the LIBE Committee heard the criticism voiced by the European Parliament’s Committees on Internal Market and Consumer Protection and on Culture and Education, by three United Nations Special Rapporteurs, and by a large number of civil society groups.

Next steps

After the European Parliament elections in May 2019, and once a new European Commission has been set up, the text will be subject to several rounds of trilogue negotiations between the Parliament, the Council and the Commission. These closed-door meetings aim at finding a middle ground between the diverging positions of the three negotiators. Considering that the Council position did not depart much from the Commission’s proposal, there is a significant risk that the “damage control” conducted by the Parliament will be partly rolled back in the next phase of the policy-making process.

Terrorist Content Regulation: Document pool
https://edri.org/terrorist-content-regulation-document-pool/

FRA and EDPS: Terrorist Content Regulation requires improvement for fundamental rights (20.02.2019)
https://edri.org/fra-edps-terrorist-content-regulation-fundamental-rights-terreg/

Terrorist Content Regulation – prior authorisation of all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

EU’s flawed argument’s on terrorist content give big tech more power (24.10.2018)
https://edri.org/eus-flawed-arguments-on-terrorist-content-give-big-tech-more-power/

Press Release: EU Terrorism Regulation – an EU election tactic (12.09.2018)
https://edri.org/press-release-eu-terrorism-regulation-an-eu-election-tactic/

(Contribution by Chloé Berthélémy, EDRi)

27 Mar 2019

EU Council Presidency outlines future counter-terrorism priorities

By Statewatch

A note produced by the Romanian Presidency of the Council of the European Union sets out the EU’s response to terrorism since 2015. It highlights the main measures adopted and calls for a “reflection process on the way forward” in a number of areas including “interoperability and extended use of biometrics”; implementing the EU Passenger Name Record (PNR) Directive and possibly extending its scope beyond air travel; and “synergies” between internal and external policies.

The issues highlighted in the document were discussed by the Justice and Home Affairs (JHA) Council on 7-8 March. It was noted that “the process of reflecting on the way forward will continue at technical level”.

On the issue of “interoperability and extended use of biometrics”, the paper says (emphasis added):
“The package on interoperability should be fully implemented. Existing databases should be filled with good quality data, and tools (such as biometrics and facial recognition) should be improved to enable querying with data across more EU information systems. All relevant competent authorities in the CT [counter-terrorism] area should have direct access to relevant information systems (notably SIS II and Prüm) to avoid information and security gaps. Connecting more systems could be explored in parallel to implementation.”

This implies an appetite for further expanding the interoperability initiative before there has been any opportunity to fully assess how it functions in practice – despite serious data protection and privacy concerns raised by specialists and some Members of the European Parliament (MEPs).

The Council and Parliament recently provisionally agreed a text on two key Regulations underpinning the interoperability plans.

Regarding PNR, the note recalls the importance of all Member States fully implementing the EU PNR Directive, agreed in 2016, and says:
“The collection and processing of PNR data is crucial to detect, prevent and prosecute terrorist offences, and the effective connection of the PIUs of the Member States for information exchange is a priority. The further broadening of the scope of PNR (to other means of transportation) could be explored.”

Regarding internal-external “synergies”, the Presidency highlights:
“The nexus between internal and external security has become increasingly prominent, and progress has been made in better connecting the two areas. Together with the Commission, the EEAS [European External Action Service] and the EU CTC [Counter-Terrorism Coordinator], the Presidency is further exploring ways to strengthen the links between the external and internal dimensions of security in relation to CT [counter-terrorism]. This includes focusing on the use of internal instruments to promote EU security interests related to CT in priority third countries (e.g. Western Balkans, Turkey and the MENA [Middle East and North Africa] region)…”

Other ongoing work outlined in the document concerns “violent extremism and radicalisation”; data retention; the financing of terrorism; “chemical, biological, radiological and nuclear (CBRN) risks, in particular chemical risks”; cooperation between EU agencies; and “emerging threats”:
“Evolving technologies such as UAVs (unmanned aerial vehicles), artificial intelligence (AI), blockchain or the Internet of Things, could be misused by terrorist groups. Tackling these threats requires high-tech expertise, meaning that more efforts at national and EU level are required to address the emerging threats, including through public-private partnerships and research and development. At the same time, the opportunities of the new technologies for security need to be explored and mobilised.”

The document also includes a list of adopted counter-terrorism measures, measures awaiting formal adoption, and measures under discussion.

Biometrics, extended travel surveillance, internal-external “synergies”: Presidency note outlines future counter-terrorism priorities
http://www.statewatch.org/news/2019/mar/eu-terrorism-doc.htm

EU response to terrorism – state of play and way forward (28.02.2019)
http://www.statewatch.org/news/2019/mar/eu-council-6664.pdf

(Contribution by EDRi member Statewatch, the United Kingdom)

27 Mar 2019

Google fined 1,5 billion euro for abusive online ad practices

By Jan Penfrat

On 20 March, the European Commission imposed yet another massive fine, of 1,5 billion euro, on Google. The Commission’s Directorate-General for Competition stated that the data company had abused its dominant position in the online advertising market by imposing restrictive contracts on third-party websites that prevented rivals from placing their search adverts on those websites.

Competition Commissioner Margrethe Vestager said that “Google has cemented its dominance in online search adverts and shielded itself from competitive pressure”. According to her findings, Google’s misconduct lasted over ten years and prevented other companies from competing in the ad market.

The fine was imposed for the way Google uses its “AdSense for Search” product, which delivers online ads to large third-party websites, such as newspapers and travel sites, that embed Google Search into their online presence. This embedding took place via agreements, according to the Commission’s press release. Vestager’s team says it “reviewed hundreds of such agreements in the course of its investigation”. What they found is quite alarming: apparently, from 2006 onwards, Google’s agreements prohibited publishers from placing search ads from competitors on their search result pages. This was later replaced with a clause reserving the most valuable ad space for Google’s ads and requiring that any changes publishers wanted to make be pre-approved by Google.

Google hasn’t denied the charges. In a press statement, Senior Vice President of Global Affairs, Kent Walker, said: “We’ve always agreed that healthy, thriving markets are in everyone’s interest. We’ve already made a wide range of changes to our products to address the Commission’s concerns. Over the next few months, we’ll be making further updates to give more visibility to rivals in Europe.”

Although Google ceased those practices a few months after the Commission issued a so-called statement of objections in July 2016, the EU authority still decided to impose this fine, which represents 1,29% of Google’s turnover in 2018. It follows two previous Commission decisions imposing fines of 4,3 billion euro in 2018 and 2,4 billion euro in 2017 for the abuse of dominant positions in mobile and shopping search. Google is currently appealing both decisions in court.

Fines such as this one are paid into the general EU budget and will be deducted from next year’s Member State contributions to the EU budget. The fines therefore co-finance operations of the EU. The Commission’s Directorate-General for Competition is probably the only part of the EU administration that regularly makes more money than it costs.

European Commission Press release: Antitrust: Commission fines Google €1.49 billion for abusive practices in online advertising (20.03.2019)
http://europa.eu/rapid/press-release_IP-19-1770_en.htm

Google hit with €1.5 billion antitrust fine by EU (20.03.2019)
https://www.theverge.com/2019/3/20/18270891/google-eu-antitrust-fine-adsense-advertising

(Contribution by Jan Penfrat, EDRi)

27 Mar 2019

New freedom of information law proposed in North Macedonia

By Metamorphosis

The right to freedom of information (FOI) has been protected by law in North Macedonia since 2006. In theory, the law complies with international standards and creates a solid basis for establishing a system to protect this right. However, practice over the past 12 years has revealed legal gaps, poor practices, and the inefficiency of the national authority in implementing the law.

The urgent reform priorities set by the European Union in 2015 as preconditions for North Macedonia’s accession to the EU specifically require that the government fundamentally improve access to information. Some improvements were made: active transparency was advanced by declassifying and publishing documents online and by allowing access to data on the spending of public money.

Meanwhile, the Commission for the Protection of the Right to Free Access to Public Information (KOMSPI), which is in charge of monitoring the implementation of the law, has not been functioning. A huge backlog of unresolved complaints is awaiting resolution, because the parliament has failed to appoint new commissioners and replenish its ranks.

In December 2017, an initiative for a new FOI law was launched. After a year and a half, the citizens of North Macedonia finally received the proposed text of the new law.

EDRi member Foundation for Internet and Society – Metamorphosis endorses the process of passing the new Law on Free Access to Public Information, which would provide more efficient protection of the fundamental right to access information.

With regard to specific provisions of the proposed text, Metamorphosis suggests the following:

  • Article 1, paragraph 1: Defining political parties as holders of public information with regard to their income and expenditure is one of the key positive novelties of the Law on Free Access to Public Information. Metamorphosis believes that the funding of political parties should be considered public information in order to increase transparency regarding the parties’ spending of public money.
  • Article 3, paragraph 1, indent 7: The draft text attempts to define the cases in which access to information would be of public interest by establishing a fixed list of criteria. Metamorphosis does not recommend the use of a restricted list to define public interest, since a narrow definition bears the risk of limiting the exercise of the right to access information. To avoid such a limited definition, we suggest introducing a mandatory injury test to assess whether a public interest exists when information is requested, rather than defining it by law.
  • Article 10: Metamorphosis deems the definition of public information detailed and considers that it provides legal certainty for holders of public information. In addition, given the scale of the information concerned, its availability on websites should help reduce the number of access requests, thereby giving holders the opportunity to be more efficient in fully implementing the law.
  • Article 21, paragraph 1: Shortening the deadline by which holders need to respond to a request from 30 to 20 days is a change that Metamorphosis believes will not drastically contribute to a better implementation of the law, especially when journalists request public information. Additionally, in its 2017-2022 work plan, the Government of the Republic of North Macedonia states that it will implement the open government concept in full to further increase transparency, and that it will propose amendments halving the deadline for responding to public information requests from 30 to 15 days, as recommended in the Open Government Partnership plan.
  • Article 31: Metamorphosis deems positive the change of status of the authority responsible for implementing the Law on Free Access to Public Information, from a commission, as a collective body, to an agency, as an independent body, especially when it comes to conducting complaint procedures.

The positions listed above were defined following a public debate held in the Assembly of the Republic of North Macedonia. At the moment, the Parliament is working on amendments, and the final text is expected to be presented to Members of the Parliament soon.

Urgent Reform Priorities for Macedonia, European Commission, Directorate-General for Neighbourhood and Enlargement Negotiations https://eeas.europa.eu/sites/eeas/files/urgent_reform_priorities_en.pdf

Final report from the monitoring of the implementation of the reform priorities in the field of media for the period 01.07.2017 – 30.9.2018
http://mediaobservatorium.mk/wp-content/uploads/2018/11/OMR_zavrsen_izveshtaj-EN-1.pdf

Flooded with 500 complaints, Commission for Free Access to Information awaits final members
http://meta.mk/en/flooded-with-500-complaints-commission-for-free-access-to-information-awaits-final-members/

The Commission for Protection of the Right to Free Access to Public Information of the Republic of North Macedonia
http://komspi.mk/en/

(Contribution by Foundation for Internet and Society – Metamorphosis, North Macedonia)

27 Mar 2019

GDPR incompatibility – the blind spot of the copyright debate

By Chloé Berthélémy

The debate around the Copyright Directive reform has been intense. Former Article 13, which became Article 17 in the text voted by the European Parliament on 26 March, created the greatest controversy, with stakeholders arguing about the so-called “value gap” in the creative sectors, upload filters, and a new platform liability regime, among other issues. However, few observers have analysed the impact of Article 13/17 on the General Data Protection Regulation (GDPR). On 23 March, Dr. Malte Engeler, a German judge, published an article explaining why the filtering technology required by the Copyright Directive might be incompatible with European data protection rules.

Article 13/17 requires content hosting providers to make their best efforts to prevent the upload or re-upload of copyright-protected works – which can only be achieved with upload filters – except where the use is covered by specific copyright exceptions such as quotation, criticism or parody. For filters to function properly while taking those exceptions into account, they would need to recognise the context of the upload, that is to say, the information surrounding the content, including personal data of the user uploading it. The question Engeler asks is: under which GDPR legal basis would platforms be able to process such personal data?

According to Engeler, platforms would be considered controllers in the sense of the GDPR because they decide which technologies they will use to monitor content. When analysing a film extract uploaded without authorisation, a filter would need to know whether it was used by a film critic – which would be legal according to the copyright exceptions listed in Article 13/17 – or by a user attempting to illegally distribute the film. Detecting such differences in the use of the same piece of content would depend on “meta information about the upload” such as the user identity, the place, and the date. This information would be considered personal data, and its analysis by the algorithm would be processing under GDPR.
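As a purely illustrative sketch of this point, the toy filter below cannot decide on the content fingerprint alone: to honour the copyright exceptions it has to evaluate metadata about the upload, which is exactly the personal data at issue. The class, field names, reference set and decision heuristics are hypothetical and do not describe any real filtering product.

```python
# Minimal, hypothetical sketch of a context-aware upload filter. All names and
# heuristics are invented for illustration; no real filter works exactly like this.
from dataclasses import dataclass

@dataclass
class Upload:
    content_fingerprint: str  # hash matched against a rights-holder reference database
    uploader_id: str          # personal data (GDPR Article 4(1))
    uploader_profile: str     # e.g. "film critic" or "unknown" - also personal data
    upload_date: str          # personal data once linked to an identified user
    surrounding_text: str     # caption or review accompanying the clip

PROTECTED_FINGERPRINTS = {"abc123"}  # stand-in for a rights-holder reference file

def filter_decision(upload: Upload) -> str:
    """Toy decision logic: whether to block depends on context, not just on the match."""
    if upload.content_fingerprint not in PROTECTED_FINGERPRINTS:
        return "allow"  # no match against protected works
    # To apply the quotation/criticism/parody exceptions, the filter has to process
    # personal data describing who uploaded the clip and in what context.
    if upload.uploader_profile == "film critic" and "review" in upload.surrounding_text.lower():
        return "allow (quotation/criticism exception)"
    return "block"

print(filter_decision(Upload("abc123", "user42", "film critic", "2019-03-26",
                             "My review of the film, with a short excerpt")))
```

Every branch of this toy decision depends on fields that identify or relate to the uploader, which is why the analysis turns to the legal bases for processing in Article 6(1) GDPR.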

The article goes on to examine the legal bases provided for in the GDPR (Article 6(1)) under which such processing could be allowed. Consent could not be freely given, because all platforms would be required to have this processing in place, leaving users with no alternative. Making upload filters part of the terms and conditions would not satisfy the necessity criterion of paragraph 1(b), which allows the processing of personal data for the performance of a contract. Furthermore, the processing of personal data by content filters is neither necessary to protect the user’s vital interests, nor is it done for public or legitimate interests pursued by the platform – platforms do not want an obligation to put filters in place. This leaves platforms with the legal basis whereby processing is necessary for compliance with a legal obligation (paragraph 1(c)), which here would be compliance with the copyright Directive.

However, considering the high risk of liability, smaller platforms will likely have to implement third party filters, bought as a service from bigger companies that have invested tens of millions of euros in such technologies. As a result, few big content filtering companies will be able to process the above-mentioned personal data of the vast majority of users. The new copyright Directive would thus lead to centralised filtering mechanisms.

This is problematic with regard to the principle of proportionality enshrined in the GDPR and in the Charter of Fundamental Rights of the European Union. Such a filtering system has already been rejected by the Court of Justice of the European Union (CJEU) because it failed to strike a fair balance “between the right to intellectual property, on the one hand, and the freedom to conduct business, the right to protection of personal data and the freedom to receive or impart information on the other”. The legal obligation that Article 13/17 creates for platforms is thus incompatible with the right to the protection of personal data, which makes it hard to rely on as a basis for processing personal data under the GDPR.

Copyright Directive: Does the best effort principle comply with GDPR? (23.03.2019)
https://www.telemedicus.info/article/3402-Copyright-Directive-Does-the-best-effort-principle-comply-with-GDPR.html

Press Release: Censorship machine takes over EU’s internet (26.03.2019)
https://edri.org/censorship-machine-takes-over-eu-internet/

SABAM vs Netlog – another important ruling for fundamental rights (16.02.2012)
https://edri.org/sabam_netlog_win/

All you need to know about copyright and EDRi (15.03.2019)
https://edri.org/all-you-need-to-know-about-copyright-and-edri/

(Contribution by Chloé Berthélémy, EDRi)

26 Mar 2019

Press Release: Censorship machine takes over EU’s internet

By EDRi

Today, on 26 March, the European Parliament voted in favour of adopting controversial upload filters (Article 13/17) as part of the copyright Directive. This vote comes after what was an intense campaign for human rights activists, with millions of signatures, calls, tweets and emails from concerned individuals, as well as Europe-wide protests.

Despite the mobilisation, 348 Members of the European Parliament (MEPs) gave their support to the proposed text, which includes concerning restrictions on freedom of expression. Notably, 274 MEPs stood with citizens and voted to reject upload filters. The proposal to open the text for amendments was rejected by a margin of five votes. The amendments proposing the deletion of Article 13 were therefore not even subject to a vote.

Article 13 of the copyright Directive contains a change to internet hosting services’ responsibility that will necessarily lead to the implementation of upload filters on a vast number of internet platforms. With its dangerous potential for automated censorship mechanisms, online content filtering could be the end of the internet as we know it.

“Disappointingly, the newly adopted Directive does not benefit small independent authors, but instead, it empowers tech giants. More alarmingly, Article 13 of the Directive sets a dangerous precedent for internet filters and automatised censorship mechanisms – in the EU and across the globe,” said Diego Naranjo, Senior Policy Advisor at EDRi.

European Digital Rights (EDRi) has long advocated for a copyright reform that would update the current EU copyright regime to make it fit for the digital era and ensure that artists receive remuneration for their work and creativity. This Directive delivers neither.


EU Member States will now have to transpose the Directive into their national laws and decide how strictly they will implement upload filters. People need to pay special attention to the national-level implementation of the Directive in order to ensure that the voted text does not enable censorship tools that restrict our fundamental rights.

Ahead of the next European Parliament elections, this vote comes as another important reminder of the impact that EU law-making can have on human rights online and offline. EDRi ensures the voice of civil society is represented in the EU democratic process and would like to thank all those involved in the battle against upload filters for their inspiring dedication towards the defence of fundamental rights and freedoms.

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

All you need to know about copyright and EDRi (15.03.2019)
https://edri.org/all-you-need-to-know-about-copyright-and-edri/

21 Mar 2019

Join the ultimate Action Week against Article 13

By Andreea Belu

The final vote on the Copyright Directive in the European Parliament plenary will take place on 26 March. A key part of the proposal raising concerns is Article 13. It contains a change to platforms’ responsibility that will inevitably lead to the implementation of upload filters on a vast number of internet platforms. The proposed text of Article 13 on which the Parliament will be voting is the worst we have seen so far.

Public outcry around Article 13 has reached a historic peak, with almost five million individuals signing a petition against it, and thousands calling, tweeting and emailing their Members of the European Parliament (MEPs). Despite the scale of the protests, legislators have failed to address the problems and remove upload filters from the proposal.

Join the Action Week (20 March – 27 March) organised by the free internet community and spread the word about the #SaveYourInternet movement! Send Members of the European Parliament a strong message: “Side with citizens and say NO to upload filters!”

NOW – Get active!

Kickstart the action week! Did you get your MEP to pledge opposition to the “Censorship Machine” during the plenary vote? Did you reach out to a national news outlet to explain to them why this is bad for the EU? Did you tell your best mate your meme game may be about to end? If you answered “No” to any of those questions… NOW IS THE TIME TO ACT.

21 March – Internet blackout day

Several websites are planning to shut down on this day. Wikimedia Germany is one of them. Is your website potentially hosting copyrighted content, and therefore affected by the upcoming copyright upload filter? Join the protest!
#Blackout21

23 March – Protests all over Europe

Thousands have marched in the streets in recent weeks. The protests were fuelled not least by the European Commission’s allegations that the #SaveYourInternet movement was driven by bots, by purposely misleading communication from the European Parliament, and by the attempt to rush the final vote weeks ahead of the original schedule. 23 March will be the general protest day – see a map here. Commit to the EU’s core democratic values and show what positive citizens’ engagement looks like!
#Article13Demo #Artikel13Demo

19 to 27 March – Activists travel to meet their MEPs

We have launched a travel grant for activists willing to travel to Strasbourg and Brussels to discuss the Directive with their representatives. Do you want to take part in our final effort to get rid of mandatory upload filters? Join us! The deadline to apply is Friday 15 March.
#SYIOnTour

It is very important that we connect with our MEPs and make our concerns heard every day of the Action Week. Whether you can travel, make phone calls to get in touch with your representatives, or raise awareness in your local community – it all makes a huge difference. Build on the voices of internet luminaries, the UN Special Rapporteur on Freedom of Expression, civil society organisations, programmers, and academics who have spoken out against Article 13!

We need to stop the censorship machine and work together to create a better European Union! You can count on us! Can we count on you?

Read more

Save Your Internet Campaign website
https://saveyourinternet.eu/

Pledge 2019 Campaign Website
https://pledge2019.eu/en

Upload Filters: history and next steps (20.02.2019)
https://edri.org/upload-filters-status-of-the-copyright-discussions-and-next-steps

18 Mar 2019

Open letter: Regulation on terrorist content online endangers freedom of expression

By EDRi

On 18 March 2019, together with seven other organisations, EDRi sent a letter to Members of the European Parliament (MEPs), to share our concerns with regards to the draft Regulation on preventing the dissemination of terrorist online content.

The European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE) is set to vote on its Report on the draft Regulation on 21 March. If the original Commission proposal is not seriously re-drafted, it could have major impacts on civil liberties online.

You can read the letter here (pdf), and below:

Brussels, 18 March 2019

Dear Members of the European Parliament,

We, the undersigned organisations, would like to express some of our views on the draft Regulation on preventing the dissemination of terrorist content online published in September 2018, ahead of a key vote in the Civil Liberties Committee.

We believe that illegal terrorist content is unequivocally unacceptable offline and online. While we understand the aim of the draft Regulation, we regret that the approach taken by the European Commission and the Council of the European Union did not address the most pressing concerns we share on this text, such as the wide definitions of terrorist content and of hosting service providers falling within the scope of the Regulation, the introduction of unworkable deadlines for content removal and mandatory “proactive measures”. These requirements could necessitate the introduction of upload filters and therefore potentially lead to removal of legal content. Far from helping private and public actors curb the dissemination of terrorist propaganda online, this draft Regulation risks undermining current efforts and could have a strong impact on European citizens’ fundamental rights.

Similar concerns on the provisions of this draft Regulation have been expressed by international institutions, including the EU Fundamental Rights Agency (FRA), the three UN Special Rapporteurs in a joint opinion and the European Data Protection Supervisor (EDPS).

We therefore urge the Civil Liberties Committee to take a proportionate approach compliant with the EU Charter of Fundamental Rights and the EU acquis, by:

  • Ensuring that the definition of terrorist content is aligned with the Terrorism Directive, and that the dissemination of such content is directly linked to the intent of committing terrorist offences.
  • Narrowing the definition of terrorist groups to cover only those terrorist groups listed by the United Nations and the European Union.
  • Limiting the definition of hosting services to services where a proven risk of propagation of terrorist content to the general public exists i.e. the scope should exclude services such as Cloud Infrastructure, Internet Infrastructure and Electronic Communication Services.
  • Amending the extremely short one-hour deadline to comply with removal orders, which would lead to over-removal of legal content online and is unworkable for many enterprises.
  • Ensuring that referrals are deleted from the proposal or substantially modified so they do not lead to private companies bearing the burden of deciding the legality of content instead of the judicial authorities in Member States.
  • Clearly aligning the proposal with the e-Commerce Directive, ensuring that any additional measures as drafted in Article 6 are not “proactive measures” which consist, directly or indirectly, of implementing mandatory filtering mechanisms thus inadvertently introducing a general monitoring obligation.
  • Ensuring that removal orders follow robust and accountable procedures and are issued by a single independent competent authority per Member State.
  • Including adaptable provisions for different types of companies and organisations.

Sincerely,
Access Now –
https://www.accessnow.org/
Allied for Startups –
https://alliedforstartups.org/
Computer & Communications Industry Association (CCIA) –
https://www.ccianet.org
Center for Democracy and Technology (CDT) –
https://cdt.org/
CISPE.cloud, representing Cloud Infrastructure Service Providers in Europe –
https://cispe.cloud/
EDiMA –
http://edima-eu.org
EDRi –
https://edri.org/
EuroISPA, the pan-European association of Internet Services Providers Associations –
https://www.euroispa.org
Free Knowledge Advocacy Group EU –
https://wikimediafoundation.org/

Open letter to the European Parliament on terrorist content online (18.03.2019)
https://edri.org/files/counterterrorism/20190318-TerroristContentRegOpenLetter.pdf

Terrorist Content Regulation: Document Pool
https://edri.org/terrorist-content-regulation-document-pool/
