12 Jul 2017

Commission Report on child protection online lacks facts and evidence

By Maryant Fernández Pérez

In December 2016, the European Commission issued two reports on the implementation of the Directive on combating the sexual abuse and sexual exploitation of children and child pornography (Child Exploitation Directive, 2011/93/EU): a general report and a specific report about Article 25 of the Directive, which covers the removal and blocking of child abuse, child exploitation and child pornography material online.

Support our work - make a recurrent donation! https://edri.org/supporters/

The publication of the implementation report was delayed for a year. Despite this additional time that the European Commission could have spent on preparing its reports, it has produced unquestionably inadequate documents. In particular, the report on Article 25 fails for at least five reasons:

  1. The Commission provides hardly any statistics. For example, it does not indicate how frequently law enforcement authorities take action after content is reported, numbers of takedowns, speed of processing reports of possibly illegal material, delays in takedowns due to ongoing investigations, the number of websites appearing in blocking lists, the technologies used for blocking, the length of time sites stay on the blocking lists, or the location of sites on the blocking lists.
  2. The report does not address state inaction following content restrictions. These restrictions only solve a small part of the serious problem that needs to be addressed.
  3. The Commission identified public authorities as being crucial to combat child abuse, exploitation and pornography. However, it lists many procedures where no single public authority is involved. This demonstrates a tendency of the Commission to focus on superficial technical “solutions” rather than addressing the crimes in a more meaningful and serious way.
  4. The Commission fails to acknowledge that blocking content can be circumvented. It also fails to report on content that was mistakenly restricted. The Commission failed to recognise that removal of content at source, as part of more comprehensive judicial and law enforcement actions, should be the preferred option, rather than simply trying to block it.
  5. The Commission claims that voluntary measures to tackle child abuse, exploitation and pornography can comply with the Charter of Fundamental Rights of the European Union. With regard to blocking measures, the Commission report identified four safeguards, but failed to assess whether Member States abide by them, contrary to the assurances it gave to EDRi in 2012.

Swedish Member of the European Parliament (MEP) Anna Maria Corazza Bildt (EPP Group) recently published her Draft Report on the implementation of the Directive. It contains some positive elements, in particular a call for proportionality. Nevertheless, the draft report contains points that should be improved. For example, its criticism of the Commission and the Member States with regard to the flaws in the implementation of this important instrument is too gentle. It depicts the “darknet” as a challenge, and ignores the benefits it provides to the internet ecosystem. The report also describes encryption as a “new” technology that creates one of the “main challenges” for law enforcement and judicial authorities, while ignoring that encryption is needed to share sensitive information concerning criminal investigations.

The Parliament will continue its work on the report in the autumn. There is still time for it to assume its role and obtain better results for the protection of children and fundamental rights. We hope, in particular, that the Parliament demands a new and better report from the Commission, so that any future legislation will be based on strong evidence, thereby protecting all of the rights at stake in this complex policy area.

EDRi’s position paper on the European Commission’s implementation report on Article 25 of Directive 2011/92/EU (12.07.2017)
https://edri.org/files/children/childexploitationdirective_libedraftreport_edripositionpaper_20170712.pdf

Commission’s report on the implementation of Article 25 of Directive 2011/92/EU of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography (16.12.2016)
http://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1486726102713&uri=CELEX:52016DC0872

European Commission’s response to our letter on the blocking aspect of the Child Exploitation Directive (26.11.2012)
https://edri.org/files/priebe_response.pdf

EDRi’s letter to the Commission on the blocking aspect of the Child Exploitation Directive (02.11.2012)
https://edri.org/files/blocking_20121102.pdf

Directive on combating the sexual abuse and sexual exploitation of children and child pornography 2011/93/EU (13.12.2011)
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32011L0093

(Contribution by Maryant Fernández Pérez, EDRi)

12 Jul 2017

Net Neutrality: BEREC misses opportunity to lead the way

By Epicenter.works and IT-Pol

In November 2015, the European Union adopted the Net Neutrality Regulation (2015/2120), which contained a number of compromises that needed clarification. The Body of European Regulators for Electronic Communications (BEREC) was given the task of developing implementation guidelines to ensure a consistent application of the Regulation throughout Europe and, in practice, settle the remaining ambiguities of the adopted Regulation. After a public consultation to which almost half a million citizens responded and demanded strong net neutrality, BEREC adopted the guidelines in August 2016. The outcome of this process was a legal framework with robust and clear protection for net neutrality, which was applauded all around the world.

The fight for net neutrality in Europe did not, however, end there. The National Regulatory Authorities (NRAs) in the Member States of the European Union (EU) and European Economic Area (EEA) must now enforce the Regulation to ensure that all end users – consumers as well as Content and Application Providers (CAPs) – enjoy the full benefits of net neutrality.

The net neutrality landscape and the challenges faced by NRAs vary greatly throughout the 31 EU/EEA countries, but there are a number of common tasks that all NRAs will have to consider. An example is certification of monitoring systems (software) for detecting possible net neutrality violations and empowering users to find out the actual speed of their internet connection.

Support our work with a one-off donation! https://edri.org/donate/

In March 2017, the BEREC Net Neutrality Working Group invited stakeholders to present their views on measurement methodology and net neutrality supervision tools. At the stakeholder meeting with end-users and CAPs on 14 March 2017, EDRi was invited and represented by EDRi members epicenter.works (Austria) and IT-Pol (Denmark). The message from EDRi to BEREC was clear: the net neutrality measurement system in Europe should be based on open data, open source software and open, peer-reviewed methodologies, to ensure the full transparency and trustworthiness of the measurement results. Measuring internet quality was for a long time an area with just a few, mostly outdated and complicated software solutions. The mandate of NRAs to certify such software is a huge opportunity to combine forces and create a professional measurement system that solves this common problem not only in Europe, but also around the world. By opening up the data pool of measurement results, the regulators would invite independent researchers, consumer protection organisations and civil society to look for potential net neutrality infringements and create a better understanding of the internet in Europe.

Citizens should be encouraged to measure their internet connection as often as possible. This would allow them to test whether they actually get the contractually agreed internet speed, and whether their provider is tinkering with their connection by blocking or throttling the applications they are using. Creating such a measurement system would have the added benefit of respecting European data protection standards, by ensuring informed consent and minimising information that can identify individual users, such as IP addresses.
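
To make this concrete, here is a minimal, purely illustrative sketch in Python of the kind of client-side check a certified measurement tool could perform. The test URL and the advertised speed are assumptions made for the example; they are not part of any existing NRA tool or methodology.

```python
# Illustrative sketch only: measure download throughput against a test server
# and compare it with the speed promised in the subscriber's contract.
# The test URL and the advertised speed are hypothetical placeholders.

import time
import urllib.request

TEST_URL = "https://measurement-server.example/100MB.bin"  # hypothetical test endpoint
ADVERTISED_MBPS = 100.0  # contractually agreed download speed (assumption)

def measure_download_mbps(url: str, max_bytes: int = 25_000_000) -> float:
    """Download up to max_bytes from url and return the achieved throughput in Mbit/s."""
    start = time.monotonic()
    received = 0
    with urllib.request.urlopen(url, timeout=30) as response:
        while received < max_bytes:
            chunk = response.read(64 * 1024)
            if not chunk:
                break
            received += len(chunk)
    elapsed = time.monotonic() - start
    return (received * 8) / (elapsed * 1_000_000)

if __name__ == "__main__":
    mbps = measure_download_mbps(TEST_URL)
    print(f"Measured {mbps:.1f} Mbit/s (advertised: {ADVERTISED_MBPS:.0f} Mbit/s)")
    if mbps < 0.5 * ADVERTISED_MBPS:
        print("Well below the contractual speed - a result worth reporting to the regulator.")
```

A certified tool would of course need repeated measurements over time, cross-traffic detection and careful handling of user data (informed consent, no retention of IP addresses), but the basic client-side logic is no more complicated than this.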

At the stakeholder meeting, EDRi also pointed out the importance of securing funding for independent civil society projects like Respect My Net, where citizens can report cases of net neutrality violations and see reports of possible violations from other end users. Projects like this are necessary because net neutrality violations often go unpunished by the regulators. Besides the comments presented in the meeting, EDRi also submitted a written stakeholder response (pdf) to the BEREC Net Neutrality Working Group.

On 7 June 2017, BEREC presented a draft document on Net Neutrality Regulatory Assessment Methodology for public consultation. This document was published more than a year after the Net Neutrality Regulation went into effect and it is underwhelming, to say the least. EDRi member epicenter.works submitted a consultation response, supported by EDRi members IT-Pol and Access Now, as well as EDRi observer Xnet.

On some technical matters, the BEREC draft document gives a lengthy explanation of basic principles of network measurement. However, on the important points of collaboration between regulators, BEREC takes no position. It proposes no common solution for measurement software, it does not recommend an open source or open data approach that would allow interoperability, and it does not even acknowledge that NRAs should certify any measurement software at all. This indicates that opinions among Member States vary greatly and that there was no easy consensus. However, reaching consensus on a measurement methodology can hardly be more difficult than reaching consensus on net neutrality itself was. It is BEREC’s role to lay out a path for collaboration between regulators. To quote the current Chair of BEREC, Sébastien Soriano:

BEREC [is] expected to be an important part of the process for identification of solutions to problems and would be unsympathetic to those who offered only excuses for inaction.

On other matters, BEREC has chosen a technical solution that turns a blind eye to a problem common to many internet users: congestion. In the draft measurement methodology, BEREC proposes to measure speed only with multiple HTTP connections to a single test server located at a national internet exchange point (IXP). This setup is inadequate for finding possible net neutrality violations, since multiple HTTP connections are less likely to show congestion issues. Also, using a single server means that some Internet Service Providers (ISPs) can easily prioritise the traffic to this server. This type of illegal traffic management has been documented at some ISPs in the past and would undermine all measurement efforts if it were allowed to continue.
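
To illustrate why this matters, here is a simplified sketch under assumed test URLs (placeholders, not BEREC’s or any regulator’s actual methodology): comparing a multi-connection measurement against a single IXP-hosted server with single-connection measurements against servers on other networks makes congestion and selective prioritisation much easier to spot.

```python
# Simplified illustration of the methodological concern: aggregate throughput over
# several parallel HTTP connections to one server can look fine even when single
# flows are congested, and one fixed test server is easy for an ISP to prioritise.
# All server URLs below are hypothetical placeholders.

import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

PER_FLOW_BYTES = 5_000_000  # download up to 5 MB per connection

def fetch_bytes(url: str, max_bytes: int = PER_FLOW_BYTES) -> int:
    """Download up to max_bytes over one HTTP connection; return bytes received."""
    received = 0
    with urllib.request.urlopen(url, timeout=30) as response:
        while received < max_bytes:
            chunk = response.read(64 * 1024)
            if not chunk:
                break
            received += len(chunk)
    return received

def throughput_mbps(urls, flows_per_url=1):
    """Aggregate throughput in Mbit/s over all flows, based on wall-clock time."""
    jobs = [url for url in urls for _ in range(flows_per_url)]
    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=len(jobs)) as pool:
        total_bytes = sum(pool.map(fetch_bytes, jobs))
    return (total_bytes * 8) / ((time.monotonic() - start) * 1_000_000)

if __name__ == "__main__":
    ixp_server = "https://ixp-test-server.example/test.bin"   # single server at a national IXP
    other_servers = ["https://test-a.example/test.bin",       # servers on other networks
                     "https://test-b.example/test.bin"]

    # BEREC's proposed setup: several parallel flows to one server.
    print("4 parallel flows, one IXP server:", throughput_mbps([ixp_server], flows_per_url=4))
    # Checks more likely to expose congestion or selective prioritisation:
    print("Single flow, same server:        ", throughput_mbps([ixp_server]))
    print("Single flows, other networks:    ", throughput_mbps(other_servers))
```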

The draft document from BEREC mentions that NRAs are not required to establish or certify a monitoring mechanism, and that a certified monitoring mechanism will not be available in some Member States. While the Regulation does not formally establish an obligation for NRAs to certify measurement software for end users, there is a clear obligation to closely monitor and ensure compliance with Articles 3 and 4 of the Regulation. This task will be very difficult to accomplish if the NRA cannot receive reliable input from end users due to the lack of a certified monitoring mechanism or software. Without measurement software, the public is blind to potentially illegal traffic management practices, and consumers are stripped of their right to exit contracts where the ISP is not delivering the promised speed. In light of the need to measure the general quality of internet access services (IAS) in Europe, in order to make sure that specialised services do not degrade the quality of the IAS, it is vital to establish a historical data set. This is particularly necessary before the rise of 5G prompts new specialised service experiments by telecom operators.

Rather than simply pointing out that the Regulation does not formally require a certified monitoring mechanism, BEREC should more actively encourage NRAs to co-operate in developing certified measurement methodologies. If all NRAs develop their own software, a lot of work is likely to be duplicated, and there will be no comparability between the different subsidiaries of the big telecom companies in Europe. The smaller Member States would benefit greatly from such cooperation between NRAs, and BEREC is the natural forum for coordinating this activity.

EDRi: Net neutrality wins in Europe! (29.08.2016)
https://edri.org/net-neutrality-wins-europe/

Written response to questions for BEREC stakeholder meeting with representatives of end-users and CAPs (14.03.2017)
https://epicenter.works/document/353

Respect My Net
https://respectmynet.eu/

Draft Net Neutrality Regulatory Assessment Methodology, BEREC (07.06.2017)
http://berec.europa.eu/eng/document_register/subject_matter/berec/public_consultations/7093-draft-net-neutrality-regulatory-assessment-methodology

Consultation response to BEREC on Draft Net Neutrality Regulatory Assessment Methodology, submitted by EDRi member epicenter.works (05.07.2017)
https://epicenter.works/document/546

(Contribution by Thomas Lohninger, EDRi member epicenter.works, Austria and Jesper Lund, EDRi member IT-Pol, Denmark)

12 Jul 2017

Total information awareness for law enforcement, no data protection

By Statewatch

The European Network of Law Enforcement Technology Services (ENLETS), an informal group funded by the European Commission, has produced a report on best practices in mobile solutions for law enforcement practitioners.

Support our work - make a recurrent donation! https://edri.org/supporters/

The report sees developments in mobile technologies, telecoms networks and cloud computing as a “game-changer” for total information awareness for law enforcement authorities. It foresees police smartphones, smartwatches or other devices having instant, 24/7 access to a complete profile on individuals from data gathered and stored locally, nationally or internationally.

This turning point is based, firstly, on the rapidly increasing number of information sources – the report lists, for example, social media, multimedia, the Internet of Things, trackers and tracers, and inter-officer chat – all of which are becoming instantly available to law enforcement officers.

Secondly, the report argues, there is the issue of increasing numbers of sensors and automation. It points out that smartphones have sensors and collect information on individuals – they know who you are, your skills, preferences and tasks, whether you are walking, running or driving, and where you are; they can recognise faces, voices and fingerprints. They are able to combine all that data into context-relevant information and provide you with it at the right time, or in the right location. And all of this without being asked, the report adds.

According to the group, we will soon be at a point where “there will be a symbiotic relationship between a user (policeman) and his wearable devices,” with automatic sensors and systems constantly providing information from a whole host of sources without even being prompted.

We have not yet, however, arrived in the age of total information awareness. A cross-border policing exercise recently conducted by ENLETS showed the limits of current systems, according to the report. These limits are due to the lack of technical tools that enable efficient group messaging, exchange of photos, videos and/or documents or tracking of the important movements of people, goods or vehicles.

Currently, there are numerous centralised EU databases for law enforcement and border control, and various networks of national systems. The ENLETS report notes that in the future, access to security and border related data held in such systems will be through mobile devices, permitting direct, operational use.

A key role in developing systems that allow for simultaneous searches across all EU law enforcement and border control information systems is foreseen for the EU Agency for Large-Scale IT Systems (eu-LISA). The agency could implement and operate a centralised EU system for cross-border and covert operations, but the report acknowledges that the agency’s current mandate limits its ability to develop any such system. An attempt to remedy this is being made with a proposal for a new legal basis for the agency, published by the European Commission. It is designed to help Member States better align national infrastructures with EU systems, including through the setting up of mobile solutions. The Commission is due to publish a further legal proposal on interoperability later in 2017.

The report’s final recommendation to the Law Enforcement Working Party (LEWP), the Council group to which it reports, calls for exploring with eu-LISA the possibility of establishing a European 24/7 centralised infrastructure for bilateral exchange in operations. This would be used for secure mobile communication between Member States to support better cross-border operations, including covert work.

The report notes that the real benefit of mobile policing is standardisation and optimisation of best practices and procedures at the operational level and combating data quality problems. Some countries are now focusing on thorough mobile ID checks, by using smartphones, as a mandatory first step in any process.

Technology is not the only stumbling block to realising this vision of total information awareness. The report does not mention data protection by name even once, although it notes the legality of data processing as one of the key points to address.

Neither is ENLETS’ vision likely to be easy to achieve on an organisational level. The report itself states that implementing mobile solutions in policing on a large scale is a major undertaking.

This article was originally published at http://statewatch.org/news/2017/jul/eu-mobile-policing.htm.

EU wastes no time welcoming prospect of Big Brother databases (15.05.2017)
http://statewatch.org/news/2017/may/eu-hleg-interop-report.htm

EU funding for network developing surveillance, intelligence-gathering and remote vehicle stopping tools (January 2015)
http://database.statewatch.org/article.asp?aid=34440

EU: Police forces get ready for multi-billion euro policing and security funds (June 2014)
http://database.statewatch.org/article.asp?aid=33609

EU: New police cooperation plan includes surveillance, intelligence-gathering and remote vehicle stopping technology (January 2014)
http://database.statewatch.org/article.asp?aid=32661

(Contribution by EDRi member Statewatch)

12 Jul 2017

Dissent in the privacy movement: whistleblowing, art and protest

By Guest author

This is the first blogpost of a series, originally published by EDRi member Bits of Freedom, that explains how the activists of a Berlin-based privacy movement operate, organise, and express dissent. The series is inspired by a thesis by Loes Derks van de Ven, which describes the privacy movement as she encountered it from 2013 to 2015.*

On 29 December 2013, digital activist, technologist, and researcher Jacob Appelbaum closes the year with a talk titled “To Protect and Infect, Part 2” at the 30th edition of the Chaos Communication Congress in Hamburg, Germany. He elaborates on the kind of surveillance activities the United States National Security Agency (NSA) deploys, and reveals, among other things, the existence of a dragnet surveillance system called TURMOIL. The information he shares originates from the set of classified documents that whistleblower Edward Snowden collected while working as an NSA system administrator. In June 2013, Snowden decided to share these documents with the press, explaining that he does not want to live in a world where we have no privacy and no freedom and that the public has the right to know what their government is doing to them and doing on their behalf. Later, at the 2014 Dutch Big Brother Awards, he adds that he considered the NSA’s surveillance programs such a severe violation of human rights that he felt it was his obligation to make the documents public. Snowden’s statements are related to a larger, ongoing public debate about surveillance: how much knowledge about citizens is just and necessary for governments to possess and what actions are legitimate to obtain that information?

Support our work with a one-off donation! https://edri.org/donate/

Four activists surfaced in the wake of the Snowden leaks and quickly took on leading roles in the debate: Jacob Appelbaum, Glenn Greenwald, Sarah Harrison, and Laura Poitras. Although these four individuals have shared beliefs, they do not share a common background. At the time of the first publications, Glenn Greenwald worked as a journalist, Laura Poitras as a documentary filmmaker, Jacob Appelbaum as a technologist, and Sarah Harrison as a journalist and legal researcher for WikiLeaks. Although they are certainly not the only individuals who are relevant to the larger group of activists working on privacy and surveillance issues, their diversity is a reflection of the diversity of the group concerned with these issues.

The privacy movement is incredibly diverse, decentralised, and therefore complicated to define. In spite of this, expressing dissent is one of the key characteristics of the movement. It is where activists find each other and share their ideas with the rest of the world. So what does dissent look like in the privacy movement? There are three different ways in which the privacy movement seems to express dissent, namely through whistleblowing, through art, and through protest. Each contributes to the understanding of the privacy movement as a whole.

First, whistleblowing is interesting because its role is threefold. Besides the fact that whistleblowing is a means for the privacy movement to express dissent, whistleblowers are also a vital source of information to the movement, and they often become activists within the movement themselves. Second, activist art is a way for the privacy movement to communicate its ideas and goals to members of the movement as well as to the wider public. Although there is only a small group of activists involved in the process of creating the art, it does affect the movement in its entirety. Last, the privacy movement also expresses dissent through protest. This is done both through traditional types of protest, such as street demonstrations, and through protest forms that can only exist online, for example the development, promotion, and use of tools that provide more anonymity for internet users.

Although dissent is an element that characterises the privacy movement, it is certainly not the only one. The untraditional role of leadership within the movement and the physical meeting place in Berlin also contribute to the unique character of the movement.

In the upcoming articles in this series, we will explore whistleblowing, art, and protest as expressions of dissent in more depth.

The series was originally published by EDRi member Bits of Freedom at https://www.bof.nl/tag/meeting-the-privacy-movement/

(Contribution by Loes Derks van de Ven)

* This research was finalised in 2015 and does not take into account the changes within the movement that have occurred since then.

11 Jul 2017

Latest copyright votes: Filtering, blocking & half-baked compromises

By EDRi

On 11 July, two Committees in the European Parliament voted on their Opinions on the European Commission’s proposal for a Copyright Directive: the Committee on Culture and Education (CULT) and the Committee on Industry, Research and Energy (ITRE).

CULT decided to abandon all reason and propose measures that contradict existing law on monitoring of online content. They also contradict clear rulings from the highest court in the EU on internet filtering. And for the sake of being consistently bad, the Committee also supported ancillary copyright, a “link tax” that would make linking and quotation almost impossible on social media.

ITRE made a brave effort to fix the unfixable “censorship machine”, the upload filter proposed by the Commission. On the one hand, this demonstrates a willingness in the Parliament to resist the fundamentalism of the Commission’s proposal. On the other, it shows how impossible this task really is. Despite deleting the reference to “content recognition technologies”, ITRE has decided to keep the possibility of measures to prevent the availability of copyrighted works or “other subject matter” which may or may not be understood as supporting preventive filtering.

In its Opinion, the CULT Committee proposes measures that would attack both European businesses and citizens. The “compromise amendments” to which CULT has agreed made the bad Copyright Directive proposal of the European Commission even worse. Under these “compromise amendments”, it would no longer be possible to store music recordings, video files or any other copyrighted content on European cloud storage services, even when the content has been legally acquired. European cloud services would have to install filters to either block uploads, or to pay “fair” licenses for any copyrighted material that was uploaded.

While imposing filters for copyright purposes, CULT decided in April 2017 to adopt an amendment to the Audiovisual Media Services Directive (AVMSD) prohibiting the use of the exact same method, upload filtering, for restricting hate speech and terrorist content. It is unclear why CULT thinks that upload filtering is ineffective and disproportionate for terrorism, but effective and proportionate for copyright.

Regarding the proposal for ancillary copyright (“link-tax”), included in the Article 11 of the Copyright Directive proposal, views differ in these two Committees.

In ITRE, the Rapporteur initially tried hard to make sense of the original text of the article, but the final Opinion lost much of that initial motivation. First of all, the Committee missed the chance to support the amendments that called for the deletion of Article 11 in its entirety, which was the only reasonable option. There have already been two failed experiments with the “link tax”, in Germany and Spain. Europe should not repeat the same mistakes at EU level. As a result, this proposal should not have stayed in the final text of the Opinion, and should be left out of the final position of the Parliament, which will be voted on in October. The amendments adopted by the Committee broadened the scope to non-digital publications, worsening the original proposal and missing the opportunity to agree on the insufficient, but somewhat less harmful, compromise amendments under which the new “right” would have been replaced by a “presumption of transfer or license by authors to publishers”.

CULT also passed an amendment calling for the removal of the word “digital” from “digital use of their press publications”, broadening the scope of the Commission’s proposal. In a politically driven attempt to escape public criticism, CULT proposed to exempt non-commercial use of press publications by individual users from the “link tax”. However, as people use commercial networks for sharing press snippets, this amendment would have zero impact in the real world. The Commission put an absurdly high 20-year protection limit in its proposal to allow the other institutions to find a “compromise” lower limit. CULT’s amendments lower the protection under ancillary copyright from twenty years to eight years. They also include additional text on a “fair share of the revenue generated going to journalists”, without any indication of how this could be achieved or calculated.

The main purpose of the Opinions of these two Committees is to provide specialised expertise to the Committee in charge of the Copyright Directive in the European Parliament, the Committee on Legal Affairs (JURI). The Directive was previously under the leadership of Maltese Parliamentarian Therese Comodini Cachia, who spent untold hours consulting with all stakeholders and producing a balanced draft report. Since she left the European Parliament, she has been replaced with Axel Voss. Within days, he abandoned the diligently prepared approach of Ms Comodini Cachia and is now pushing for a one-sided, restrictive Directive, which stands little chance of surviving proper legal scrutiny. To push the German conservative perspective as the approach to be taken by the European People’s Party, he has even prepared a guide on the copyright Directive.

In September 2017, EU Member States and JURI will have the final discussions before the vote in JURI on 10 October. This will be a crucial moment and the final opportunity for civil society to make sure that citizens’ point of view is considered before the next stage in the legislative process.

This article is also available in German at https://netzpolitik.org/2017/filtern-sperren-halbgare-kompromisse-erste-eu-ausschuesse-haben-ueber-urheberrecht-abgestimmt/.

Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market (14.09.2016)
http://ec.europa.eu/transparency/regdoc/rep/1/2016/EN/1-2016-593-EN-F1-1.PDF

No, you can’t enjoy the music you paid for, says EU Parliament Committee (05.07.2017)
https://edri.org/no-you-cant-enjoy-the-music-you-paid-for-says-eu-parliament/

05 Jul 2017

We are looking for a policy intern to join our team of superheroes!

By Kirsten Fiedler

European Digital Rights (EDRi) is an international not-for-profit association of 35 digital human rights organisations from across Europe. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, freedom of expression, and access to information.

Join EDRi now and become a superhero for the defence of our rights and freedoms online!

The EDRi office in Brussels is currently looking for an intern to support our policy team. This is your opportunity to get first-hand experience in EU policy-making and contribute to a change in favour of digital rights and freedoms across Europe. The internship will run from 4 September 2017 to 28 February 2018, and is paid 750 EUR per month.

Key tasks:

  • Research and analysis on a range of policy topics, in particular in the fields of telecommunications, net neutrality, intermediary liability and freedom of expression, encryption, cross-border access to e-evidence and digital trade;
  • Monitoring international, EU and national related policy developments;
  • Organising and participating in meetings and events;
  • Assisting with writing of the EDRi-gram newsletter;
  • Assisting with preparing draft reports, presentations and other internal and external documents;
  • Assisting with preparing communication tasks;
  • Development of public education materials;
  • Find out more about internships at EDRi.

Qualifications:

  • A demonstrated interest in and enthusiasm for human rights and technology-related legal issues;
  • Very good understanding of European decision-making;
  • Experience in the fields of telecommunications, net neutrality, intermediary liability and freedom of expression, encryption, cross-border access to e-evidence or digital trade would be an asset;
  • Excellent research and writing skills;
  • Fluent command of spoken and written English;
  • Computer literacy.

How to apply:

To apply please send a maximum one page cover letter and a maximum two page CV in English and as pdf (other formats – such as doc and docx – will not be accepted) to julien.bencze(at)edri.org.

The closing date for applications is 14 July. The interviews and written assignments will take place between 24 and 28 July. Please note that, due to scarce resources, only shortlisted candidates will be contacted.

05 Jul 2017

No, you can’t enjoy the music you paid for, says EU Parliament Committee

By Joe McNamee

A leaked European Parliament document exposes some of the most bizarre suggestions yet in the debates around the proposed new copyright rules in Europe.

The proposal for the Copyright Directive is currently being debated in various European Parliament Committees. The leaked document shows that conservative, socialist and Green Members of the European Parliament (MEPs) in the Committee on Culture and Education (CULT) have agreed to “compromise amendments”* that would ban legal uses of legally acquired copyrighted material. They would also require the implementation of upload filters, which would remove uploads identified as copyrighted material – memes, for example. The vote takes place on 11 July.

The European Commission put together the proposal for the new Copyright Directive without paying attention to the feedback from citizens. The proposal fails to make any effort to tackle the main problems of the current copyright framework. Faced with this failure, the CULT Committee has managed to make a bad proposal significantly worse.

Under the CULT “compromise amendments”, it would no longer be possible to store legally acquired music recordings, video files or any other copyrighted content on European cloud storage services. This is despite the fact that Europeans already pay hundreds of millions every year in levies (3.2 billion euro in the first half of this decade) to compensate rightsholders for making copies of legally obtained copyrighted works. Nevertheless, European cloud services would have to install filters to either block uploads or pay “fair” licenses for any copyrighted material that was uploaded. (Non-European services, on the contrary, would have nothing to worry about.)

Even more bizarrely, the CULT Committee appears to have decided that the fight against private copies of legally obtained copyrighted material is more serious than the fight against terrorism. In April 2017, it adopted an amendment to the Audiovisual Media Services Directive (AVMSD) to prohibit the use of upload filtering as a method of restricting hate speech and terrorist content, even “for the most harmful content”. The new compromise amendment to the Copyright Directive would require filtering of any uploads to the internet in Europe just for the purpose of protecting copyright, including protecting copyright holders from people using legally obtained, fully authorised content.

If it adopts this amendment, the Committee will seriously contradict itself, and give a distorted image of the values it supports. It would explicitly support an illegal, ineffective measure (upload filtering) for the benefit of the copyright industry that it has rejected as a proportionate measure to protect citizens “from the most harmful content” (Amendment 75 to the AVMS Directive). A true values gap.

The good news is that the text that will be adopted is not just morally unacceptable, it is logically and legally incompetent. Among the most absurd suggestions are:

  1. The Members of the European Parliament (MEPs) have agreed an explanatory note (“recital”), which states that the filters should not process any personal information, but also that there should be an appeals mechanism. Logically, if providers store no information about who uploaded the content they have removed, they have no way of allowing those unknown people to appeal.
  2. The Court of Justice of the European Union (CJEU) has previously ruled that existing EU law prohibits the installation of such filtering systems. The filtering system in that case (Netlog/Sabam, C-360/10) was of the same type as the one currently being proposed. What is being proposed is manifestly and obviously illegal, according to the case law of the European Union’s highest court.
  3. The proposed text literally says that allowing users to store data goes beyond “the mere provision of physical facilities” and is “performing an act of communication to the public”.

Ironically, one of the reasons why the CJEU ruled against filtering was the risk of legal material being deleted by accident. Rather than addressing that point, CULT goes even further, and suggests restricting even more legal activities.

In the Audiovisual Media Services Directive, the following MEPs voted for the final text, opposing upload filters for the “most harmful content”, such as terrorism and incitement to violence:

  • EPP: Andrea Boct Erdős (Hungary), Svetoslav Malinov (Bulgaria), Stefano Maullu (Italy), Michaela Šojdrová (Czech Republic), Sabine Verheyen (Germany), Theodoros Zagorakis (Greece), Bogdan Andrzej Zdrojewski (Poland), Milan Zver (Slovenia)
  • S&D: Elena Gentile (Italy), Giorgos Grammatikakis (Greece), Petra Kammerevert (Germany), Dietmar Köster (Germany), Krystyna Łybacka (Poland), Momchil Nekov (Bulgaria)
  • Greens: Jill Evans (UK), Helga Trüpel (Germany)

It will be interesting to see how many of these MEPs vote in favour of upload filters for copyright.

A full list of MEPs on the CULT Committee can be found here:
http://www.europarl.europa.eu/committees/en/cult/members.html

CULT Compromise Amendments (Article 13)
https://edri.org/files/copyright/CULT_COMPArt_13-V3.pdf

CULT Compromise Amendments (recitals linked to Article 13)
https://edri.org/files/copyright/CULT_COMPArt_13_recitals-V3.pdf

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

Copyfails: Time to #fixcopyright!
https://edri.org/copyfails/

Proposed Copyright Directive – Commissioner confirms it is illegal (28.06.2017)
https://edri.org/proposed-copyright-directive-commissioner-confirms-it-is-illegal/

EU Copyright Directive – privatised censorship and filtering of free speech (10.11.2016)
https://edri.org/eu-copyright-directive-privatised-censorship-and-filtering-of-free-speech/

Deconstructing Article 13 of the Copyright proposal of the European Commission
https://edri.org/files/copyright/copyright_proposal_article13.pdf

Video: Why is the EU Commission pushing for illegal copyright filtering?
https://youtu.be/zROqxBFqe_k

* Europe’s only democratically elected institution prefers to negotiate such agreements behind closed doors and not publish them until immediately before the vote. We cannot be certain, therefore, that this is the definitive version, even though this is what we have been led to believe.

29 Jun 2017

Germany: Will 30 June be the day populism killed free speech?

By Maryant Fernández Pérez

On 30 June 2017, the German Parliament will vote on the bill on “Enforcement on Social Networks”, also known as the “NetzDG”.

This draft law, if adopted, could seriously impair human rights online, including freedom of expression and opinion. That is why we tirelessly explored different ways to make sure the European Union (EU) would take action against it. We joined forces with other civil society and industry associations and submitted formal comments to the Commission. EDRi and its members were not the only organisations to raise our voices against the NetzDG. The United Nations Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression David Kaye, other civil society organisations, industry, academia and even the legal services of the lower house of the German Parliament, the Bundestag, issued comments against the proposal in Germany.

On 19 June 2017, there was a parliamentary expert hearing at the German Parliament. Out of ten experts consulted, eight were very critical of the draft law. Five of them said that it was incompatible with German constitutional law. On the very same day, however, European Commission Vice-President Andrus Ansip stated in Brussels that Germany had notified the Commission that it had made some corrections where there were legal questions. Therefore, the Commission was not worried, trusted that the implementation would be perfect, and assumed that the authorities would not misuse the legal provisions. He said that if other Member States were to decide to adopt such an approach, and if some paragraphs went against EU law, then the Commission would intervene.

The version of the draft law that will be put to the vote on 30 June is somewhat different from what was originally proposed. However, it appears that the outcome will be no less damaging. For example, the mentions of content filters have been removed. Nevertheless, the draft law now asks for the creation of a new “self-regulatory” authority that will be paid for by companies. Under the law, social media companies would refer content to this “authority” if they are not able to determine whether it is illegal or not. The text is very vague on how this privately sponsored “authority” would function. It is unclear about its transparency, its oversight, its accountability, which cases could be submitted to it, and which criteria would be used to check whether content is illegal or not. In practice, the draft will keep promoting “voluntary” measures by private companies to take down content, with liability rules that incentivise such restrictions without making them obligatory. This means that the results will be similar to those of the original proposal, just more difficult to challenge in court. In the current version, upload and content filters would not be mandatory, but, mandatory or not, they are likely to be applied by big companies like Facebook. These companies are, quite rationally, driven by the motivation to avoid liability using the cheapest options available, and to exploit the political legitimisation of their restrictive measures for profit. This can only lead to privatised, unpredictable online censorship.

In other words, it seems that politicians are determined to continue ramping up populist demands for profit-motivated companies to become the executive, legislative and judicial powers of the internet. We are disappointed that the European Commission did not issue a legal opinion asking Germany to drop the draft law.

The European Commission had a legal duty to take action to protect citizens’ interests. It failed to do so. The Commission has promised not only to respect the Charter of Fundamental Rights of the European Union, but also to neither “seek nor take instructions” from Member States. It is a pity that the European Commission appears to have chosen politics over legal obligations. We can only assume that it is a coincidence that the Commission failed to act against an illegal proposal from the largest EU Member State, while saying that it might take action against other Member States that would propose similar measures.

We urge German Members of the Parliament to vote against the draft law on 30 June. This date should not be remembered as the day that populism killed the freedom of expression. The impact of this law is not confined to Germany. Unfortunately, other countries, such as the United Kingdom and France, are replicating this approach.

The European Commission has failed to respect its legal obligations. The German Parliament can still deliver.

EU action needed: German NetzDG draft threatens freedom of expression (23.05.2017)
https://edri.org/eu-action-needed-german-netzdg-draft-threatens-freedomofexpression/

28 Jun 2017

Proposed Copyright Directive – Commissioner confirms it is illegal

By Joe McNamee

At a meeting of the European Parliament Committee on Legal Affairs (JURI) on 19 June, European Commission Vice-President Andrus Ansip made a statement that was both shocking and shockingly honest. He advertised the content filtering product of the US company Audible Magic as an affordable alternative to Google’s Content ID filtering technology for filtering European citizens’ uploads to the internet.

He explained, entirely incorrectly, that the cost of “Audible Magic” – for an internet hosting company of unspecified size and unspecified activities – was “only 400 bucks”. This is not only a tiny fraction of the actual cost, but also a fraction of the amount in the Commission’s own impact assessment, 900 Euro. That, in turn, is only a fraction of the services fees listed on the website of “Audible Magic”, which is only a fraction of the real costs shown by a study on shortcomings of content detection tools.

Support our work - make a recurrent donation! https://edri.org/supporters/

Commissioner Ansip, also entirely incorrectly, gave the impression that the obligation in the Copyright Directive to filter content only applies to audio or audiovisual content. In reality, the obligation to filter all uploads covers everything that can be protected by copyright – including content such as text and pictures, and other protected works.

The confirmation of the real meaning of the Copyright Directive contradicts much of the work of copyright lobbyists, who have been “explaining” to the European Parliament that the Commission proposal does not require filtering or monitoring of citizens’ communications.

However, the most important point is that Ansip explicitly admitted that Audible Magic would enable the providers to fulfil their obligations under Article 13 of the proposed Directive. In the Scarlet/Sabam case, the Court of Justice of the European Union (CJEU) explicitly prohibited a legal requirement for internet access providers to use Audible Magic, on the basis that it would be a breach of the right to privacy, of the freedom to conduct a business, and of the freedom to receive and impart information. The Court worried in particular about the reduction of the right to use copyright exceptions and limitations.

In the Netlog/Sabam case, the CJEU again looked at mandatory filtering, this time for a hosting service (a social network). It again ruled that this kind of obligation for filtering uploads was not acceptable, with reasoning that was broadly identical to the Scarlet/Sabam case.

The question now is, in whose benefit is it to have a Directive that is demonstrably illegal? It hardly helps rightsholders to have a Directive that won’t survive court scrutiny, and it certainly does not help citizens. The only benefit might be for the Commission’s ambition to coerce internet companies into “voluntary” privatised law enforcement measures – legal uncertainty is a blunt but effective tool to make the providers “an offer they can’t refuse”.

Video: Why is the EU Commission pushing for illegal copyright filtering?
https://youtu.be/zROqxBFqe_k

There are some things money can’t buy. For everything else, there’s Audible Magic! (21.06.2017)
http://copybuzz.com/analysis/things-money-cant-buy/

Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market (14.09.2016)
http://ec.europa.eu/transparency/regdoc/rep/1/2016/EN/1-2016-593-EN-F1-1.PDF

Deconstructing the Article 13 of the Copyright proposal of the European Commission, revision 2
https://edri.org/files/copyright/copyright_proposal_article13.pdf

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

Scarlet v SABAM: a win for fundamental rights and Internet freedoms (30.11.2011)
https://edri.org/edrigramnumber9-23scarlet-sabam-win-fundamental-rights/

SABAM vs Netlog – another important ruling for fundamental rights (12.02.2012)
https://edri.org/sabam_netlog_win/

(Contribution by Joe McNamee, EDRi)

28 Jun 2017

An end to copyright blackmail letters in Finland?

By Heini Järvinen

On 12 June, the Finnish Market Court ruled in the case Copyright Management Services Ltd vs. DNA Oyj that Internet Service Providers (ISPs) are not obliged to hand over the personal data of their clients based only on the suspicion of limited use of peer-to-peer networks. Stronger proof of significant copyright infringement needs to be presented in order to obtain the data.

Support our work with a one-off donation! https://edri.org/donate/

Law firms have been sending letters demanding payments as damages for the distribution of copyright-protected content, and threatening the people suspected of copyright infringement with legal proceedings. The ruling will put an end to this practice.

The Finnish Market Court has previously interpreted even the distribution of minor amounts of data in peer-to-peer networks as a “significant copyright infringement”. However, thanks to the case law of the Court of Justice of the European Union (CJEU), the court has now changed its interpretation. The CJEU has emphasised in its recent rulings that when evaluating the significance of the infringement, the concrete harm caused by the distribution done through a single IP address has to be taken into account.

The compensation claim brought to the court was based on approximately a thousand observations of cases in which films had been made available in a BitTorrent peer-to-peer network. The court did not consider these cases to constitute a “significant amount”, because it was not possible to draw conclusions on the repetitiveness, the duration, the number of distributed works, or the concrete impact on other peer-to-peer users.

The seven judges decided unanimously not to oblige the ISP to hand over its clients’ personal data. Another important aspect of the decision was that the burden of proof for a “significant copyright infringement” was placed on the plaintiff, not the defendant.

On the other hand, on 14 June 2017, the Market Court gave its decision in the case Copyright Management Services Ltd vs. Elisa Oyj, concerning another Finnish ISP. The court stated in its decision that the ISP is obliged to retain its clients’ data for the purpose of releasing it later. The decision, however, emphasised that the purpose of retaining the data is not to grant the plaintiff access to it, but to avoid the loss of the data until its possible release. This requirement to store consumer data is hard to reconcile with two Court of Justice of the EU rulings prohibiting suspicionless retention of communications data (the Digital Rights Ireland case and the Tele2 ruling) and one explaining the requirement to have a specific law when imposing restrictions such as data retention (the Bonnier Audio case).

Finnish Parliament argued over the copyright initiative (21.05.2014)
https://edri.org/finnish-parliament-argued-over-the-copyright-initiative/

Finland: Common Sense in Copyright Law (24.04.2013)
https://edri.org/edrigramnumber11-8finland-copyright-blackout/

Finnish Big Brother Award goes to intrusive loyalty card programme (07.09.2016)
https://edri.org/finnish-big-brother-award-goes-intrusive-loyalty-card-programme/

Copyright letters facing headwinds – Market Court changed its line (only in Finnish, 12.06.2017)
https://www.turre.com/markkinaoikeus-muutti-linjaansa-tekijanoikeuskirjeista/

Farewell to the blackmail letters? Market Court decision makes it more difficult to claim compensation from peer to peer users (only in Finnish, 15.06.2017)
http://www.hs.fi/talous/art-2000005256360.html

Lawyers are sending blackmail letters to ask for compensation for downloading TV series and movies – “It’s useless to ask a lawyer about moral” (only in Finnish, 19.01.2017)
http://www.hs.fi/talous/art-2000005052577.html

(Contribution by Heini Järvinen, EDRi)
