21 Sep 2017

Leaked document: Does the EU Commission really plan to tackle illegal content online?

By Joe McNamee

On 14 September, Politico published a leaked draft of the European Commission’s Communication “Tackling Illegal Content Online”. The Communication contains “guidelines” to tackle illegal content, while remaining coy in key areas. It is expected to be officially published on 27 September.

Introduction

The European Commission’s approach builds on the increasingly loud political voices demanding that somebody do something about bad things online, while carefully avoiding any responsibility for governments to do anything concrete. Indeed, not only does the introduction barely mention the role of states in fighting criminal activity online, it describes (big) platforms as having a “central role” in society. States can investigate, prosecute and punish crimes, online and offline; internet companies cannot – and should not.

Investigation and prosecution

In line with existing practice, the focus is on headline-grabbing actions by internet companies as THE solution to various forms of criminality online.

Under the Commission-initiated industry “code of conduct” on hate speech, the law is relegated to second place behind companies’ terms of service. The Communication continues in the same vein.

The scale of the lack of interest in investigating and prosecuting the individuals behind the uploads of terrorist material – or even in whether this material is actually illegal – is demonstrated by a recent response to a parliamentary question, in which the Commission confirmed that the EU Internet Referral Unit “does not keep any statistics of how many of the referrals to Member States led to the opening of an investigation.” Instead, the draft Communication lists some statistics about the speed of removal of possibly illegal content, without a real review mechanism, measures to identify and rectify counterproductive effects on crime-fighting, recommendations on counter-notice systems or any attempt to support real, accountable transparency.

Duty of care – intermediaries

The draft Communication asserts, with no references or explanation, that platforms have a “duty of care”. It is difficult to work out if the Commission is seeking to assert that a legal “duty of care” exists. Such duties are mentioned in recital 48 of the E-Commerce Directive. However, correspondence (pdf) between the Commission and former Member of the European Parliament (MEP) Charlotte Cederschiöld (EPP) at the time of adoption of the Directive proves conclusively that no such “duties” exist in EU law, beyond the obligations in the articles of the E-Commerce Directive.

Duty of care – national authorities

The draft Communication suggests no duty of diligence for national authorities regarding review processes, record-keeping, the assessment of counterproductive or anti-competitive effects, over-deletion of content, complaints mechanisms for over-deletion, or the investigation and prosecution of the serious crimes behind, for example, child abuse material. Apparently, the crimes in question are not serious enough for Member States to have a duty of care of their own. Instead, they hide behind newspapers’ headlines. The German Justice Ministry indicated, for example, that it had no idea at all about the illegality of 100 000 posts deleted by Facebook nor, if they were illegal, whether any of the posts had been investigated (pdf).

Protecting legal speech, but how?

The draft Communication puts the emphasis on asking companies to proactively search for potentially illegal content, “strongly encouraging” “voluntary”, non-judicial measures for removal of content, and encouraging systems of “trusted flaggers” to report and remove allegedly illegal content more easily. While the European Commission makes reference to the need for adequate safeguards “adapted to the specific type of illegal content concerned”, it fails to suggest any protection or compensation for individuals in cases of removal of legal content, besides a right of appeal or measures against bad-faith notices. The leaked Communication also fails to contemplate any measures to protect challenging speech of the kind the European Court of Human Rights insisted must be protected.

Regulation by algorithm

It is very worrisome that the Commission is encouraging and funding automatic detection technology, particularly when at the same time it recognises that “one-size-fits-all rules on acceptable removal times are unlikely to capture the diversity of contexts in which removals are processed”. It is also worrisome that the leaked Communication claims that “voluntary, proactive measures [do] not automatically lead to the online platform concerned playing an active role”. This means that the Commission believes that actively searching for illegal content does not imply knowledge of any illegal content that exists. Ironically, in the Copyright Directive, the Commission’s position is that any optimisation whatsoever of content (such as automatic indexing) does imply knowledge of the specific copyright status of the content. With regard to automatic detection of possible infringements, the Commission recognises human intervention as “best industry practice”. It refers to human intervention as “important”, without actually recommending it, despite acknowledging that “error rates are high” in relation to some content.
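
To see why “error rates are high”, consider a minimal sketch of a naive matching filter – entirely our illustration, as the Communication describes no specific technology: such a filter cannot distinguish illegal material from lawful reporting that quotes it.

    # A naive automatic filter (illustrative only): flag any upload whose
    # text contains a blocklisted phrase. Blocklist and posts are invented.
    BLOCKLIST = {"example propaganda slogan"}

    def auto_flag(post: str) -> bool:
        """Return True if the post matches any blocklisted phrase."""
        text = post.lower()
        return any(phrase in text for phrase in BLOCKLIST)

    upload = "Example propaganda slogan - join us today"
    report = 'A court today quoted the "example propaganda slogan" in its ruling.'

    print(auto_flag(upload))  # True - the intended match
    print(auto_flag(report))  # True - false positive: lawful news reporting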

In addition, astonishingly, the draft Communication suggests that we need to avoid making undue efforts to make sure that the (possibly automatic) removals demanded by these non-judicial authorities are correct: “A reasonable balance needs to be struck between ensuring a high quality of notices coming from the trusted flaggers and avoiding excessive levels of administrative burden”, the leaked Communication says.

Points worth keeping in the final draft

To be fair, the draft Communication also contains some positive points: It is welcome that the Commission recognises that…

  • big online platforms are not the only actors that are important;
  • “the fight against illegal content online must be carried out with proper and robust safeguards balancing all fundamental rights … such as freedom of expression and due process” – even if the draft Communication doesn’t mention who should provide them, what they are or to whom they should be available;
  • “a coherent approach to removing illegal content does not exist at present in the EU”;
  • the “nature, characteristics and harm” of illegal content are very diverse, leading to “different fundamental rights implications”, and that sector-specific solutions should be pursued, where appropriate;
  • harmful content “is – in general – not illegal”;
  • “the approach to identifying and notifying illegal content should be subject to judicial scrutiny”;
  • the possibility of investigation should be facilitated – even if it omits to mention any obligations of transparency with regard to whether or how often investigations take place;
  • the role of “trusted flaggers” should comply with certain criteria – even if the draft Communication does not mention what those would be – and that they should be auditable and accountable, and that abuses must be terminated;
  • notices must be “sufficiently precise and adequately substantiated”;
  • content-dependent exceptions are foreseen for automatic stay-down procedures, even if the Commission makes unsubstantiated and at least partly false assertions about the effectiveness of such measures;
  • transparency reports are encouraged, even though nothing in the draft would resolve the total failure of transparency evident in the implementation of the hate speech “code of conduct”;
  • counter-notice procedures are important, and it therefore encourages them;
  • filtering technologies have limitations, even if they are not mentioned in all relevant parts of the draft and even if their damaging impact on freedom of expression is not duly addressed.

We can only hope that these important elements remain in the final draft. We participated in expert meetings where we provided a suggested way forward. The Commission knows what is needed if it is to respect its obligations under the Charter of Fundamental Rights of the European Union, and if it is to avoid the recklessness demonstrated by the lack of a review mechanism for, for example, the Internet Referral Units. We will find out if the Commission has the courage to deliver. Further improvements are urgently needed before the final version is published next week.

20 Sep 2017

Human Rights Court sets limits on right to monitor employees

By Anne-Morgane Devriendt

On 5 September 2017, the Grand Chamber of the European Court of Human Rights (ECtHR) ruled on the Bărbulescu v. Romania case. It found a breach of the right to respect for private and family life and correspondence (Article 8 of the European Convention on Human Rights), as claimed by Mr Bărbulescu. Mr Bărbulescu was fired after his employer monitored his communications and found that he had used company property to exchange messages with family members. Although the ruling does not forbid employee monitoring, it clarifies how monitoring can be done while respecting fundamental rights.

The Grand Chamber questioned the earlier national court decisions. It noted that national courts did not properly assess whether Mr Bărbulescu had been warned that he might be monitored, and to what extent he would be monitored. The Court also clarified the limits regarding legal monitoring of an employee by their employer and the ways national courts should assess them.

First, one of the key aspects that the Court pointed out was the lack of information given to Mr Bărbulescu on the monitoring to which he might be subject. Second, the Court ruled that, in addition to the obligation of providing information, monitoring of employees always needs to be done for a legitimate aim, and in a way that is proportionate to that aim and that does not breach their privacy more than necessary to achieve the goal. None of these safeguards had been followed in this case, as the Court pointed out in paragraph 140 of its ruling: “the domestic courts failed to determine, in particular, whether the applicant had received prior notice from his employer of the possibility that his communications on Yahoo Messenger might be monitored; nor did they have regard either to the fact that he had not been informed of the nature or the extent of the monitoring, or to the degree of intrusion into his private life and correspondence. In addition, they failed to determine, firstly, the specific reasons justifying the introduction of the monitoring measures; secondly, whether the employer could have used measures entailing less intrusion into the applicant’s private life and correspondence; and thirdly, whether the communications might have been accessed without his knowledge”.

It needs to be stressed that the ruling does not find monitoring of employees’ communications illegal in all situations, but that the power to monitor employees is limited. The judgement restricts employers’ monitoring of employees’ communications by requiring limits on its scope and degree of intrusion, a legitimate justification, and proportionality. All of these safeguards should have been applied in this case, and must be applied in any similar case in the future. The Court clarified that an employee continues to enjoy the right to private and family life in the workplace.

Press release for the Grand Chamber judgement (05.09.2017)
http://hudoc.echr.coe.int/eng?i=003-5825428-7419362

Romanian whose messages were read by employer “had privacy breached” (05.09.2017)
https://www.theguardian.com/law/2017/sep/05/romanian-chat-messages-read-by-employer-had-privacy-breached-court-rules

Privacy International response to Grand Chamber of the European Court for Human Rights Bărbulescu v. Romania judgement (05.09.2017)
https://medium.com/@privacyint/privacy-international-response-to-grand-chamber-of-the-european-court-for-human-rights-barbulescu-v-cc722b73086b

(Contribution by Anne-Morgane Devriendt, EDRi intern)

20 Sep 2017

Dutch digital investigation: Pushing the boundaries of legality

By Bits of Freedom

A Dutch court is currently considering the case against Naoufal F., in which the police made use of several advanced digital investigation methods that challenge the boundaries of the law.

A key issue in the case is the way in which the police gained access to and analysed the secure communication of suspects. Inez Weski, the lawyer of multiple suspects in this case, claims that in the process, the police violated so many rules that the trial must be stopped. The judge is of a different opinion. However, that doesn’t mean this will be the end of the discussion.

As part of the investigation into (failed) assassinations, the police discovered that individuals under investigation were using the PGP-phones and communication network of the Dutch company Ennetcom. One of them was Naoufal F. The communication was secured with PGP, a commonly used encryption method. To read this encrypted information, the police either needed to have the key, or to bypass the encryption, for example by hacking.

The police confiscated one of Ennetcom’s Canadian servers and made a copy of the data on it. Besides the encrypted information, the server apparently contained the PGP-keys with which the encrypted communication was secured. The communication could thus be decrypted and read. The police also succeeded in reading the communications that were stored on the PGP-phones.
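
Why possession of the server-side key material was decisive can be sketched in a few lines. This is a purely illustrative example using the python-gnupg wrapper (it requires GnuPG installed; the article does not say which tools the police actually used):

    # Illustrative sketch: whoever holds the private PGP key material can
    # turn stored ciphertext back into readable text - no codebreaking needed.
    import tempfile
    import gnupg

    gpg = gnupg.GPG(gnupghome=tempfile.mkdtemp())

    # Stand-in for a user's keypair. On the Ennetcom network, the private
    # key material was apparently stored on the company's server as well.
    key_input = gpg.gen_key_input(name_email="user@example.com",
                                  passphrase="correct horse")
    key = gpg.gen_key(key_input)

    # A message encrypted to that user, as it might sit on a seized server.
    ciphertext = str(gpg.encrypt("meet at the harbour", key.fingerprint))

    # With a copy of the keyring (e.g. from the confiscated server),
    # decryption is straightforward.
    plaintext = gpg.decrypt(ciphertext, passphrase="correct horse")
    print(plaintext.ok, str(plaintext))   # True meet at the harbour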

The police then used Hansken, a forensic search engine developed by the Dutch Forensic Institute (NFI), to search all the gathered information.

The question is whether the police were allowed to do this. First, Weski compares the confiscation of the server with casting a dragnet over a communications network. According to Weski, a great deal of communication of unsuspected persons has been unjustly gathered. She claims Ennetcom offered a perfectly legal service that, although it might also have been used by criminals, was used by companies, governments and innocent citizens. Therefore, the server should never have been confiscated in this manner. A striking detail is that, according to Weski, the PGP-keys were not on the server at all, but were stored by another company. If that turns out to be true, it might change the case significantly.

In addition, Weski believes the Hansken forensic search engine used in the investigation is, in itself, an “extralegal” investigation tool that should not have been used – extralegal meaning that there is no definition of such an investigation tool in law. There are occasions when the use of an extralegal tool is allowed: for example, if there is no major violation of the rights of a suspect, and if the use does not pose a risk to the integrity and manageability of the investigation. However, Weski believes Hansken does not meet these requirements.

Weski wanted the trial stopped due to grave errors and problems in the investigation, and asked for the case to be ruled inadmissible. The police obviously disagree, as does the Public Prosecution Service. The judge ruled that the trial can continue.

This case shows yet again that the police increasingly and more easily gain access to large amounts of information – first of all because there is simply more data available. Secondly, by using more advanced analysis techniques, more and more information can be extracted from the available data. Information that in itself might seem unimportant can become valuable when combined with other information. This results in more intrusive analysis of personal data.

It also causes the nature of the police’s work to change fundamentally, because the emphasis lies even more on automated data processing. In this case, both components come together: there is a server available with a huge amount of information, and an advanced analysis tool is available for searching that data.

The Dutch Code of Criminal Procedure is no longer aligned with digital developments. The Ministry of Security and Justice has launched a concept proposal that is meant to address the “new” challenges of digitalisation and to replace the current Code.

However, the concept proposal also falls short in providing answers to the problems that surface in the case against Naoufal F. For instance, better oversight of digital investigation is needed. It is also necessary to re-think the gathering of large datasets that include data of innocent citizens. Finally, there should be better rules concerning the analysis of that data. The new law for the secret services includes a separate rule for the analysis of data; that is not the case in the new law for the police. Why should less stringent rules apply?

Does digital investigation fit the confines of the law? (only in Dutch, 30.08.2017)
https://bof.nl/2017/08/30/past-de-digitale-opsporing-nog-wel-in-het-wetboek/

Case against Naoufal F.: test case for the justice system (only in Dutch, 29.08.2017)
https://www.nrc.nl/nieuws/2017/08/29/strafzaak-tegen-naoufal-f-testcase-voor-justitie-12720148-a1571435

(Contribution by Ton Siedsma, EDRi member Bits of Freedom, the Netherlands; Translation: Ludwine Dekker)

20 Sep 2017

Secret documents reveal: BND attacked Tor and advises not to use it

By Guest author

The German spy agency BND developed a system to monitor the anonymity network Tor and warned federal agencies that its anonymity is “ineffective”. This is what emerges from a series of secret documents published by the German Netzpolitik blog. The spies handed a prototype of this technology over to the US National Security Agency (NSA), in expectation of a favour in return.

The story begins a few weeks prior to the annual SIGINT Development Conference in 2008, when BND hackers “developed the idea of how the Tor network could be monitored relatively easily”, according to internal BND documents. In March 2008, the spy agency briefed its partners from the US and UK. During a visit of a foreign delegation to Munich, a BND unit presented “the anonymity network Tor and a possible disbandment of the anonymity feature”. In order to implement the plan, the BND hoped for “an international cooperation with several foreign intelligence agencies”.

Both the NSA and the UK Government Communications Headquarters (GCHQ) expressed “a high interest” and offered support. The three secret services decided on further meetings and the creation of a project group. The BND hackers told the NSA about “a possibility to penetrate the Tor network”, a term commonly used for the infiltration of IT systems. In this case, the data suggests that the spy agencies wanted to exploit a design decision that Tor had publicly specified.

Because of a lack of interest in the project within the BND, it was stated that “further development is primarily geared to the needs of the partner”, meaning the NSA. The proof of concept was already “a good status to talk to the experts of the Yanks”. While the BND hoped that their analysts could be “pushed” to work on Tor, their true goal was bigger. The BND wanted something from the NSA: a technology from the “field of cryptanalysis”, to decipher encrypted communication.

On 20 February 2009, a 16-page “concept for tracking internet traffic, which has been anonymized with the Tor system” was finalised. The cover is far from modest: a vegetable chopper over the logo – an onion – of the Tor network. Precisely how the BND planned to “chop” Tor is unfortunately described only in the redacted parts of the document Netzpolitik obtained. But to implement the attack, the BND probably ran its own servers in the Tor network, pointing to passive snooping servers presumably operated by the NSA; the concept emphasises the “protection of the anonymity” of the spy agencies.

Three weeks after the concept paper, the GCHQ was “very interested in the [BND’s] access to the Tor network”, says the internal report of a meeting at the BND headquarters. Both parties agreed to arrange further technical discussions and a “joint workshop on possible technical and operational procedures”. Five days later, the Americans accepted the BND’s offer of the concept paper – the NSA and GCHQ took over the project. Whether the BND received the compensation it hoped for remains unknown. When Netzpolitik confronted the BND with a set of specific questions, it received only the boilerplate answer: “As a matter of principle, the BND talks about operational aspects of its work only with the Federal Government and the competent authorities of Parliament.”

One and a half years later, the BND warned German federal agencies not to use Tor. The hacker unit “IT operations” entitled its report: “The anonymity service Tor does not guarantee anonymity on the internet”. According to the executive summary, Tor is “unsuitable” for three scenarios: “obfuscating activities on the internet”, “circumventing censorship measures” and “computer network operations for intelligence services” – spy agency hacking. The BND assumes “a very high level of surveillance within the network”, including the possibility that anyone can “set up their own so-called exit nodes for monitoring”.
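
The warning about exit nodes reflects a documented property of Tor’s design: the exit relay sees traffic exactly as it leaves the network, so anything not protected by end-to-end encryption is readable at that hop. A minimal sketch of what a hostile exit operator could do – entirely our illustration, using the scapy packet library, not anything described in the BND documents:

    # Purely illustrative: plain HTTP (port 80) crosses an exit relay in
    # cleartext, so ordinary packet capture is enough to log it.
    # Requires root privileges to sniff.
    from scapy.all import Raw, TCP, sniff

    def log_outgoing_http(pkt):
        # Print the first bytes of any unencrypted HTTP payload.
        if pkt.haslayer(TCP) and pkt.haslayer(Raw) and pkt[TCP].dport == 80:
            print(pkt[Raw].load[:80])

    # On a machine running an exit relay, this would observe the exiting
    # traffic of every circuit whose destination speaks plain HTTP.
    sniff(filter="tcp port 80", prn=log_outgoing_http, store=False)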

According to the BND, “Tor is predominantly used to conceal activities, where users are not convinced of the legality of their actions. The number of Tor users who aim at preserving anonymity out of mere privacy considerations is relatively small.” The BND bases this statement on “several pieces of intelligence”, but does not underpin it with any facts.

Netzpolitik reached out to several people from the Tor project, but nobody had any idea how the BND came up with this hypothesis. “That sounds like nonsense,” said IT security advisor Jens Kubieziel, who is a system administrator for the Tor project and runs large Tor exit nodes.

He added that spy agencies and other agencies worldwide “have ways to counter anonymity. One of them is to set up own Tor nodes and monitor those intensively to gather intelligence and evidence”. The spy agencies do not treat this as a secret: “Some agencies have already reported about installing their own Tor nodes and using the logged data for different projects and criminal investigations.”

Looking at the activities of the NSA and GCHQ, the BND’s concern might just be justified. Two years after the Germans presented their gift, the spy agencies continued their work on breaking Tor. The efforts of the British team are documented in the GCHQ’s internal wiki, published by the German magazine Der Spiegel from the Snowden archive.

Well-funded international spy agencies continue to refine their attacks. But the Tor community also continues to improve the project and fight off attacks – in close collaboration with the privacy research community. Project leader Roger Dingledine is sceptical as to whether spy agencies are able to make their attacks “work at scale”. Nevertheless, the documents show that “we need to keep growing the Tor network so it’s hard for even larger attackers to see enough Tor traffic to do these attacks.”

However, according to Dingledine, that is not enough: “We as a society need to confront the fact that our spy agencies seem to feel that they don’t need to follow laws. And when faced with an attacker who breaks into internet routers and endpoints like browsers, who takes users, developers, teachers, and researchers aside at airports for light torture, and who uses other ‘classical’ measures – no purely technical mechanism is going to defend against this unbounded adversary.”

This is a shorter version of an article by Netzpolitik:
https://netzpolitik.org/2017/secret-documents-reveal-german-foreign-spy-agency-bnd-attacks-the-anonymity-network-tor-and-advises-not-to-use-it/

(Contribution by André Meister, EDRi observer AK Zensur, Germany; Adaptation by Maren Schmid, EDRi intern)

20 Sep 2017

Did the EU Commission hide a study that did not suit their agenda?

By Maren Schmid

In 2013, the European Commission announced the launch of a study on copyright – and never published its results. Julia Reda, a Member of the European Parliament (MEP), tabled a freedom of information request on this issue and was eventually granted access to the study.

Even though the independent study was finalised in 2015 and financed by public funds, the European Commission failed to publish it. A possible reason might be that the key results do not serve the initial purpose of the study – to justify the plans to introduce stricter copyright legislation as part of the reform launched by EU Commissioner Günther Oettinger.

The study’s main focus was how online piracy affects the sales of copyrighted content in four different industries: music, films, books and games. Between September and October 2013, a representative survey was conducted with approximately 30 000 people from six EU Member States (Germany, France, Poland, Spain, Sweden, and UK).

One of the main conclusions of the study is that there is no robust statistical evidence of displacement of sales by online piracy. In other words, the study could not prove any negative effect of piracy on the sales of copyrighted content. In fact, the study even found a slight positive trend in the gaming industry, indicating that unauthorised playing of games eventually leads to paying for them.

The only partial exception is the film industry, where the consumption of ten pirated movies leads to four fewer cinema visits, and thereby to a loss of five percent of current sales volume. This might be due to the higher prices of films in comparison to music, books and games.
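
How the two reported figures relate can be made explicit with some back-of-the-envelope arithmetic. The 0.4 displacement rate comes from the study; the viewing volumes below are invented purely to show how such a rate can translate into a roughly five percent loss:

    # Illustrative arithmetic only - volumes are assumptions, not study data.
    displacement_rate = 4 / 10        # lost legal views per pirated view

    pirated_views = 125               # assumed pirated views (per 1 000 sold)
    legal_views = 1000                # assumed current sales volume

    lost_views = displacement_rate * pirated_views
    print(f"{lost_views / legal_views:.0%} of current sales volume")   # 5%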

Interestingly, these results concerning the film industry found their way into an academic paper by Benedikt Hertz and Kamil Kiljański, both members of the European Commission’s chief economist team. Yet the other unpublished results, showing no negative impact of piracy on the music, book and games industries, were not mentioned in the paper. Beyond that, the paper does not even refer to the original study.

This seems to substantiate the suspicion that the European Commission was hiding the study on purpose, cherry-picking the results it wanted to publish by choosing only those which supported its political agenda of stricter copyright rules.

We understand that the Commission says it is a complete coincidence that its decision to publish the study, a year and a half after it was finished, happens to coincide with Ms Reda’s freedom of information request. If this is the case, it would be a pity: having experienced delays, obstruction and obfuscation from the European Commission in response to freedom of information requests, we thought that this time it had at least acted in an appropriate, honest and timely manner in response to Ms Reda’s request.

Estimating displacement rates of copyrighted content in the EU
http://ted.europa.eu/TED/notice/udl?uri=TED:NOTICE:276982-2013:TEXT:EN:HTML&tabId=1

Access to documents request: Estimating displacement rates of copyrighted content in the EU (27.07.2017)
https://www.asktheeu.org/en/request/estimating_displacement_rates_of#incoming-14307

Estimating displacement rates of copyrighted content in the EU, Final Report
https://netzpolitik.org/wp-upload/2017/09/displacement_study.pdf

Movie Piracy and Displaced Sales in Europe: Evidence from Six Countries
https://papers.ssrn.com/sol3/Papers.cfm?abstract_id=2844167

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

(Contribution by Maren Schmid, EDRi intern)

20 Sep 2017

Should video-sharing platforms be part of the AVMSD?

By Maryant Fernández Pérez

The Audiovisual Media Services Directive (AVMSD) is currently being reformed. After going through several legislative stages, the AVMSD is now being negotiated in trilogues, that is, informal, secret negotiations between the European Parliament (representing citizens) and the Council (representing EU Member States), facilitated by the European Commission (representing EU interests). As part of the negotiations, a key question will have to be addressed: should some or all video-sharing platforms be covered by the AVMSD and, if so, how?

On the one hand, there are demands for holding video-sharing platforms like YouTube responsible for content (including legal content) that is published on their sites or apps, because of the impact online content has on public debate and our democracies. On the other hand, these platforms do not produce or publish content, but only host it. It would not make sense for the AVMSD to cover platforms that are so radically different from those the Directive was originally created to regulate – cross-border satellite TV services – as EDRi’s position paper, published on 14 September 2017, argues.

Video-sharing platforms, and social media generally, are not traditional media. While their activities influence (and even manipulate) the population, regulating video-sharing platforms as traditional media is not the solution to undesired impacts on our societies. When two services – linear broadcasting of editorially-controlled content and non-linear hosting of content produced by others – are significantly different, achieving a level playing field through a “one-size-fits-all” approach is not always possible. The consequences of getting it wrong can have a damaging effect on freedom of expression, competition, the fight against illegal material online and the protection of children in the online environment. At the Council meeting, seven Member States made unusually impassioned pleas to reject the proposed approach, mainly on grounds of freedom of expression. For these reasons, the deletion of the provisions that extend the scope of the AVMSD would be the most rational option, as EDRi’s position paper suggests.

Failing deletion, EDRi recommends clarifying the definitions of “video-sharing platforms” and “user-generated content”. In addition, EDRi’s position paper asks for more clarity when companies are asked to take action, in order to avoid abuses, ensure predictability and defend freedom of expression. For instance, some proposals on the table in the trilogue negotiations ask video-sharing platforms to restrict incitement to hatred based on political opinions or “any other opinions”. Asking platforms to delete hate speech based on “any other opinions” is likely to lead to arbitrary restrictions and affect how we express ourselves online. Another reason to be cautious is that certain provisions would ask these companies to take a “self-regulatory” role in the “moral” development of children. Do we really want companies to decide what is good for the “moral” development of our kids?

Fighting against illegal hate speech, terrorism and child abuse is very important. However, asking companies to decide what should or should not be acceptable in our society is worrisome. Numerous examples demonstrate that content is being restricted on video-sharing and social media platforms without accountability or real redress. Creating a situation where video-sharing platforms are forced to regulate more of our communications, and give themselves more leeway to decide what content we can or cannot access, regardless of what the law deems to be illegal, will not be beneficial for the EU.

EDRi position on AVMSD trilogue negotiations (14.09.2017)
https://edri.org/files/AVMSD/edriposition_trilogues_20170914.pdf

ENDitorial: AVMSD – the “legislation without friends” Directive? (14.06.2017)
https://edri.org/avmsd-the-legislation-without-friends-directive/

Audiovisual Media Services Directive reform: Document pool
https://edri.org/avmsd-reform-document-pool/

(Contribution by Maryant Fernández Pérez, EDRi)

20 Sep 2017

EDRi delivers paper on encryption workarounds and human rights

By Ana Ollo

On 12 September, EDRi published the position paper “Encryption Workarounds: a digital rights perspective”. It was published in response to the European Commission’s expert consultation exercise around the Encryption Workarounds paper by Orin Kerr and Bruce Schneier.

Encryption is probably the most effective way to protect our electronic data. It transforms data into ciphertext, which looks like a random set of characters. A user can decrypt the ciphertext – turn it back into its original form – using an encryption key or password. Even if encryption is hard or impossible to break, there are several workarounds available that could help access the data. These practices can facilitate police investigations, but they can also directly interfere with fundamental rights, which is why safeguards are needed. The European Commission has recognised in its recent cybersecurity strategy that encryption “enables protecting fundamental rights such as freedom of expression and the protection of personal data”.
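
The basic mechanics can be shown in a few lines – a minimal sketch using the Python “cryptography” library, with an invented message:

    # Symmetric encryption sketch: without the key the ciphertext is an
    # opaque blob; with it, the original data comes back intact.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # the secret everything depends on
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"meet me at 10")
    print(ciphertext)                    # looks like a random set of characters

    plaintext = cipher.decrypt(ciphertext)
    assert plaintext == b"meet me at 10" # recovered only with the key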

EDRi’s position paper describes the methods law enforcement authorities can use to access encrypted data within the framework of their investigations. The current policy applicable to these practices does not provide an adequate level of protection for fundamental rights, especially for the rights to privacy, personal data protection, free expression and due process. The techniques that law enforcement authorities can use to access encrypted data can be broadly divided into two approaches, each of which can be subdivided into different workarounds, as illustrated in the Kerr / Schneier paper:

  1. The first approach consists in obtaining the key: law enforcement authorities can (a) find the key through a physical search for a copy of it, which could be stored, for example, on a USB drive or on a scrap of paper; (b) guess the key by trying different keys until one of them works (sketched after this list); or (c) obtain the key from the user of the device, for example via social engineering or legal obligation.
  2. The second approach relates to accessing plaintext through bypassing the key. The most frequent workarounds proposed for this are: (a) exploiting a flaw or weakness in the system (or government hacking) after having discovered or purchased vulnerabilities within a legal framework based around human rights; (b) accessing plaintext when it is in use through either the installation of software or spyware, or conducting physical surveillance; and (c) locating a plaintext copy, maybe on another device or via another user.
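
As a concrete illustration of workaround 1(b), the following minimal sketch – our example: the password, wordlist and deliberately weak key derivation are invented – shows how data protected by a guessable password can be opened without ever breaking the encryption itself:

    # Guessing the key: if the key is derived from a weak password, trying
    # candidates from a wordlist until one decrypts is enough. The fast
    # SHA-256 derivation is deliberately naive - real systems use slow KDFs
    # precisely to make such guessing costly.
    import base64
    import hashlib
    from cryptography.fernet import Fernet, InvalidToken

    def key_from_password(password: str) -> bytes:
        digest = hashlib.sha256(password.encode()).digest()
        return base64.urlsafe_b64encode(digest)   # 32 bytes -> a Fernet key

    def guess_password(ciphertext: bytes, candidates: list) -> str | None:
        for password in candidates:
            try:
                Fernet(key_from_password(password)).decrypt(ciphertext)
                return password                   # decryption worked
            except InvalidToken:
                continue                          # wrong key: keep trying
        return None

    # Data "protected" by a weak password falls to a three-word wordlist.
    secret = Fernet(key_from_password("letmein")).encrypt(b"the plaintext")
    print(guess_password(secret, ["123456", "password", "letmein"]))  # letmein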

In EDRi’s position paper “Encryption Workarounds: a digital rights perspective”, we stress the importance of putting in place specific and strong safeguards for any planned encryption workaround. These protections are indispensable, especially concerning highly sensitive data on a decrypted device. However, safeguards related to traditional searches also need to be taken into account, particularly with regard to practices such as the use of CCTV in public places or the installation of a keylogger on a suspect’s device. Nowadays, governments are increasingly using hacking to access data. As research conducted by EDRi members such as Access Now has shown, there are no examples of governments respecting basic human rights principles among current instances of government hacking. Therefore, the paper recommends that government hacking be presumptively banned until adequate safeguards are in place.

The Commission, as “Guardian of the Treaties”, has to investigate and address this situation with regard to activities that fall under its remit, and to closely examine the issue of government hacking in particular. Current law and policy fail to provide adequate protection of fundamental rights – the need to assess and reform them is pressing.

EDRi’s paper Encryption workarounds: a digital rights perspective (12.09.2017)
https://edri.org/files/encryption/workarounds_edriposition_20170912.pdf

Encryption – debunking the myths (03.05.2017)
https://edri.org/encryption-debunking-myths/

Orin Kerr’s and Bruce Schneier’s paper Encryption workarounds (20.03.2017)
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2938033

European Commission’s communication Resilience, Deterrence and Defence: Building strong cybersecurity for the EU (13.09.2017)
https://ec.europa.eu/transparency/regdoc/rep/10101/2017/EN/JOIN-2017-450-F1-EN-MAIN-PART-1.PDF

(Contribution by Ana Ollo, EDRi intern)

20 Sep 2017

Cross-border access to data has to respect human rights principles

By Maryant Fernández Pérez

The Council of Europe has started preparing an additional protocol to the Cybercrime Convention – a new tool for law enforcement authorities (LEAs) to access data in the context of criminal investigations. Ahead of the first meeting of the Drafting Group, EDRi coordinated a civil society submission, signed by 14 organisations from around the globe, on how to protect human rights when developing new rules on cross-border access to electronic evidence (“e-evidence”). On 18 September in Strasbourg, EDRi’s Executive Director Joe McNamee handed over the comments and suggestions on the terms of reference for drafting a new protocol.

The Council of Europe welcomed the global civil society submission. Receiving the submission on behalf of the Council, Alexander Seger, the Council of Europe’s anti-cybercrime coordinator, stated:

Clear rules and more effective procedures are required to secure electronic evidence in the cloud in specific criminal investigations. Otherwise, governments will not be able to meet their obligation of protecting the rights of individuals and ensuring the rule of law in cyberspace.

The main instruments to facilitate cross-border access to data are the Mutual Legal Assistance Treaties (MLATs). However, they are criticised for being slow and inefficient. This has led countries like the United States and the United Kingdom to enter into bilateral arrangements that do not fix the MLAT system, but create additional problems.

Making MLATs more efficient ought to be the number one priority, and this was part of the agenda of the last plenary meeting of the Cybercrime Convention Committee of the Council of Europe. Nonetheless, the forthcoming protocol cannot be used as a way to create parallel processes that could worsen the current situation for human rights. As stated in the civil society submission, if the MLATs are made efficient and effective, a new protocol could outline procedures under which authorities in any state can obtain access to servers and devices in another state in full compliance with three basic principles:

  1. Enforcement of jurisdiction by a State or State agency on the territory of another State cannot happen without the knowledge and agreement of the targeted State.
  2. State-parties must comply with human rights principles and requirements, including under any powers granted or envisaged in or under the Cybercrime Convention and the proposed additional protocol.
  3. Unjustified forced data localisation should be banned. Data transfers between jurisdictions should not occur in the absence of clear data protection standards.

Harmonisation in this area should be done “upwards”, keeping the highest standards of safeguards and practices, not downgrading them to the lowest common level. This means that proposals such as the recent US legislative proposal to require direct “cooperation” from companies – bypassing key legal safeguards and increasing surveillance powers, to the detriment of our rights and freedoms – would not be acceptable.

Global Civil Society Submission to the Council of Europe: Comments and suggestions on the Terms of Reference for drafting a Second Optional Protocol to the Cybercrime Convention (08.09.2017)
https://edri.org/files/surveillance/cybercrime_2ndprotocol_globalsubmission_e-evidence_20170908.pdf

Cross-border access to data: EDRi delivers international NGO position to Council of Europe (18.09.2017)
https://edri.org/cross-border-access-data-edri-delivers-international-ngo-position-council-europe/

New legal tool on electronic evidence: Council of Europe welcomes civil society opinion (18.09.2017)
http://www.coe.int/en/web/portal/-/new-legal-tool-on-electronic-evidence-council-of-europe-welcomes-civil-society-opinion

US cross-border data deal could open surveillance floodgates (18.09.2017)
https://www.opendemocracy.net/digitaliberties/cynthia-wong/us-cross-border-data-deal-could-open-surveillance-floodgates

(Contribution by Maryant Fernández Pérez, EDRi)

18 Sep 2017

FnF 2017: Register for travel support until 22 September!

By EDRi

Freedom not Fear, an annual barcamp organised by EDRi member Digitalcourage for European digital rights activists, takes place in Brussels on the weekend of 6-9 October 2017. The event takes place at Mundo B in Brussels and is open to everyone – no registration is needed to attend. Participants who extend their stay to visit the European Parliament on 9 October can receive an allowance for their travel and accommodation costs, thanks to several Members of the European Parliament. To receive those travel allowances, registration is required. Registrations are managed by Digitalcourage.

The barcamp depends heavily on the participants and their contributions – if you want to organise a workshop or a talk, you can create an account at the FnF wiki and enter your session. (If you have problems signing up at the wiki, please e-mail contact@freedomnotfear.org for support!)

Some sessions have been proposed already. Here is a selection of what is on offer so far: “Data retention in the EU via ePrivacy – What are Estonia, Bulgaria & Austria up to?”, “Offline tracking: WiFi, face recognition & co.”, “Rhetorics of (governmental) surveillance – how do we convince people that freedom is nicer?”. But there is plenty of room for more! For further details, check the schedule on the wiki.

Registration is open until 22 September at https://aktion.digitalcourage.de/fnf2017, and you can find more details, including the links to the wiki, on the FnF website at https://www.freedomnotfear.org.

18 Sep 2017

Cross-border access to data: EDRi delivers international NGO position to Council of Europe

By EDRi

Today, 18 September 2017, a global coalition of civil society organisations, led by European Digital Rights (EDRi), submitted to the Council of Europe its comments on how to protect human rights when developing new rules on cross-border access to electronic evidence (“e-evidence”). The Council of Europe is currently preparing an additional protocol to the Cybercrime Convention. EDRi’s Executive Director Joe McNamee handed the comments over to Mr. Alexander Seger, the Executive Secretary of the Cybercrime Convention Committee (T-CY) of the Council of Europe.

Joe McNamee, Executive Director of EDRi presents Alexander Seger with his contribution on the forthcoming Cybercrime Protocol. (Photo: Candice Imbert / Council of Europe)

Over the next two and a half years, the work on the new protocol needs to incorporate the civil society principles presented today,

said Joe McNamee, Executive Director of European Digital Rights.

Global civil society is engaging in this process to ensure that any harmonisation in this crucial policy area is up to the highest human rights standards, in line with the ethos of the Council of Europe,

he added.

We are a group of 14 civil society organisations from around the world. We submitted to the Council of Europe our comments and suggestions on the Terms of Reference for drafting a Second Protocol to the Cybercrime Convention. Our aim is to make sure that human rights are fully respected in the preparation of the new protocol. In this global submission, we emphasise the importance of an inclusive, open and transparent drafting process. To facilitate the work of the Council of Europe and the State-Parties, we have elaborated key principles that will guide the work of the Drafting Group and allow us to engage constructively in the coming two and a half years.

It is vital that the new protocol, if adopted, include and respect three basic principles:

  1. Enforcement of jurisdiction by a State or State agency on the territory of another State cannot happen without the knowledge and agreement of the targeted State.
  2. State-parties must comply with human rights principles and requirements, including under any powers granted or envisaged in or under the Cybercrime Convention and the proposed additional protocol.
  3. Unjustified forced data localisation should be banned. Data transfers between jurisdictions should not occur in the absence of clear data protection standards.

We remain open to working with other civil society organisations in integrating these principles.

Background information:

Electronic evidence (“e-evidence”) refers to digital or electronic evidence, such as contents of social media, emails, messaging services or data held in the “cloud”. Access to these data is often required in criminal investigations. Since in the digital environment the geographical borders are often blurred, investigations require cross-border cooperation between public authorities and between public authorities and the private sector.

The new optional protocol aims to address three areas of activity:

  1. the direct gathering of electronic evidence online by law enforcement agencies in one State, from ICT infrastructure and devices in another State;
  2. closer cooperation between designated bodies in different states in relation to cross-border investigations and transnational collecting of evidence;
  3. the direct requesting and obtaining of possibly highly sensitive personal information by law enforcement agencies in one State from private sector companies in another State, without the knowledge or consent of the latter country, bypassing its laws and potentially violating its sovereignty.

Read more:

New legal tool on electronic evidence: Council of Europe welcomes civil society opinion (18.09.2017)
http://www.coe.int/en/web/portal/-/new-legal-tool-on-electronic-evidence-council-of-europe-welcomes-civil-society-opinion

Global Civil Society Submission to the Council of Europe: Comments and suggestions on the Terms of Reference for drafting a Second Optional Protocol to the Cybercrime Convention (08.09.2017)
https://edri.org/files/surveillance/cybercrime_2ndprotocol_globalsubmission_e-evidence_20170908.pdf

Access to e-evidence: Inevitable sacrifice of our right to privacy? (14.06.2017)
https://edri.org/access-to-e-evidence-inevitable-sacrifice-of-our-right-to-privacy/

EDRi position paper on the Cybercrime Convention – cross-border access to electronic evidence (17.01.2017)
https://edri.org/files/surveillance/cybercrime_accesstoevidence_positionpaper_20170117.pdf

EDRi letter to the Council of Europe on the report of the T-CY Cloud Evidence Group (2016)5 (10.11.2016)
https://edri.org/files/surveillance/letter_coe_t-cy_accesstoe-evidence_cloud_20161110.pdf

Professor Douwe Korff’s comments on the T-CY report (2016)5 (09.11.2016)
https://edri.org/files/surveillance/korff_note_coereport_leaaccesstocloud%20data_final.pdf
