28 Jun 2017

Proposed Copyright Directive – Commissioner confirms it is illegal

By Joe McNamee

At a meeting of the European Parliament Committee on Legal Affairs (JURI) on 19 June, European Commission Vice-President Andrus Ansip made a statement that was both shocking and shockingly honest. He advertised the content filtering product of the US company Audible Magic as an affordable alternative to Google’s Content ID filtering technology for filtering European citizens’ uploads to the internet.

He explained, entirely incorrectly, that the cost of “Audible Magic” – for an internet hosting company of unspecified size and unspecified activities – was “only 400 bucks”. This is not only a tiny fraction of the actual cost, but also a fraction of the 900 euro figure in the Commission’s own impact assessment. That, in turn, is only a fraction of the service fees listed on the website of Audible Magic, which are in turn only a fraction of the real costs shown by a study on the shortcomings of content detection tools.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Commissioner Ansip, also entirely incorrectly, gave the impression that the obligation in the Copyright Directive to filter content only applies to audio or audiovisual content. In reality, the obligation to filter all uploads covers everything that can be protected by copyright, including text, pictures and other protected works.

The confirmation of the real meaning of the Copyright Directive contradicts much of the work of copyright lobbyists, who have been “explaining” to the European Parliament that the Commission proposal does not require filtering or monitoring of citizens’ communications.

However, the most important point is that Ansip explicitly admitted that Audible Magic would enable the providers to fulfil their obligations under Article 13 of the proposed Directive. In the Scarlet/Sabam case, the Court of Justice of the European Union (CJEU) explicitly prohibited a legal requirement for internet access providers to use Audible Magic, on the basis that it would be a breach of the right to privacy, of the freedom to conduct a business, and of the freedom to receive and impart information. The Court worried in particular about the reduction of the right to use copyright exceptions and limitations.

In the Netlog/Sabam case, the CJEU again looked at mandatory filtering, this time for a hosting service (a social network). It again ruled that this kind of obligation for filtering uploads was not acceptable, with reasoning that was broadly identical to the Scarlet/Sabam case.

The question now is: who benefits from a Directive that is demonstrably illegal? It hardly helps rightsholders to have a Directive that won’t survive court scrutiny, and it certainly does not help citizens. The only beneficiary might be the Commission’s ambition to coerce internet companies into “voluntary” privatised law enforcement measures – legal uncertainty is a blunt but effective tool for making providers “an offer they can’t refuse”.

There are some things money can’t buy. For everything else, there’s Audible Magic! (21.06.2017)

Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market (14.09.2016)

Deconstructing the Article 13 of the Copyright proposal of the European Commission, revision 2

Copyright reform: Document pool

Scarlet v SABAM: a win for fundamental rights and Internet freedoms (30.11.2011)

SABAM vs Netlog – another important ruling for fundamental rights (12.02.2012)

(Contribution by Joe McNamee, EDRi)



28 Jun 2017

An end to copyright blackmail letters in Finland?

By Heini Järvinen

On 12 June, the Finnish Market Court ruled in the case Copyright Management Services Ltd vs. DNA Oyj that Internet Service Providers (ISPs) are not obliged to hand over their clients’ personal data based only on the suspicion of limited use of peer-to-peer networks. Stronger proof of significant copyright infringement needs to be presented in order to obtain the data.


Law firms have been sending letters demanding payments as damages for the distribution of copyright-protected content, and threatening the people suspected of copyright infringement with legal proceedings. The ruling will put an end to this practice.

The Finnish Market Court has previously interpreted even the distribution of minor amounts of data in peer-to-peer networks as a “significant copyright infringement”. However, thanks to the case law of the Court of Justice of the European Union (CJEU), the court has now changed its interpretation. The CJEU has emphasised in its recent rulings that when evaluating the significance of the infringement, the concrete harm caused by the distribution done through a single IP address has to be taken into account.

The compensation claim brought before the court was based on approximately a thousand observations of cases in which films had been made available in a BitTorrent peer-to-peer network. The court did not consider these cases to constitute a “significant amount”, because it was not possible to draw conclusions on the repetitiveness and duration of the sharing, the number of distributed works, or the concrete impact on other peer-to-peer users.

The seven judges unanimously refused to oblige the ISP to hand over its clients’ personal data. Another important aspect of the decision was that the burden of proof for a “significant copyright infringement” was placed on the plaintiff, not the defendant.

On the other hand, on 14 June 2017, the Market Court gave its decision in the case Copyright Management Services Ltd vs. Elisa Oyj, another Finnish ISP. The court stated in its decision that the ISP is obliged to retain its clients’ data for the purpose of releasing it later. The decision, however, emphasised that the purpose of retaining the data is not to grant the plaintiff access to it, but to avoid the loss of the data until the possible release. This requirement to store consumer data is hard to reconcile with two Court of Justice of the EU rulings prohibiting suspicionless retention of communications data (the Digital Rights Ireland case and the Tele2 ruling) and one explaining the requirement to have a specific law when imposing restrictions such as data retention (the Bonnier Audio case).

Finnish Parliament argued over the copyright initiative (21.05.2014)

Finland: Common Sense in Copyright Law (24.04.2013)

Finnish Big Brother Award goes to intrusive loyalty card programme (07.09.2017)

Copyright letters facing headwinds – Market Court changed its line (only in Finnish, 12.06.2017)

Farewell to the blackmail letters? Market Court decision makes it more difficult to claim compensation from peer to peer users (only in Finnish, 15.06.2017)

Lawyers are sending blackmail letters to ask for compensation for downloading TV series and movies – “It’s useless to ask a lawyer about morals” (only in Finnish, 19.01.2017)

(Contribution by Heini Järvinen, EDRi)



28 Jun 2017

The Freedom Index – easing access to information on rights issues


A diverse group of human rights defenders in the EU has launched an ambitious project that aims to radically change the way information relating to human rights is organised. If successful, the initiative will create a system that can permanently identify and preserve all human rights data across all languages, and radically improve its availability to anyone who is working on the same issues in the future.


The new project, “the Freedom Index”, is a universal indexing concept designed to catalogue all data relating to human rights and fundamental freedoms. This system of unique 12-digit numeric codes will substantially increase the visibility, longevity and effectiveness of information published in all languages.

The system (known as a “taxonomy”) assigns codes to audio and visual material, reports, judicial decisions, websites, legislation, articles, blogs, forums, research material… It is best imagined as a library index system, though instead of identifying individual works, the new index is centred on microscopically identifying subject matter – approximately a billion sub-divisions of the human rights field. The system will create an unprecedented linkage of information. This means that a civil rights group in, say, France, working on a campaign against online content filtering, will be able to identify information on every similar campaign in the world, regardless of the language of the published material. Presently, most data published in languages other than English is almost invisible to search engines.
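A toy sketch of how such a prefix-structured numeric taxonomy could work – the codes and subject labels below are invented for illustration, not actual Freedom Index assignments:

```python
# Invented 12-digit codes: a shared prefix denotes a shared sub-division
# of the human rights field, regardless of the language of the material.
index = {
    "304112078001": "content filtering / campaigns / France (material in French)",
    "304112078002": "content filtering / campaigns / Germany (material in German)",
    "304112090001": "data retention / litigation / Denmark (material in Danish)",
}

def related(prefix):
    """Return every catalogued item under one sub-division of the taxonomy."""
    return {code: subject for code, subject in index.items()
            if code.startswith(prefix)}

# A campaigner on content filtering finds all similar campaigns at once,
# across languages, by querying the sub-division's prefix:
print(sorted(related("304112078")))  # ['304112078001', '304112078002']
```

The key design idea this illustrates is that discovery becomes a structural prefix lookup rather than a keyword search, so material in any language surfaces under the same code.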

The Index is the brainchild of privacy pioneer Simon Davies, founder of Privacy International and the Big Brother Awards. He collaborated on the project with other experts, namely Annie Machon, former UK Security Service MI5 officer turned whistleblower, and Robert Beens, the CEO of privacy enterprise StartPage/Ixquick.

Consultation and planning for the project began in early 2015 and included collaboration with the European Commission, Amnesty International, the Ford Foundation, the University of Amsterdam and the EU Agency for Fundamental Rights (FRA). The initiative has now been formally established in the Netherlands as a non-profit foundation (a Stichting).

Annie Machon, a Board member of the Freedom Index, says the project is essential to the functioning of civil society: “I firmly believe the Index is revolutionary – and that’s a description I rarely if ever use. One aspect that particularly interests me is the removal of linguistic barriers. We’re quickly heading toward a data meltdown and some hard choices need to be made to ensure that information can be protected and preserved.”

“I became involved in this project because I think it is essential for the development of the information age and it is crucial to the growth and integrity of human rights. If we can’t find a way to organise information, we all risk being held hostage to fortune from big commercial enterprises and for crucial information to be ‘disappeared’.”

Simon Davies described the current data problem for human rights as “systemic and dangerous”.

“Each day, millions of new items relating to human rights are added to the online environment. And yet, while this vast reserve of data continues to grow and diversify, its accessibility and visibility are shrinking. For a variety of reasons, information moves out of public reach almost as fast as it is added. Generally speaking, data are ‘dumped’ into cyberspace, and publishers tend to expect that search engines will reliably present the information on demand. This outcome is far from reality.”

“The core problem in the current online environment is that – in all but the most exceptional cases – the conduits to online information have become corrupted or obscured to the point where using conventional search has little effect. It’s fair to say that to find the ‘less popular’ items, the searcher needs either to have prior knowledge of a piece of data or access to a dedicated portal before that item can be discovered. In the vast majority of cases, neither of those conditions exist.”

Freedom Index

The world’s most powerful human rights data initiative is unveiled (24.11.2015)

Stichting the freedom index foundation registration details



28 Jun 2017

Denmark allows massive retention of location data for mobile internet

By IT-Pol

On 24 May 2017, the Danish telecom regulator announced its decision concluding that the retention of location data for mobile internet usage is lawful. The decision permits massive data retention that seriously undermines citizens’ right to privacy, since it means that their movements can be tracked at all times and the data stored for later access.

Under the Danish data retention law, mobile communications service providers must retain location data (cell ID) for telephone calls and SMS/MMS messages. There is no requirement to retain location data in connection with mobile internet usage. Smartphones generate internet traffic more or less constantly, even when the device is not actively used, for example through updates from social media services. Therefore, a formal obligation or informal practice to retain location data for internet traffic effectively means that a citizen’s every movement in physical space is registered and stored for a long period (12 months in Denmark).


The e-Privacy Directive 2002/58 only allows providers of electronic communication services to retain traffic data, including location data, without consent from the subscriber if the data is required for billing or if there is a data retention requirement in national law. Location data for mobile internet traffic is not needed for billing, and there is no specific data retention requirement for this data in Denmark. The logical assumption would be that Danish mobile operators are not allowed to retain this information, even if they wanted to do so voluntarily for commercial reasons. However, in a somewhat surprising decision of 24 May 2017, the Danish telecom regulator concluded that the retention of location data for internet traffic is lawful.

A Danish citizen discovered, through a subject access request under the Data Protection Act, that his mobile operator retained a substantial amount of location data for internet traffic. In February 2016, this citizen filed a complaint with the Danish Business Authority, the telecom regulator responsible for the enforcement of the data protection rules of the e-Privacy Directive.

In its response to the complaint case, the mobile communications service provider TDC confirmed that location data is stored for so-called “state changes” in the network, which include the start and end of an internet session, the passing of 60 minutes of an uninterrupted session, the crossing of a certain traffic volume, and changes between different radio technologies (2G, 3G and 4G). TDC argued that this practice is necessary in order to comply with the data retention requirement for MMS traffic, where the cell ID of sent and received messages must be retained. In the TDC mobile network, MMS messages are sent as data traffic, and the MMS traffic cannot be separated from the ordinary internet traffic. The cell IDs for internet traffic are retained based on pre-defined criteria related to data and network usage patterns, so the actual cell ID used when sending or receiving an MMS message is not directly available.

When law enforcement seeks access to communications metadata for a subscriber, TDC will match timestamps for MMS messages with the closest timestamp for the retained cell IDs for internet traffic in order to generate approximate cell IDs for MMS traffic. Law enforcement can also seek access to the full location data for internet traffic. Under Danish law (the Administration of Justice Act), law enforcement access to mobile location data, even if detailed in a way that it effectively records every movement of the citizen, is not restricted to investigation and prosecution of serious crime. Any offence that is subject to public prosecution is a legal ground for access to location data by the police. TDC was asked by the Danish Business Authority whether it would be possible to crosslink the cell IDs with MMS traffic immediately after collection and erase the records which are not related to MMS traffic. TDC responded that this procedure would compromise the data quality since the original location data (described as “raw data”) is no longer available.
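As a rough illustration of that matching step – with entirely hypothetical data and field names, not TDC’s actual system – approximating an MMS cell ID amounts to a nearest-timestamp lookup in the retained internet-traffic records:

```python
from bisect import bisect_left

# Hypothetical retained records: (unix timestamp, cell ID) pairs for internet
# "state changes", kept sorted by time.
retained = [
    (1000, "cell-A"),  # start of an internet session
    (4600, "cell-B"),  # logged after 60 minutes of uninterrupted session
    (5200, "cell-C"),  # logged at a 3G -> 4G handover
]

def approximate_cell(mms_timestamp, records):
    """Return the cell ID of the retained record closest in time to the MMS."""
    times = [t for t, _ in records]
    i = bisect_left(times, mms_timestamp)
    # The closest record is one of the two neighbours of the insertion point.
    candidates = records[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda r: abs(r[0] - mms_timestamp))[1]

print(approximate_cell(4700, retained))  # cell-B, only 100 seconds away
```

This also shows why the result is only approximate: the subscriber may have moved to a different cell between the last retained state change and the actual MMS transmission.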

The Danish Business Authority also asked the Ministry of Justice for an opinion on the interpretation of the Danish data retention rules. According to the Ministry of Justice, the obligation to retain location data (cell ID) for MMS traffic applies even if the mobile network is designed so that location data for other traffic types will have to be retained as well. This broad interpretation is hard to reconcile with data retention being an exception to the main rule in the e-Privacy Directive of erasure of traffic data. The Danish data retention law includes a provision similar to Article 1(1) and recital 13 of the now annulled Data Retention Directive 2006/24. The Directive limited the retention requirement to traffic data that is accessible (generated or processed) when supplying a communication service. In the present case, it could certainly be argued that location data for the MMS communication service is not accessible for the provider, especially as the procedure followed by TDC does not necessarily deliver the actual cell ID from which an MMS message is sent or received.

Based on the information received from TDC and the Ministry of Justice, the Danish Business Authority decided that the retention of location data for internet traffic by TDC is not in violation of the Danish law transposing the e-Privacy Directive. Retaining this data can be allowed, since there is a retention requirement for MMS traffic, and it would be disproportionate to require that TDC modifies its systems so that MMS and internet traffic are physically separated in the mobile network. In this regard, the Danish Business Authority accepted the argument from TDC that erasing the internet location records not related to MMS traffic – most likely all but a small fraction of the total set of location data – would compromise the traffic data that can be made available to law enforcement. The legal basis for this part of the decision seems somewhat questionable since the data retention law has no provisions on data quality or documentation for the retained data. All retained traffic data is presumably filtered or processed from a larger pool of traffic data that only exists temporarily in the network.

In the proportionality assessment of the decision, the Danish Business Authority also took into account that a revision of the Danish data retention rules is being planned, and that the Ministry of Justice intends to propose new requirements to retain location data for internet traffic. The decision mentions a pre-draft proposal for retention of location data for internet traffic which, coincidentally, is very close to what TDC is currently doing of its own accord. However, this preliminary proposal by the Ministry of Justice for blanket retention of location data for internet traffic predates the Tele2 judgment of 21 December 2016, in which the Court of Justice of the European Union (CJEU) clearly ruled that a blanket data retention requirement is illegal under European Union law. In March 2017, the Ministry of Justice accepted that the Danish data retention law would have to be changed as a consequence of the CJEU judgment. While a targeted data retention scheme could potentially include new requirements on location data for internet traffic, the overall setup would have to be distinctly different from the current practices of TDC, which are based on retention of location data for all subscribers.

Decision by the Danish Business Authority on the processing and storage of mobile location data by TDC (only in Danish, 24.05.2017)

EDRi: Denmark: Our data retention law is illegal, but we keep it for now (08.03.2017)

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)



28 Jun 2017

Are we on the right track for a strong e-Privacy Regulation?

By Diego Naranjo

European legislation protecting your personal data (the General Data Protection Regulation and Law Enforcement Directive on Data Protection) was updated in 2016, but the battle to keep your personal data safe is not over yet. The European Union is revising its legislation on data protection, privacy and confidentiality of communications in the digital environment: the e-Privacy Directive. This piece of legislation contains specific rules related to your freedoms online.

In today’s interconnected societies, the way we frame technology defines if we are able to ensure the privacy of our most intimate conversations and thoughts. If the policy-makers fail to achieve this and end up with a vague text full of loopholes because of political “compromises”, it will have a far-reaching impact on our online freedoms for the decades to come.


In January 2017, the European Commission launched the reform of the e-Privacy legislation by proposing a harmonised framework. The text needs improving. Tracking walls and offline tracking should be banned, and encryption should be ensured, among other issues. Despite the flaws of the proposal, there are also positive aspects to build on.

On 9 June 2017, the lead committee of the European Parliament in charge of the dossier for the e-Privacy reform, the Committee on Civil Liberties (LIBE), published its draft Report on the e-Privacy Regulation, including amendments to the Commission’s original proposal. Marju Lauristin, the Parliamentarian in charge of the file for LIBE, has shown great determination to improve the protection of citizens’ privacy by proposing numerous positive changes to the Commission’s text. The changes proposed will help to ensure legal certainty by limiting the ways data can be used (strict grounds for processing), broadening the type of trackers that will be regulated (not only “cookies”), and reinforcing users’ rights by promoting end-to-end encryption without backdoors. Ms Lauristin also proposed introducing a household exception similar to the one in the General Data Protection Regulation (GDPR), in order to make certain that accessibility tools are not unintentionally restricted by the legislation. In addition, the draft Report broadens the scope to include the protection of employees from surveillance by their employers, and adds the possibility of collective redress for organisations.

However, the text could have gone one step further: the absence of stronger language opposing offline tracking in the proposed amendments is regrettable. It is difficult to imagine how consent in those situations (one’s movements being tracked through Bluetooth or WiFi while wandering around a town or a shopping centre) can be informed, how data could be meaningfully anonymised, and how opt-out would work without excluding users of certain services.

The LIBE Committee put forward a stronger text than the original proposal. It remains to be seen, however, whether strong opponents of the e-Privacy Regulation, such as the Rapporteur of the Committee on Legal Affairs (JURI) Axel Voss, will succeed in undermining the key elements of the text. Only a few Member States seem to have a strong position on this dossier, which makes it even harder to guess what the final result of this reform will look like. Member States have been heavily lobbied by the most regressive parts of the online industry for years. This resulted in fourteen Member States calling for “the right balance between digital products and services and the fundamental rights of data subjects” – as ridiculous as it seems to demand a balance between “products” and fundamental rights.

A lot of work still needs to be done to keep the best parts of the proposals, and to avoid the amount of disharmony and “flexibility” we ended up with in the GDPR. The way we communicate with others, and the way our interconnected devices work, will depend greatly on the outcome of this new Regulation. Will the EU set the standards of protection high enough? The coming months will give us an answer.

Draft Report on the e-Privacy Regulation of the Committee on Civil Liberties, Justice and Home Affairs (LIBE) (09.06.2017)

e-Privacy revision: Document pool

Your privacy, security and freedom online are in danger

EDRi’s proposal for amendments to proposal for the e-Privacy Regulation

(Contribution by Diego Naranjo, EDRi)



14 Jun 2017

#ALTwitter privacy revelation: European parliamentarian goes bananas

By Guest author

Recently, Mr Dunston (of “Dunston Checks In” fame) came to the EDRi Brussels office looking for help. He complained that somebody from the European Parliament was messing with his “holy banana collection”, which he has been preserving for decades, ever since he inherited it from his forefathers. Other than that, we had no information.

As defenders of human rights in the digital environment, we decided to help Mr Dunston. Coincidentally, we were working on a project called ALTwitter, in which we had created Twitter-like profiles of the Members of the European Parliament (MEPs) based on their metadata. We thought: for once, let’s use metadata for social good.

Here is what we did:

Step 1: Data collection
We collected approximately 10,000 publicly available tweets from the Twitter accounts of 617 MEPs.

Step 2: Metadata extraction
We extracted the metadata associated with these tweets – such as the source of the tweet, i.e. the device or service from which the tweet originated – for further analysis.

Step 3: Metadata analysis
We counted the number of times each of those devices or services was used by MEPs. Then we ranked them by how frequently they had been used.

Step 4: Finding the anomaly or unique artifact
We then selected the few least commonly used devices or services, in order to find the sources used by only a handful of MEPs.

Step 5: Finding the culprit
We were surprised to see “Banana Kong” among the rarely used sources of tweets from MEPs. Apparently, it was used by only one MEP, on her Apple (iOS) phone: none other than Angelika Niebler.

Step 6: Helping Dunston with the proof
Because we had seen this information in Ms Niebler’s metadata, we searched her timeline to see if she had ever mentioned her surprising pastime. Sure enough, evidence of Ms Niebler’s banana enthusiasm came to light.

(I’ve just reached 90 meters in “Banana Kong”. Download it from the App Store and try to beat me!)
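The counting and anomaly-spotting in steps 3 to 5 can be sketched in a few lines of Python; the tweets below are invented for illustration, not the real MEP data:

```python
from collections import Counter

# Invented sample data standing in for the collected tweets (steps 1 and 2):
# each record keeps only the metadata field we care about, the tweet's source.
tweets = [
    {"user": "mep_a", "source": "Twitter for iPhone"},
    {"user": "mep_a", "source": "Twitter Web Client"},
    {"user": "mep_b", "source": "Twitter for iPhone"},
    {"user": "mep_b", "source": "Twitter Web Client"},
    {"user": "mep_c", "source": "Banana Kong"},  # the anomaly
]

# Step 3: count how often each source appears.
source_counts = Counter(t["source"] for t in tweets)

# Step 4: keep the sources used only once -- the candidate anomalies.
rare_sources = [s for s, n in source_counts.items() if n == 1]

# Step 5: map each rare source back to the MEP(s) who used it.
culprits = {s: {t["user"] for t in tweets if t["source"] == s}
            for s in rare_sources}
print(culprits)  # {'Banana Kong': {'mep_c'}}
```

The same pattern scales to the full dataset: the rare tail of the source distribution is where unusual apps, and thus unusual metadata leaks, show up.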

Angelika has been stealing Mr Dunston’s bananas since August 2013, and she owes him compensation – big time. We would never have found this proof if metadata hadn’t pointed us in the right direction. This is the same metadata that advertisers use to target her with more personalised ads, track her online activities, and undermine her privacy online, and possibly offline. And our privacy, too.

When signing up to use the app, Ms Niebler agreed, as a prominent Member of the European Parliament, to share a variety of personal information, including her device identifier, geo-location information and IP address data with the game supplier and fourteen other companies, mainly based in the United States.

We suggested to Mr Dunston that he should take legal action against Ms Niebler for banana theft. But, he says:
“Listen! I am a nice orangutan. I don’t need any monetary compensation, but I want her and every other MEP to understand the importance of privacy. Today it is my banana, tomorrow it could be yours. If not Ms Niebler, someone else will steal it. In fact, the advertisers have been already stripping our online privacy, with or without our knowledge. It’s time to put an end to this! Let’s try to understand why privacy matters and let’s defend it! Let’s help the parliamentarians to do the same! That is the best compensation I would expect.”

We believe that his demands are fair. If you agree, join us on our mission to defend everyone’s digital rights! We want to convince Ms Niebler and other MEPs to vote right on the e-Privacy Regulation, to make sure it guarantees privacy by design and by default for our online communications. We want to make sure that no one can be refused access to information because they oppose being tracked (no “tracking walls”), that groups can act on behalf of citizens when an infringement has occurred, and that tracking can never be the default. Find out more about e-Privacy here!


ALTwitter #hakunametadata: Twitter metadata profiles of the Members of European Parliament

ALTwitter: The treasure trove behind 140 characters (31.05.2017)

Hakuna Metadata – Let’s have some fun with Sid’s browsing history! (03.05.2017)

Hakuna Metadata – Exploring the browsing history (22.03.2017)

New e-Privacy rules need improvements to help build trust (09.03.2017)

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)



14 Jun 2017

Running an algorithmic empire: The human fabric of Facebook

By SHARE Foundation

Facebook, the most popular social networking platform, has undoubtedly become one of the most influential entities in our networked world. As SHARE Lab and researchers previously explained, Facebook can be seen as a huge, global factory of immaterial labour in which its users have basically one role – to churn out as much personal data as possible. However, Facebook Inc., the company which owns this huge immaterial factory, is (still) run by human beings. Those human beings’ connections, both inside the company itself and with other parts of society, such as the IT industry, government and civil society organisations, are the topic of the most recent research by SHARE Lab and Tactical Tech.


Facebook is understood as an “uber-collective” with non-transparent decision-making concerning the rules, data exploitation and privacy, development, user freedoms, and various kinds of censorship. This analysis helps us realise why this is the only way a company like Facebook can exist. In order to visualise the connections of Facebook’s management – its board of directors and advisors, and two executive levels – SHARE Lab used publicly available information provided by The Official Board and Crunchbase websites. Based on the official biographies, the educational and professional background of every person on these lists was analysed. Also, publicly accessible data from the LinkedIn network was used to find out more about the Facebook employees, for example their educational background, country of origin and employment history.

The results were quite interesting: most people now working for Facebook, even at the executive level, previously worked for the competition – Google, Amazon, Microsoft and Yahoo. This could be seen as a risk for the “industry ecosystem”, since these circles seem to be rather closed. The vast majority of them, as expected, live in the San Francisco Bay Area, while the second most important city for Facebook is London. One of the conclusions of the research was that Facebook as an employer mostly recruits people who graduated from US universities. This means that despite acting globally, the company does not see the need to reflect the structure of its users around the world. The dominance of US-based education is obvious both in the managing board and among the employees.

Sketching out the social structure of a large company such as Facebook is important for understanding the impact of this global social network on society, on local and global economies, and on civil freedoms. It is also crucial for understanding how the development of high-end technology and communication infrastructures intertwines with the accumulation of capital and political power.

This article is also available in German at https://netzpolitik.org/2017/wie-man-ein-imperium-der-algorithmen-beherrscht/.

SHARE Lab: The Human Fabric of the Facebook Pyramid (03.05.2017)

SHARE Lab: Facebook research

(Contribution by Bojan Perkov, EDRi member SHARE Foundation, Serbia)



14 Jun 2017

Internet clampdown – convenient distraction from political turmoil?

By Open Rights Group

The United Kingdom general election produced an unforeseen result. The Conservative Party was expected to increase its majority in government. Instead, it failed to achieve a majority and was forced to seek an alliance with the controversial Democratic Unionist Party (DUP) in order to form a government.

Despite this, Prime Minister Theresa May has hinted that she will push ahead with plans to clamp down on the internet. The Conservative Party manifesto included proposals that would force internet companies to be more proactive in removing user-generated content from their sites. This would undoubtedly mean automated, algorithmic censorship of the internet without oversight. She is also likely to push through powers that could see companies forced to weaken the security of their products, in particular by limiting encryption.

On 13 June, the UK government issued a press release announcing a joint campaign with France to tackle online radicalisation. According to the press release, the two countries will explore the possibility of creating a new legal liability for technology companies that fail to remove unacceptable content, which could lead to tougher action against them. Prime Minister May and French President Emmanuel Macron also indicated that they will push for the establishment of an industry-led forum to develop solutions to tackle terrorist content on the internet.

It remains to be seen whether Prime Minister May will be successful, given how weak her position looks at the moment. But focusing on internet regulation could be a convenient distraction from political turmoil and impending Brexit negotiations.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

May and Macron plan joint crackdown on online terror (12.06.2017)

UK government pushes for companies to weaken encryption (31.05.2017)

UK government attacks encryption … again (05.04.2017)

UK’s mass surveillance law being rushed through legislative process (09.03.2017)

UK Draft Investigatory Powers Bill: Missed opportunity (18.11.2015)

UK: Report of the investigatory powers review (17.06.2015)

(Contribution by Pam Cowburn, EDRi member Open Rights Group, the United Kingdom)



14 Jun 2017

Commission’s waiting game: Gambling with freedom of information


In April 2017, EDRi wanted to shed light on the industry lobbying in Brussels surrounding the copyright reform. We therefore filed a freedom of information (FOI) request to access the correspondence the European Commission received from rightsholders at the time it was finalising its proposal for the new Copyright Directive.

On the very last day of the period for responding to our request, we received a request for clarification of what we were asking for. A request for clarification of a very simple demand can only be interpreted either as a sign that the Commission was unable to understand a basic request, or as an abusive effort to avoid respecting its legal obligation to respond to freedom of information requests.

We confirmed that our request to access “all correspondence” indeed covered “all correspondence”, and the Commission kindly acknowledged receipt of our clarification. However, for no obvious reason, it decided that this started a new deadline for it to respond.

On 1 June 2017, the very last day of this new, entirely unexplained extension of the period for responding to our demand, we received a response from the Commission:
“An extended time limit is needed as the documents requested originate from third parties which have been consulted. Therefore, we have to extend the time limit with 15 working days in accordance with Article 7(3) of Regulation (EC) No 1049/2001 regarding public access to documents.”

We understand and acknowledge that consulting third parties is time-consuming. However, the Commission knew from the start that the request concerned third parties.

We are impatiently awaiting the extended deadline of 23 June, anticipating the results of the third-party consultations and, perhaps, finally gaining access to the correspondence in question – unless the Commission happens to have another card up its sleeve, such as unexpected clarifications demanded by the “individuals concerned” which, the Commission will regret, require the deadline to be extended yet again.

EU Commission on FOI request: Incompetence or ill-intent? (31.05.2017)

ENDitorial: Transparency and law-making on EU copyright – mutually exclusive? (05.04.2017)

Artificial unintelligence – lobbyletter



14 Jun 2017

Access to e-evidence: Inevitable sacrifice of our right to privacy?

By Guest author

What do you do when human rights “get in the way” of tackling crime and terrorism? You smash those pillars of your democratic values – the very ones you are supposedly protecting. Give up your right to privacy; it is a fair price to pay for the guarantee of your security! This is the mantra that populist politicians have repeated over and over again during the past decades – never mind that gambling with our rights actually helps very little in that fight.

One of the bargaining chips in the debate on privacy versus security is access to e-evidence.

E-evidence refers to digital or electronic evidence, such as the contents of social media, emails and messaging services, or data held in the “cloud”. Access to this data is often required in criminal investigations. Since geographical borders are often blurred in the digital environment, investigations require cross-border cooperation between public authorities and the private sector.

Thorough police investigations are indeed of utmost importance. However, access to people’s personal data must be proportionate and necessary for the aim of the investigation, and provided for by law.

In a similar way that the police cannot enter your home without a court warrant, they are not supposed to look into your private communications without permission, right? Not really.

The EU is working towards easing access to e-evidence for law enforcement authorities. The European Commission plans to propose new rules on sharing evidence and on the possibility for authorities to request e-evidence directly from technology companies. One of the proposed options is that the police would be able to access data directly from cloud-based services.

This means that Facebook, Google, Microsoft, providers of messaging services, and other companies that collect and store the data of millions of EU citizens would be obliged to provide this data to the authorities, even when it is stored in the cloud in another EU Member State. The types of data that might fall within the scope of the law range from metadata (such as location, time, sender and recipient of a message, and other non-content data) to the content of our personal communications.

But surely there must be safeguards to protect people’s right to privacy? Not necessarily, especially when “voluntary” cooperation between companies and law enforcement is being pushed. Such arrangements often lack accountability and predictability. This is why any new measures on e-evidence must comply with international human rights and data protection standards. Member States must remain able to regulate access to data in their jurisdiction and on their citizens and residents, in particular by foreign law enforcement and national security agencies. Individuals must also be able to seek protection and redress in their own country.

Access to e-evidence is also being discussed beyond EU borders. The Council of Europe (CoE) is preparing to adopt a new protocol to the so-called Budapest Convention – the CoE Convention on Cybercrime. The Convention covers not only CoE Member States, but all 53 countries that have ratified it, which means that not all of them are bound by data protection or human rights conventions. EDRi is following this process attentively and has submitted input on several occasions.

The European Commission’s initiative establishes the framework for a new legislative proposal, which is scheduled to be presented at the beginning of 2018. On 8 June 2017, the Commission presented options for practical and legislative measures to the EU ministers. EDRi is participating in expert discussions on the suggested way forward.

It is crucial that safeguards ensuring data protection and the rule of law are built into the new legislation. Otherwise, it will come at the cost of citizens’ human rights.

RightsCon session on cross-border access to e-evidence – key interventions (10.05.2017) https://edri.org/rightscon-session-on-cross-border-access-to-e-evidence-key-interventions/

EDRi’s position paper on cross-border access to electronic evidence in the Cybercrime Convention (17.01.2017)

EDRi’s letter to the Council of Europe on the T-CY Cloud Evidence Group Report on criminal justice access to evidence in the cloud (10.11.2016)

Professor Douwe Korff’s analysis on the T-CY Cloud Evidence Group Report on criminal justice access to evidence in the cloud (10.11.2016)

European Commission: e-evidence

(Contribution by Zarja Protner, EDRi intern)