Security & surveillance

While offering vast opportunities for exercising and enhancing fundamental rights, the digital environment also creates opportunities both to commit new offences and to impose new restrictions on our online rights. Measures such as filtering, blocking and untargeted surveillance are often easy to implement and extremely difficult to rectify. EDRi therefore works to ensure that all security and surveillance measures are necessary, proportionate and based on solid evidence.

19 Apr 2017

AVMS Directive – censorship by coercive comedy confusion

By Joe McNamee

On 25 April 2017, the European Parliament Committee on Culture and Education (CULT) will vote on its report on the European Commission’s proposal to revise the Audiovisual Media Services Directive (AVMSD).

To understand just how confused the proposal is, it is worth understanding its history. In 1989, the EU adopted the “Television without Frontiers” Directive to regulate cross-border satellite TV, covering issues such as jurisdiction and protection of minors. This Directive was out of date very quickly, leading to a revision that was adopted in 1997. That, in turn, was quickly out of date and was revised in 2007. Then, in 2010, the EU adopted a fourth version, this time trying to fit video on demand (VOD) services, such as Netflix, HBO Go, Amazon Video and others, into this legislation. In 2016, the European Commission proposed yet another revision, this time trying to squeeze yet another type of service – video-sharing platforms – into regulation designed in the late eighties for satellite TV.


The current proposal, which would impose even more obligations on video-sharing platforms, is horribly contradictory and unclear. It does, however, contain a reasonable amount of comedy, which is an innovation for the EU institutions. For example, this legislation on “audiovisual” content would cover, on the basis of the Parliament’s compromise amendments, “a set of moving images”, which would include, for example, an animated GIF.

Furthermore, it does not cover all online video-sharing. For example, it does not cover video sections of news sites that are “indissociably complementary” to the site (borrowing wording from the Court of Justice of the European Union (CJEU) ruling in the New Media Online case). This means that video content featured on a news website should only be regulated under the Directive if it is not complementary to the journalistic activity of that publisher and is independent of written press articles on the site.

In a further (failed) effort to add to legal certainty, the Parliament’s draft compromise text also seeks to clarify the notion of “user-generated content” by removing from the Commission’s proposal the notion that it has to be user-generated. If the compromise text is adopted, the new definition of “user-generated” video would be “a set of moving images with or without sound constituting an individual item that is uploaded to a video-sharing platform”. This means that to be a “user-generated video”, it would not need to be user-generated nor, indeed, would it need to be a video.

On a more serious note, the proposal requires badly defined video-sharing platforms to take measures to protect children from content that would harm their “physical, mental or moral development” (“moral” having been added by the Parliament to various new parts of the Directive). This involves measures to restrict (undefined) legal content. The European Commission also proposed that the companies should enforce the law on incitement to racism and xenophobia. The Parliament’s suggestion is to extend law enforcement to areas where there is no law – such as incitement to hatred of “a person or group of persons defined by reference to nationality, sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion”. The Parliament also proposes setting up dispute resolution systems to verify decisions about whether videos should stay online after accusations that they might lead to hatred of a person due to, for example, “any other opinion”. Video-sharing platforms will also need to make sure that video uploaders “declare” whether or not their videos contain advertisements, product placement or sponsored content.

It is clear that the broad restrictions of legal and illegal content that video-sharing platforms are meant to impose will lead to significant levels of removal of legal content, particularly due to the spectacularly unclear scope of their obligations. Restrictions on freedom of communication must, under the Charter of Fundamental Rights of the European Union, be “provided for by law”, be necessary, and genuinely meet objectives of general interest. The Commission’s text failed to achieve this minimum standard, while the draft compromise amendments to be voted on 25 April by the Parliament fall very far short of it. The only possible result of the legal chaos that this will create for video-sharing platforms is the deletion of a large amount of legal content, in order to minimise their exposure to possible state sanctions or other litigation.


Television broadcasting activities: “Television without Frontiers” (TVWF) Directive – Summaries of EU legislation
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=URISERV%3Al24101

Audiovisual Media Services Directive (2010/13/EU)
http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2010:095:0001:0024:en:PDF

Revision of the Audiovisual Media Services Directive (AVMSD), 2016 proposal
https://ec.europa.eu/digital-single-market/en/revision-audiovisual-media-services-directive-avmsd

(Contribution by Joe McNamee, EDRi)

19 Apr 2017

Dangerous myths peddled about data subject access rights

By Guest author

Now that the date on which the General Data Protection Regulation (GDPR) becomes enforceable is rapidly approaching, the European Data Protection Authorities (DPAs) are in the process of clarifying what their shared positions will be on various topics, including profiling. This is done through stakeholder consultation meetings.


During the latest meeting, one of the more contentious issues surrounding profiling turned out to be the transparency requirements regarding the algorithms used for automated decision making and profiling. While industry representatives in general provided constructive input on the various topics, this issue was more challenging. Several industry representatives were pushing for a very narrow interpretation of the right to access regarding the logic in automated decision making.

The basic argument is that industry has a right to hide the precise details of the calculations used to make decisions that discriminate against individuals. Three points were made in support of the claim that the right to information regarding the logic of processing should not extend to disclosing the actual algorithms used:

  1. they would be protected trade secrets;
  2. intellectual property rights would preclude such disclosure;
  3. it would create a moral hazard in case of applications of profiling in fraud prevention.

Regarding the protection of trade secrets, the situation is fairly simple. The Trade Secrets Directive (2016/943/EU), for all its flaws, specifically states in its recitals that it shall not affect, among other rights, the right of access for data subjects. Since this Directive has to be implemented by June 2018, there is only a window of a few weeks in which trade secret protections in some member states could, theoretically, prejudice data subject access to the logic used in automated decision making. So for all practical intents and purposes, trade secret legislation cannot be invoked to prevent disclosure of such underlying algorithms.
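For reference, the window in question is the gap between 25 May 2018, when the GDPR becomes enforceable, and 9 June 2018, the Trade Secrets Directive’s transposition deadline – roughly two weeks, as a quick check confirms:

    from datetime import date

    gdpr_applies = date(2018, 5, 25)  # GDPR becomes enforceable
    tsd_deadline = date(2018, 6, 9)   # Trade Secrets Directive transposition deadline

    # The only period in which un-harmonised national trade secret law
    # could, in theory, collide with GDPR access rights.
    print((tsd_deadline - gdpr_applies).days)  # 15 days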

As far as intellectual property rights are concerned, this is even more of a non-issue. The only so-called intellectual property rights of relevance here are copyright law and patent law.

Software copyright does not cover underlying algorithms, a view reiterated in the ruling in the SAS Institute Inc. v World Programming Ltd case (C‑406/10), in which the Court of Justice of the European Union (CJEU) held that the functionality of a computer program is not protected by copyright under the Computer Programs Directive (91/250/EEC).

As far as patent law is concerned, the European Patent Convention states that “schemes, rules and methods for performing mental acts, playing games or doing business, and programs for computers” shall not be regarded as patentable inventions (Article 52(2)(c)). It would be difficult to argue that the logic for automated decision making in the profiling of personal data is not a method for doing business. Moreover, patent protection requires disclosure of the underlying technology, which makes it even harder to argue that patent law could preclude disclosure of the logic used in automated decision making. Given that none of the other intellectual property rights even come close to covering the logic of algorithms, it follows that there are no barriers in intellectual property law to disclosure of the logic used for automated decision making.

Even if there were intellectual property rights covering the underlying logic of software algorithms, it would not necessarily follow that these should override data protection legislation. In cases where it had to balance competition law against intellectual property, the CJEU has repeatedly held that competition interests can outweigh intellectual property interests.

The last argument, that of a moral hazard, may or may not come into play in the context of fraud detection and insurance risk assessment. First, the European legislator has never made any exception for it in the GDPR; secondly, it can be addressed by disclosing the logic as applied to a specific data subject instead of the general logic as applied to all data subjects affected.
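To illustrate the distinction, here is a minimal sketch in Python, using a deliberately simplified, entirely hypothetical linear scoring model (real fraud-detection systems are far more complex). Disclosing the general logic would mean publishing the full model; disclosing the logic as applied to one data subject means showing only how that person’s own data produced their score:

    # Hypothetical weights of a toy fraud-scoring model (illustration only).
    WEIGHTS = {"late_payments": 2.5, "address_changes": 1.0, "account_age_years": -0.5}
    THRESHOLD = 4.0

    def explain_for_subject(subject):
        """Per-subject disclosure: each feature's contribution to this
        subject's own score, rather than the model as applied to everyone."""
        contributions = {f: w * subject.get(f, 0) for f, w in WEIGHTS.items()}
        score = sum(contributions.values())
        return {"contributions": contributions,
                "score": score,
                "flagged": score > THRESHOLD}

    alice = {"late_payments": 2, "address_changes": 1, "account_age_years": 3}
    print(explain_for_subject(alice))
    # score 4.5 -> flagged; 'late_payments' alone contributed 5.0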

The logical conclusion for DPAs enforcing the GDPR in the future is to treat the aforementioned arguments from parts of industry with a great deal of scepticism. They simply have no basis in EU law or in reality.

Rejections of data subject access requests to the underlying logic of automated decision making based on “trade secrets” or “intellectual property rights” should be treated by DPAs as violations of the GDPR and addressed accordingly.


The Trade Secrets Directive (2016/943/EU)
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32016L0943

Ruling of the SAS Institute Inc. v World Programming Ltd case
http://curia.europa.eu/juris/document/document.jsf?text=&docid=122362&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=154228

European Patent Convention
http://www.epo.org/law-practice/legal-texts/html/epc/2016/e/index.html

Insurance: How a simple query could cost you a premium penalty (30.09.2013)
https://www.theguardian.com/money/2013/sep/30/insurance-query-higher-premiums

(Contribution by Walter van Holst, EDRi member Vrijschrift, the Netherlands)

19 Apr 2017

Data mining for profit and election results – how predictable are we?

By Guest author

Did Donald Trump become president because he hired the data mining firm Cambridge Analytica, which uses profiling and micro-targeting in political elections? Some say yes, many say no. But what we know is that we are subjected to extensive personalised commercial and political messaging on the basis of data, including metadata, collected and used without our awareness and consent. It can result in changes in our behaviour, at least to some extent.


As much as we would like to think we are able to make decisions that are impossible to predict, we are creatures of habit, with our routines and patterns, submerged in the filter bubbles of the like-minded. In short, it is fairly easy to learn about our activities, preferences, habits and relationships just by getting a glimpse of our digital footprint, be it our browsing history, our social network, or our location data. This comes in very handy when marketing companies, data brokers or campaign strategists try to understand and predict our shopping preferences or our vote in the next elections. Our data is turned into profit and power.
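As a deliberately simplified illustration – the site names and categories below are invented – a few lines of Python are enough to turn a raw browsing history into an interest profile of exactly this kind:

    from collections import Counter

    # Hypothetical mapping from visited sites to interest categories.
    CATEGORIES = {
        "babynames.example": "parenting",
        "mortgage-calc.example": "home buying",
        "party-news.example": "politics",
    }

    def interest_profile(browsing_history):
        # New personal information, derived purely from behavioural data.
        return Counter(CATEGORIES[site]
                       for site in browsing_history
                       if site in CATEGORIES)

    history = ["babynames.example", "mortgage-calc.example", "babynames.example"]
    print(interest_profile(history).most_common())
    # [('parenting', 2), ('home buying', 1)] -> ripe for targeted ads

Real profiling systems are statistical rather than rule-based, but the principle is the same: existing data in, new inferences out.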

Data mining refers to seeking useful insights from collected data. In other words, it’s a method to examine existing large data sets to generate new information. Profiling on the basis of data mining is problematic from at least two perspectives.

The first is that our public data are used to infer additional personal and intimate information about us – information we are not keen to disclose to the world, or whose existence we are not even aware of. It is, for example, relatively easy to infer a person’s sexual orientation, even if they do not disclose this information publicly.

The second point is that one can never get a full picture of a person on the basis of these data, so the predictions can never be fully accurate – from a business perspective, they only need to be accurate enough to be profitable. This might seem rather harmless; you might only be slightly upset when you are targeted with advertisements for a romantic getaway just after breaking up with your partner. But inaccurate profiling can easily turn into a nightmare. Even when relying heavily on algorithms, the processes of data mining are still subject to human subjectivity, and there is potential for all sorts of biases reflecting human prejudice. Decisions based on prejudicial profiles can have real-life consequences in areas such as welfare, employment, credit and even education.

It may not be possible to single-handedly manipulate the outcome of an election through data mining. However, the practice raises serious concerns for privacy and democracy. Profiling on the basis of information about our interests, personalities, activities and affiliations can serve political and commercial marketing, which aims to change people’s behaviour by exposing their vulnerable spots to manipulation. This is why it is crucial to address profiling in legislation.

The regulation intended to strengthen data protection within the EU, the General Data Protection Regulation (GDPR), was adopted in April 2016 and applies from May 2018. It gives citizens more rights to information and to object, and contains more explicit requirements for consent than existing legislation. A proposal for an e-Privacy Regulation (ePR) to complement the GDPR was published in January 2017. It seeks to add clarity and legal certainty for individuals and businesses by providing specific rules related to our freedoms in the online environment.

Data protection is about privacy, security, autonomy and, ultimately about how our society functions.


Cambridge Analytica Explained: Data and Elections (13 April 2017)
https://medium.com/@privacyint/cambridge-analytica-explained-data-and-elections-6d4e06549491

Everything you need to know about the Data Protection Regulation
https://protectmydata.eu/

New e-Privacy rules need improvements to help build trust (9 March 2017)
https://edri.org/new-e-privacy-rules-need-improvements-help-build-trust/

(Contribution by Zarja Protner, EDRi intern)

19 Apr 2017

German Social Media law – sharp criticism from leading legal expert

By Joe McNamee

Professor Wolfgang Schulz, one of Europe’s preeminent legal experts, has prepared a short critique of Germany’s so-called “Act improving Law Enforcement on Social Networks”, also known under the abbreviation NetzDG.

Professor Schulz criticises the fact that the draft law covers a range of different types of offences, making it difficult to assess its necessity as a means of restricting freedom of speech. More damningly, he points to the key assumptions on which the law is based, arguing that they have been abandoned “for a long time”. Furthermore, he argues that “there are many effective ways of addressing fake news or hateful speech” that should be [implicitly, were not] taken into account to minimise potential negative effects on freedom of speech.


EDRi’s suggested amendment to recital 31 of the Audio-Visual Media Services Directive, adopted by the European Parliament’s Civil Liberties Committee, raises concerns about the “balance of incentives” for internet companies. In line with the amendment, Professor Schulz points to the negative consequences of the German law for the “incentive structure” for social media companies. He argues that the draft law “strengthens this incentive structure further at the expense of freedom of speech”. In short, the incentives to remove information are increased, while incentives to leave information online have been reduced.

Professor Schulz also points out that the aim is not the criminal prosecution of offenders, but that the law “rather creates duties and provisions for administrative offences for platform providers”. The solitary provision that does focus on offenders is too broad and should be “restricted to particularly grave infringements upon rights only”. He also raises very clear arguments regarding the constitutionality of the proposal. His argument is that, as the focus is on regulation of content and not criminal prosecution, there is no specific federal power provided for in Germany’s “basic law” (Grundgesetz), which devolves content regulation to the regional governments (Länder).

The analysis of scope is particularly depressing. The term “provider” has already created problems in the German Telemedia Act, the term “user” is unclear regarding the applicability of the act, and business networks have been excluded in a way that risks “favouring domestic companies”. While freedom of speech would normally have to be considered when dealing with content of this nature, “the draft does not allow for that”.

The document also points to the weak approach to reporting in the draft. In particular, the obligation to report on take-down performance “creates even more incentives for the provider to perform a take-down on request without checking to avoid any self-blaming and -shaming in the report”.

With the lone exception of child abuse material (“child pornography”), removal of content requires a context-sensitive assessment. If a law is likely to have the effect of removing legal content, this is a restriction on freedom of speech, as laid down in the basic law, the European Convention on Human Rights and elsewhere. The provision requiring removal of “obviously illegal” content within 24 hours of the request creates an environment where it is safer for the provider to remove content quickly if there is any doubt. The seven-day deadline for non-obviously illegal content raises similar concerns. Finally, the rather unclear rules on re-uploads, which remove any possibility of assessing context, mean that automatic takedowns will be the result.

Professor Schulz points to references in the draft law to the E-Commerce Directive to show that the Charter of Fundamental Rights of the European Union is applicable: if the German law refers to an EU Directive, it needs to comply with the Charter. He also points out that the proposed measures appear to be in breach of Article 3(4) of the E-Commerce Directive.

Finally, the document details the unpredictable nature of the fines that could be imposed by the Federal Office of Justice, which has a broad margin of appreciation as to whether a fine should be imposed or not. If the Office fails to act, for politically motivated reasons for instance, there is limited court oversight. Even when courts are involved, the person whose content is concerned is not a party to the court proceedings.

Comments on the Draft for an Act improving Law Enforcement on Social Networks (NetzDG)
http://www.hans-bredow-institut.de/webfm_send/1178

Reckless social media law threatens freedom of expression in Germany (05.04.2017)
https://edri.org/reckless-social-media-law-threatens-freedom-expression-germany/

LIBE Opinion on amendments to Audio-Visual Media Services Directive
http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-%2F%2FEP%2F%2FNONSGML%2BCOMPARL%2BPE-593.952%2B03%2BDOC%2BPDF%2BV0%2F%2FEN 

(Contribution by Joe McNamee, EDRi)

19 Apr 2017

Challenges for “Legal Frameworks for Hacking by Law Enforcement”

By Guest author

A study entitled “Legal Frameworks for Hacking by Law Enforcement: Identification, Evaluation and Comparison of Practices” was published by the European Parliament’s Policy Department for Citizens’ Rights and Constitutional Affairs at the request of the Committee on Civil Liberties, Justice and Home Affairs (LIBE). It presents policy proposals on the use of hacking techniques by law enforcement authorities. Based on the maturity of the legal frameworks, public debate and practices, the proposals rely on a thorough comparative examination of the legal frameworks for hacking by Law Enforcement Agencies (LEA) across six EU Member States (France, Germany, Italy, the Netherlands, Poland and the UK) and three non-EU countries (Australia, Israel and the US). Even though the primary rationale behind the study is the international and EU-level debate on the issue of “going dark” (i.e. the decreasing ability of LEA to access and examine evidence due to encryption), it builds its proposals on the failures of alternatives such as backdoors and zero-day exploits.

The study examines the legal and practical balances and safeguards implemented at national level to ensure the legality, necessity and proportionality of restrictions to the fundamental right to privacy, the security of the internet, and to a lesser extent, the regulation of the sale of hacking tools. Based on these factors, the study highlights several key risk factors imposed by the use of hacking techniques by law enforcement:

  • Hacking techniques are extremely invasive, particularly when compared with traditional intrusive investigative tools (such as wiretapping and house searches), and this imposes a very high degree of risk to the fundamental right to privacy and freedom of expression and information without appropriate policies in place.
  • Use of hacking techniques has the potential to significantly weaken the security of the internet by “increasing the attack surface for malicious abuse”, with possible damage far beyond the intended target.
  • Given the global nature of the internet, LEA (and service providers) may not know the physical location of the target data – this has given rise to the concept of “loss of knowledge of location”. In many such cases, the LEA may remotely access data located in the jurisdiction of another country, which poses serious risks to territorial sovereignty. Most of the time, LEA breach jurisdictional boundaries unknowingly, due to the confusing nature of the internet infrastructure and the lack of concrete procedures for mutual legal assistance in cross-border investigations.
  • In the recent past, many civil society organisations (including EDRi members) have questioned the current dual-use export control regimes.

The study further compares the legal frameworks and their context by evaluating the technical means of hacking and the fundamental rights considerations, in order to derive both the benefits and the risks of the use of hacking techniques by law enforcement. It finds that all the EU Member States examined supplement the common types of ex-ante and ex-post conditions with different, less common, conditions. Some of the key ex-ante conditions include:

  • judicial authorisation for law enforcement hacking;
  • restrictions on the use of hacking tools based on the gravity of the crime – limited either by a list of crimes for which hacking is permitted, or by a maximum custodial sentence greater than a certain number of years – along with restrictions on the duration for which hacking may be used.

Some of the key ex-post considerations include:

  • provision for the notification of targets of hacking practices and remedy in cases of unlawful hacking; and
  • reporting, through the logging of hacking activities, for review and oversight.

The study highlights some of the criticisms in each country’s legal provisions for hacking, for example, the lack of knowledge amongst the judiciary in France, Germany, Italy and the Netherlands; unclear definition of devices that can be targeted in the Netherlands; and the inefficient process for screening and deleting non-relevant data (in Germany). It also underlines some of the good aspects of the provisions, such as the 2017 Italian draft law’s efforts to protect against the overuse or abuse of a hacking tool’s extensive capabilities by separating the functionalities of the tools, and Dutch Computer Crime III Bill’s mandates on the need to conduct a formal proportionality assessment for each hacking request, with strict rules on the authorisation and expertise of the investigation officers that can perform hacking.

Based on the above analysis, the study derives twelve actionable policy proposals and recommendations. The proposals state that the European Parliament should pass a resolution calling on the Member States to conduct a Privacy Impact Assessment when new laws are proposed to permit and govern the use of hacking techniques by LEA, with a clear and precise legal basis. The Parliament should support efforts to evaluate and monitor lawful hacking activities, support efforts to develop appropriate responses to handling zero-day vulnerabilities, and reaffirm its commitment to strong encryption, considering both the fundamental rights of EU citizens and internet security. Furthermore, the policy proposals point to the EU Agency for Fundamental Rights (FRA) research on fundamental rights protection in the context of surveillance, produced in response to the Snowden revelations, and recommend producing a similar brief on the legal frameworks governing the use of hacking techniques by LEA across all the EU Member States. The study also strongly proposes that FRA, CEPOL and Eurojust collaborate to provide training to all stakeholders who would potentially be involved in hacking activities.


Legal Frameworks for Hacking by Law Enforcement: Identification, Evaluation and Comparison of Practices
http://www.europarl.europa.eu/RegData/etudes/STUD/2017/583137/IPOL_STU(2017)583137_EN.pdf

Rights groups demand action on export controls (06.03.2017)
https://edri.org/rights-groups-demand-action-export-controls/

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)

19 Apr 2017

Member Spotlight: SHARE Foundation

By Guest author

This is the fifth article in the series “EDRi member in the Spotlight”, in which our members have the opportunity to introduce themselves and their work in depth.

Today we introduce our Serbian member SHARE Foundation.


1. Who are you and what is your organisation’s goal and mission?

We are SHARE Foundation from Serbia, a non-profit organisation dedicated to protecting human rights and freedoms in the digital environment and promoting positive values of openness, decentralisation, free access to information, technology and knowledge.

2. How did it all begin, and how did your organisation develop its work?

After a series of huge SHARE conferences held in Belgrade and Beirut, each with more than 1000 participants, we decided that our activism and advocacy efforts needed to be established on an organisational level. In 2012, SHARE Foundation was born.

3. The biggest opportunity created by advancements in information and communication technology is…

Access to knowledge, free flow of information, and of course, cat memes!


4. The biggest threat created by advancements in information and communication technology is…

Mass surveillance of people’s lives and their monetisation by mega-corporations which use non-transparent algorithms.

5. Which are the biggest victories/successes/achievements of your organisation?

In late 2014, the Government of Serbia proposed amendments to the Law on Games of Chance, which would have introduced blocking and filtering of the internet in Serbia. After a joint effort by civil society, led by SHARE Foundation, to oppose such measures, the proposed provisions were dropped from the legislative procedure.

Over the past years, SHARE Foundation has organised more than 20 events, published more than ten info-guides and numerous SHARE Lab investigative stories. We have also provided free legal and technical assistance to many online news portals and civil society media platforms.

In January 2017, SHARE Foundation received a certificate of gratitude.

6. If your organisation could now change one thing in your country, what would that be?

We would reduce the number of political trolls and bots on the internet – which is currently too damn high!

7. What is the biggest challenge your organisation is currently facing in your country?

The outdated Law on Protection of Personal Data. The process of reforming the law has been going on for almost a year and a half now, and we still have no information on when a new version of the draft law will be published.

8. How can one get in touch with you if they want to help as a volunteer, or donate to support your work?

You can reach us via Twitter and Facebook or visit our website shareconference.net, where you can find more info about our work. If you prefer email, you can contact us at info(at)sharedefense(dot)org.

SHARE Foundation
http://www.shareconference.net/en

SHARE Lab investigative stories
https://labs.rs/en/

SHARE Twitter
https://twitter.com/ShareConference

(Contribution by EDRi member SHARE Foundation, Serbia)

10 Apr 2017

RightsCon session on cross-border access to e-evidence – key interventions

By Joe McNamee

European Digital Rights organised a session at the RightsCon conference in Brussels on 31 March 2017, in order to build awareness among stakeholders about the multiple international developments on law enforcement access to electronic evidence.

The bulk of the discussions focussed on a possible new protocol to the Cybercrime (Budapest) Convention of the Council of Europe (CoE). The CoE initiative is far broader than the Council of Europe area, covering all 53 countries that have ratified the Cybercrime Convention (including the USA, Australia, Canada and others). It is not necessary for countries to have ratified the data protection or human rights Conventions of the CoE before ratifying the Cybercrime Convention. Some of the issues surrounding the instrument are assessed in an “issue paper” by Professor Douwe Korff on the “Rule of Law on the Internet and in the Wider Digital World” prepared for the Council of Europe Human Rights Commissioner (with input from EDRi).

To ensure accuracy and balance, speakers were given the opportunity to edit the draft summary of their own interventions. The text below is, therefore, not a perfect record of what was actually said.

The key speakers were Alexander Seger, head of the Cybercrime Division of the Council of Europe, Lani Cossette from Microsoft, Owen Bennett from the European Internet Services Providers Association (EuroISPA) and Javier Ruiz from EDRi member Open Rights Group (ORG).

Alexander Seger opened by arguing that cross-border is a rather fictional concept on the internet. He said that the Cybercrime Convention is a criminal justice treaty, so data access is about access to specific data in specific criminal investigations, and not about bulk data collection or national security measures. All measures fall under criminal law, which is where countries have the strongest safeguards. In addition, the starting point in any such discussions is that, as detailed in the European Convention on Human Rights (ECHR) [note: the Convention is open to countries that are not a party to the ECHR], there is a positive obligation on states to protect citizens from crime.

He pointed out that, of the cybercrimes that are reported, the proportion of cases that actually lead to a court judgement might be 0,1% or even less. This raises questions as to whether this positive obligation is being met. He mentioned that, in other meetings, he asked participants to think of three types of crime where evidence would not, to some extent, be on a computer system, but few examples could be found. In reality, there is almost always some evidence on a computer system. However, access to such evidence is extremely complicated.

The Cybercrime Committee of the Council of Europe has identified areas where action is possible, in particular regarding how to deal with evidence in the “cloud” (i.e. on servers and equipment in foreign, multiple, shifting or unknown jurisdictions). The Committee looked at the issues for two and a half years and produced five recommendations.

Four of the five recommendations have been followed, and work on the fifth will be subject to a decision in June 2017. The recommendations are the following:

  1. Mutual Legal Assistance Treaty (MLAT) arrangements to be made more efficient. MLA remains the most important tool to obtain evidence from foreign jurisdictions, and this is not an effort to get around existing procedures. Examples include finding a “light” system for getting access to basic “subscriber information” or dealing with emergency situations. The United States legal system has such options, but many national legal systems do not. For example, MLATs in South America often go through foreign ministries rather than through criminal justice systems, which is complicated and causes delays.
  2. Guidance Note on Article 18 (production orders for subscriber information) to be produced. This is about subscriber information, not content or traffic data, and is the type of data that is needed most often. Big US companies receive thousands of requests per year directly from law enforcement authorities (LEAs) abroad. Article 18.1.a is about production orders in a given jurisdiction. The Guidance Note says that it does not matter where the data is stored; the decisive question is who is in possession or control. This is the same as when banks repatriate data to deal with a fraud in a country. Article 18.1.b covers situations where a service provider based abroad is providing a service to users in another country. However, there is a lack of clarity regarding how to serve production orders on companies in such situations, and no enforcement mechanism if the provider fails to provide the requested data.
  3. Governments to fully implement Article 18. Procedures on this point need to be precise rather than the broad powers that are often used at the moment. Such rules need to be clearly defined in national law to meet rule of law requirements.
  4. Development of practical measures for cooperation with providers, and the making available of an online tool to facilitate procedures. This will allow providers to understand, and respect, the domestic law of the country making the request. It will also give requesting authorities a better understanding of companies’ procedures.
  5. Decision to be made on the possible drafting of an additional protocol on access to evidence in the cloud in June 2017. The draft plan is to take about 2.5 years to reach agreement on a draft protocol.

Current plans focus on four key topics:

  1. Additional possibilities for mutual legal assistance, including emergency and light procedures.
  2. Transborder access to data. Some countries may already get access to data, but such access often rests on a shaky legal basis. Access under the Budapest Convention is currently possible only under very narrow conditions, as confirmed in the Guidance Note on Article 32. Are there additional options – if the person is in the jurisdiction and the crime is in the jurisdiction, what protections are needed? Work on this issue was previously suspended, as moving forward proved difficult in the aftermath of the Snowden revelations. The aim is to avoid a situation where states unilaterally develop their own solutions, thereby creating a jungle.
  3. Direct cooperation with providers in other jurisdictions. Can we do more in that environment?
  4. Data protection and rule of law safeguards. The more innovation that is proposed, the more safeguards will be needed.

Data protection organisations, civil society and industry will be consulted in the process.

For context, six major providers in the US directly received 138 000 requests from Parties to the Budapest Convention other than the US.

Lani Cossette, representing Microsoft, stressed that the company complies with legal obligations and does not volunteer access to data; Microsoft prefers obligation to cooperation. She said that there is confusion at the moment as regards which law applies in which country.

The current legal framework was written when “the cloud”, as we know it, did not exist. Things were fairly simple in the 1980s: emails were stored on local servers. Later, data centres were built in the US. More recently, more and more data centres are being built in Europe, opening up new questions regarding whose law applies in various scenarios. Jurisdiction is traditionally rooted in territoriality, so jurisdiction over digital evidence has been challenging to sort out, because data does not always sit in one territory. Microsoft has data centres in Ireland, which serve users and customers in Europe, which means that there are legal conflicts even within Europe, not just between the US and Europe.

Microsoft has participated in the consultation of the Commission task force on e-evidence. The Commission was asked:

  • could procedures be improved;
  • could MLA procedures be improved to unburden the system; and
  • is legislation needed with regard to enforcement jurisdiction?

The Commission’s current work cycle on this issue started in July 2016, with stakeholder meetings. The consultation process includes civil society.

In June 2016, the Council of the EU (the institution representing EU Member State governments in the EU decision-making process) produced a document on “improving criminal justice in cyberspace”, which sets out the broad policy direction to be followed.

This led the European Commission to produce a report in December 2016, which sets out a problem definition and details different options for jurisdiction, requests for data, and other issues.

Commissioner Jourova (responsible for justice, consumers and gender equality) has indicated that she expects the Commission to present three or four options for moving forward with the file at the next Justice and Home Affairs (JHA) Council meeting on 8 June, so the timetable is very compact. However, a full legislative proposal is not expected at that stage.

Owen Bennett from the European Internet Services Providers Association (EuroISPA) said that systems are designed around the needs and capabilities of bigger companies. However, more cross-border access issues are arising for small companies due, for example, to lower roaming prices and more cross-border services. There is a huge increase in cross-border requests, which can create significant burdens.

EuroISPA stresses three key principles:

  1. At a high level, it is important that smaller services should only ever be expected to cooperate with local law enforcement. It is important to have clear rules that build on existing good cooperation. There are increasing numbers of demands received in foreign languages from foreign jurisdictions, with little clarity on legal obligations. Sometimes there is a legal obligation not to respond.
  2. Direct access is very worrying. There are also issues regarding the financing of procedures and the financial burden of the legal assessment of requests for data.
  3. Mutual legal assistance arrangements should remain the core of any new framework in this policy area.

Javier Ruiz then gave a summary of Open Rights Group’s (ORG) views of the negotiations between the United Kingdom and the US on access to data.

ORG met senior staff from the UK government during spring 2016 to discuss the proposed UK-US treaty. The discussion was based on US documents as the UK has not to date produced any paper trail.

The first thing to clarify is that, despite MLAT being portrayed as the problem this initiative has to solve, the proposed treaty is not about MLAT, but about law enforcement accessing communications at an early stage of investigations, rather than putting the pieces together after a crime has been committed. MLAT would still need to be fixed.

Also, the proposed treaty would cover only interception of communications, which, in the UK at least, is not supposed to include requests for metadata. The UK police already ask US companies for metadata, and it is legally possible for companies to disclose this information under US law; this existing practice does not cover content.

On the basis of the available information, ORG raised concerns that the proposed system was extremely weak with regard to safeguards or processes to smooth the interoperability of the UK and US jurisdictions. ORG saw it as throwing the systems against each other and hoping for the best. There have been many complaints from US civil society about UK processes not being up to the same standard, which the UK government strenuously denies. Issues raised include independent authorisation, the inadmissibility of intercept evidence in court, the lack of equivalence to US restrictions on live wiretapping, etc.

ORG complained to UK government officials that the system appears to be designed from the point of view of UK access to US data, with little thought invested in the reverse process. The response was that in real life such requests would never happen. ORG also raised concerns about the lack of accountability mechanisms, which rely purely on existing reporting. Since then, the Home Office (the UK ministry of the interior) has stated that it will be strengthening the processes, with a single point of contact outside the country. This is not enough, as US companies would have to deal directly with the British system, and vice versa. At the very least, some common processes would be needed, and the final administrative step of the warrant should be undertaken domestically, so that appeals and complaints could be handled in the same country.

06 Apr 2017

The European Parliament adopts another resolution critical of the Privacy Shield

By EDRi

On 6 April 2017, the European Parliament (EP) voted on a motion for a resolution on the adequacy of the protection afforded by the EU-US Privacy Shield. The scheme gives the United States a unique arrangement for the transfer of personal data from the European Union to the United States. The Privacy Shield replaced the Safe Harbour decision, which used to serve the same purpose until the Court of Justice of the European Union (CJEU) invalidated it in the Schrems case in 2015.

The EU-US Privacy Shield has been showered with criticism from the moment the details of the new(ish) rules were published. However, the European Commission (EC) proposed and adopted it anyway.

The Article 29 Data Protection Working Party of national data protection authorities and the European Data Protection Supervisor (EDPS) issued opinions expressing numerous concerns regarding the level of protection offered by the Privacy Shield and its compliance with the right to the protection of personal data and the right to privacy. Moreover, the EP adopted a similar resolution in May 2016, while the draft Privacy Shield decision was being discussed, but its recommendations seem to have been ignored.

Today, the EP has adopted a new resolution which regards many of the Privacy Shield’s provisions as inadequate. The resolution lists several problems in the agreement and calls on the Commission to thoroughly examine them in its first annual review in September 2017.

Among the issues listed in the resolution, the EP points to the lack of specific rules on automated decisions and of a general right to object, the need for stricter guarantees on the independence and powers of the Ombudsperson mechanism, the current non-quorate status of the Privacy and Civil Liberties Oversight Board, and the lack of concrete assurances that US agencies have established safeguards against the mass and indiscriminate collection of personal data (bulk collection). Another flaw mentioned in the Parliament’s criticism is the fact that the Privacy Shield is based on voluntary self-certification and therefore applies only to US organisations that have voluntarily signed up to it, which means that many companies are not covered by the scheme.

Furthermore, the resolution asks the Commission to seek (long overdue) clarification on the legal status of the “written assurances” provided by the US, and to make sure the commitments made under the new decision will be kept by the new US administration. The resolution also calls on the European data protection authorities (DPAs) to monitor the functioning of the Privacy Shield and to exercise their powers to suspend or ban data transfers “if they consider that the fundamental rights to privacy and the protection of personal data of the Union’s data subjects are not ensured”.

Unsurprisingly, the Parliament also notes “with concern” the dismantling of the privacy rules of the US Federal Communications Commission (FCC). Last but not least, the EP calls on the Commission to take all the necessary measures for the Privacy Shield to comply with the General Data Protection Regulation (GDPR) and with the Charter of Fundamental Rights of the European Union.

The Privacy Shield has already been brought before the CJEU by two advocacy groups: EDRi member Digital Rights Ireland (case number T-670/16) and EDRi observer La Quadrature du Net (case number T-738/16). If the CJEU applies the same reasoning as for the former Safe Harbour agreement, the Privacy Shield will need a replacement very soon. It is to be hoped that the EC is preparing a contingency plan to resolve this situation as soon as possible, and will not wait (again, as it did with Safe Harbour and the two Data Retention rulings) until it is forced to act by the Court of Justice. If the Commission does this, then maybe, finally, fundamental rights can be protected on both sides of the Atlantic, and both citizens and businesses can enjoy the benefits of increased trust in the online environment.

Civil society letter: Without reforms in US surveillance laws, the Privacy Shield must be suspended (02.03.2017)
https://edri.org/civil-society-letter-without-reforms-us-surveillence-laws-privacy-shield-must-suspended/

Privacy Shield: Privacy Sham (12.07.2016)
https://edri.org/privacy-shield-privacy-sham/

European Parliament confirms that “Privacy Shield” is inadequate (26.05.2016)
https://edri.org/european-parliament-confirms-privacy-shield-inadequate/

05 Apr 2017

Social media companies launch upload filter to combat “terrorism and extremism”

By Guest author

A database set up jointly by Facebook, Microsoft, Twitter and YouTube aims to identify “terrorist and radicalising” content automatically and to remove it from these platforms.

The prototype of a mechanism to prevent the publication of violent terrorist content on platforms such as Facebook and Twitter commenced operations last week. This was announced by the European Commissioner for Migration, Home Affairs and Citizenship, Dimitris Avramopoulos, who met representatives of Facebook, Twitter and YouTube on 10 March in order to discuss the progress made so far with regard to the “removal of terrorist content online”.


It appears that no research whatsoever has been done on the likely impact of this initiative, including no review mechanisms on its impact and no way of establishing whether the initiative has counter-productive effects.

This prototype is a database operated jointly by Facebook, YouTube, Twitter and Microsoft that gathers “digital fingerprints” (hashes) of content marked as “terrorist” or “extremist”. Once designated as such, photos or videos can, in theory, no longer be uploaded to these platforms. The upload filters are intended to ensure that undesirable content is identified and removed more swiftly. The role of judicial and law enforcement authorities in this process has, unsurprisingly, not been mentioned.
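No technical details of the database have been made public. As a rough sketch of the principle – assuming, for simplicity, exact cryptographic hashes, whereas the companies reportedly use perceptual fingerprints that also match re-encoded or slightly altered copies – such an upload filter might work as follows:

    import hashlib

    # Shared database of fingerprints of content already flagged as
    # "terrorist" or "extremist" by one of the participating platforms.
    flagged_hashes = set()

    def fingerprint(data):
        # Exact SHA-256 hash for simplicity; a perceptual hash would
        # also catch modified copies of the same image or video.
        return hashlib.sha256(data).hexdigest()

    def flag(data):
        flagged_hashes.add(fingerprint(data))

    def allow_upload(data):
        # The filter blocks a file before publication if its
        # fingerprint is already in the shared database.
        return fingerprint(data) not in flagged_hashes

    flag(b"video bytes flagged on platform A")
    print(allow_upload(b"video bytes flagged on platform A"))  # False: blocked
    print(allow_upload(b"some other video bytes"))             # True: published

Note that nothing in such a pipeline requires a court or any other external review: content flagged by one platform can be blocked across all of them.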

The participating companies are part of what is known as the EU Internet Forum. With this initiative, the European Commission intends to encourage internet companies to, among other things, monitor content on their platforms more intensively, outside of an accountable, law-based environment.

Alongside the removal of content online, the EU Internet Forum discusses further measures in the area of cyber security and the production of electronic evidence. The ministers of the interior of the EU member states are calling for greater numbers of direct inquiries to be submitted to companies in the future, thereby circumventing the often laborious route that is international judicial assistance.

This primarily applies to the operators of cloud servers in the US. The Commission is currently assessing whether US companies could fall under the remit of the European Investigation Order. This directive could be extended to include operators that, while headquartered in a third country, offer their services in the European Union.

Since the establishment of the EU Internet Forum in December 2015, access by investigative authorities to encrypted telecommunication has been on the agenda. According to the German Federal Ministry of the Interior, the European Commission had initially kept a low profile in this area. According to a Commission press release, the issue of encryption was, however, discussed at the last meeting of the Forum.

The EU’s Counter-terrorism Coordinator, Gilles de Kerchove, who has called for assistance with decryption by companies in a number of papers over the last two years, was also in attendance. His post was established in order to present new priority areas for action with respect to fighting terrorism and extremism on a biannual basis.

Likewise, under the umbrella of the EU Internet Forum, the Commission is currently launching an EU Civil Society Empowerment Programme (CSEP). This is overseen by the European Commission’s Radicalisation Awareness Network (RAN), which became fully operational as a “Centre of Excellence” in 2016.

In previous press releases, the Commission announced that the programme would receive financial support to the tune of ten million euros. It is intended to help “civil society, grassroots groups and credible voices” to fill the internet with “alternative narratives”. A particular focus is on “capacity and/or resources” for disseminating messages to achieve this end. The aim here is for participants to develop campaigns in cooperation with internet companies.

Little information is available regarding the EU Civil Society Empowerment Programme. On the Commission’s website, it appears that an opening event was to take place at the beginning of March 2017, attended by internet companies, “marketing experts” and “civil society”. Following this event, campaigns were launched, but no details have been disclosed. In 2016, it was announced that Twitter could accord “counter-narratives” greater visibility without charging the usual fee for this service.


Update: Statewatch published the 2017-2019 Joint Activity Plan (JAP) for the Civil Society Empowerment Programme at http://www.statewatch.org/news/2017/mar/eu-com-radicalisation-civil-society-empowerment-programme-work-plan-2017-19.pdf.

The article was originally published at https://digit.site36.net/2017/03/17/social-media-companies-launch-upload-filter-to-combat-terrorism-and-extremism/.

EDRi: The tale of the fight for transparency in the EU Internet Forum
https://edri.org/the-tale-of-the-fight-for-transparency-in-the-eu-internet-forum/

Council conclusions on improving criminal justice in cyberspace
http://www.consilium.europa.eu/en/meetings/jha/2016/06/cyberspace–en_pdf/

Progress Report following the Conclusions of the Council of the European Union on Improving Criminal Justice in Cyberspace
http://data.consilium.europa.eu/doc/document/ST-15072-2016-INIT/en/pdf

(Contribution by Matthias Monroy, Bürgerrechte & Polizei/CILIP, Germany)

05 Apr 2017

Denmark: Weakening the oversight of intelligence services

By Guest author

A draft law to amend the data protection provisions of the law on the oversight of the Danish Security and Intelligence Service (PET) was submitted for public consultation in September 2016. In their consultation responses, several NGOs, including EDRi member IT-Pol Denmark, as well as the Danish Intelligence Oversight Board (TET), criticised the proposal. The amendments would legalise PET’s existing data processing practices, removing any obligation to regularly assess whether the information collected on citizens is still necessary, as well as the obligation, in some circumstances, to delete personal data.


The Danish Security and Intelligence Service (PET) is part of the Danish National Police. The main responsibility of PET is prevention and prosecution of offences under chapters 12 and 13 of the Danish Penal Code, which cover national security and terrorism. Compared to the rest of the Danish National Police service, PET is subject to much weaker data protection standards. For data collection, the main rule is that PET can collect information on citizens, unless it can be ruled out beforehand that the information is relevant. Upon request, all Danish public authorities are required to provide information on citizens to PET without a court order, if PET believes that the information can be assumed to be relevant for PET’s tasks in connection with chapters 12 and 13 of the Penal Code. Furthermore, most of the provisions of the Data Protection Act do not apply to PET. Denmark is currently transposing the Law Enforcement Data Protection (LEDP) Directive 2016/680 into national law. In the draft law, PET is completely exempted based on the national security exemption in Article 2(3)(a) and recital 14 of the LEDP Directive, even though PET regularly exchanges information with police authorities in other EU Member States.

Since 2014, independent oversight of PET has been provided by the Danish Intelligence Oversight Board (TET). The oversight covers the provisions of the special PET law on data collection and internal information processing, including the rules for deleting personal data when it is no longer necessary or when the statutory retention period of 10-15 years has been exceeded. Any citizen can ask TET to investigate whether PET is processing information about them unlawfully. If the investigation shows that information is processed unlawfully, TET can order PET to delete it, but the citizen is not notified of this decision. TET can also investigate PET’s data processing practices on its own initiative. Last but not least, TET publishes an annual report on its oversight of PET.

The annual TET reports for 2014 and 2015 contained substantial criticism of PET. Even though the legal standards for processing personal data on citizens are very weak, PET apparently has severe problems living up to even these standards. For the 2014 report, TET looked at a sample of persons registered by PET and found that information about roughly half of them should have been deleted, either because retention periods had been exceeded or because the information was no longer necessary. For the 2015 report, TET conducted a more detailed investigation of PET’s data processing practices, which confirmed the conclusions of the 2014 report: the databases of PET contained a substantial amount of personal data that should have been deleted, at least under TET’s interpretation of the PET law.

The TET report for 2015 also revealed that TET and PET did not agree on the interpretation of the law governing PET’s operations. The main controversy concerned personal data that formed part of another document. TET interpreted the PET law as requiring that information about a citizen be deleted once it is no longer necessary, irrespective of whether the personal data in question constituted a full document or part of another document. PET interpreted the law differently and refused to delete personal data if it was part of another document that was still necessary for PET’s tasks.

For oversight investigations that are not linked to complaints from citizens, TET can only make recommendations to PET and the Ministry of Justice. In May 2015, TET informed the Ministry of Justice of its disagreement with PET, but despite several requests for a reply to the letter, TET had not received one by May 2016. Shortly after the TET report for 2015, with its substantial criticism of PET, was published in May 2016, the Minister of Justice announced that he would propose amendments to the PET law in the next parliamentary year to clarify the legal issues raised by TET.

A draft law amending the PET law was submitted for public consultation in September 2016. Two amendments “clarified” the legal situation for PET by simply removing the two data protection obligations that had given rise to the criticism in TET’s 2014 and 2015 annual reports. The first amendment removes any obligation on PET to regularly assess whether the information collected on citizens is still necessary; under it, PET is only required to delete documents and cases that are no longer necessary if it discovers this during other information processing tasks. The second amendment provides that PET has no obligation to delete personal data that is no longer necessary if it forms part of another document that is still necessary for PET’s tasks. Only full documents and cases must be deleted when they are no longer necessary, not partial elements of documents.

In essence, the two amendments legalise the existing data processing practices of PET which TET, in its annual report for 2015, had concluded were unlawful. The Danish government justified the amendments on the grounds that following TET’s interpretation of the existing law would require too many resources and reduce PET’s counter-terrorism capabilities. Apparently, the IT systems used by PET do not support partial removal of information from documents, except as a manual, time-consuming task. By the end of December 2016, the amendments to the PET law had been passed with an overwhelming majority in the Danish Parliament, and the political debate (or rather, the lack of one) received almost no mention in Danish media.

Consultation responses from several NGOs, including EDRi member IT-Pol Denmark, were quite critical of the government’s proposal. However, by far the most serious criticism came from TET itself. First, TET pointed out that in the systems used by PET, a “document” increasingly means a whole database or electronic file containing considerable amounts of information, rather than a single document in the traditional sense. This would severely limit the number of situations in which personal data that is no longer necessary for PET would actually be deleted. Secondly, TET stated, somewhat cryptically, that its existing oversight activities would have limited relevance in the future, since the only task left to it would be to assess whether full documents and cases are deleted when they are no longer necessary for PET.

The oversight of the Danish intelligence services was further weakened in February 2017, when the Minister of Defence proposed the same data protection amendments to the law governing the Danish Defence Intelligence Service (DDIS). TET is also responsible for the oversight of DDIS, although its annual reports on data processing by DDIS for 2014 and 2015 contain no noticeably critical remarks. Nonetheless, the Minister of Defence proposed to weaken the data protection provisions and, indirectly, the oversight of DDIS. The amendments to the DDIS law have not yet been passed by the Danish Parliament, but there was no real opposition to the proposal during the initial public debate.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

Homepage of the Danish Intelligence Oversight Board, annual reports (only in Danish)
http://www.tet.dk/en/

Law to amend the data protection provisions of the PET law (only in Danish, 09.11.2016)
http://www.ft.dk/samling/20161/lovforslag/l71/bilag/1/index.htm#nav

IT-Pol consultation response on law to amend the data protection provisions of the PET law (only in Danish, 21.10.2016)
https://itpol.dk/hoeringssvar/aendring-pet-lov-sletning-oplysninger

IT-Pol consultation response on law to amend the data protection provisions of the DDIS law (only in Danish, 30.01.2017)
https://itpol.dk/hoeringssvar/fe-adgang-til-pnr-og-regler-om-sletning

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)
