Privatised law enforcement

A measure that would be illegal if implemented by a government should also be illegal when implemented by industry as a "voluntary" measure – whether as a result of government pressure, or for public relations or anti-competitive reasons. However, because key international legal instruments, such as the Charter of Fundamental Rights of the EU and the European Convention on Human Rights, as well as national constitutions, are binding on states and governments, they are not directly applicable to other entities, such as private companies. As a result, there is a major trend towards governments persuading or coercing companies to impose restrictions on fundamental freedoms under the guise of "self-regulation", thereby circumventing legal protections.

29 Jan 2020

AG’s Opinion: Mass retention of data incompatible with EU law

By Privacy International

On 15 January, Advocate General (AG) Campos Sánchez-Bordona of the Court of Justice of the European Union (CJEU) issued his Opinions (C-623/17, C-511/18, C-512/18 and C-520/18) on how he believes the Court should rule on vital questions relating to the conditions under which security and intelligence agencies in the UK, France and Belgium may have access to communications data retained by telecommunications providers.

The AG addressed two major questions:

  1. When states seek to impose obligations on electronic communications services in the name of national security, do such requirements fall within the scope of EU law?
  2. If the answer to the first question is yes, then what does EU law require of the national schemes at issue, which include a French data retention regime, a Belgian data retention regime, and a UK regime for the collection of bulk communications data?

The AG’s short answers to those questions are:

  1. Yes, EU law applies whenever states seek to impose processing requirements on electronic communications services, even if those obligations may be motivated by national security concerns; and
  2. Accordingly, the national regimes at issue must all comply with the CJEU’s previous judgments in Digital Rights Ireland and Others, Cases C-293/12 and C-594/12 (“Digital Rights Ireland”), and Tele2 Sverige and Watson and Others, Cases C-203/15 and C-698/15 (“Tele2/Watson”). None of them do, which leads the AG to advise that none of the regimes are compatible with EU law.

The AG’s Opinion is an affirmation of the basic principle at the heart of EDRi member Privacy International’s work: national security measures must be subject to the rule of law and respect our fundamental rights.

Privacy International initiated the challenge to the UK bulk communications data regime, and intervened in the challenge to the French data retention law.

Does EU law apply?

Central to all three Opinions is the question of whether EU law applies when Member States are acting to protect their national security. The AG concludes that the national security context does not disapply EU law. Instead, one must look to the effect of the proposed requirement – data retention or collection – on electronic communications services. Requiring these service providers to retain and/or transmit data to the security and intelligence agencies (SIAs) falls under EU law because such practices qualify as the “processing of personal data”.

Stating this principle in the negative, the AG says: “The provisions of the directive will not apply to *activities* which are intended to safeguard national security and are undertaken by the public authorities themselves, without requiring the cooperation of private individuals and, therefore, without imposing on them obligations in the management of business” (UK Case C-623/17, paragraph 34/79) (emphasis in original).

Is the UK Bulk Communications Data Regime compatible with EU law?

In the UK case, Privacy International challenged the bulk acquisition and use of communications data by Government Communications Headquarters (GCHQ) and the Security Service (MI5). That case began in the Investigatory Powers Tribunal (IPT), which referred to the CJEU the questions that the AG is addressing. The IPT asked the CJEU to decide, first, whether requiring an electronic communications network to turn over communications data in bulk to the SIAs falls within the scope of European Union law; and second, if the answer to the first question is yes, what safeguards should apply to that bulk access to data?

As noted above, the AG’s answer to the first question is yes, which brings the second question into play. In short, the AG declares that the UK bulk communications and data retention regime (as implemented under section 94 of the Telecommunications Act 1984) “does not satisfy the conditions established in the Tele2 Sverige and Watson judgment, because it involves general and indiscriminate retention of personal data” (UK Case C-623/17, paragraph 37).

The AG re-emphasises that access to retained data “must be subject to prior review by a court or an independent administrative authority” (UK Case C-623/17, paragraph 139). The value of this authority lies in its commitment “to both safeguarding national security and to defending citizens’ fundamental rights” (Id.).

The AG further endorses the application of the other conditions from the Tele2/Watson judgment, including:

  • the requirement to inform affected parties, unless this would compromise the effectiveness of the measure; and
  • the retention of the data within the European Union (UK Case C-623/17, paragraph 43).

Is the French Data Retention Regime compatible with EU law?

The French case similarly asked whether general and indiscriminate data retention was permissible under EU law for the purposes of combating terrorism.

The AG concluded that the French regime amounts to generalised and indiscriminate data retention and, as such, is not compatible with EU law (French Cases C-511/18 and C-512/18, paragraph 111). The French legislation at issue imposes a one-year retention obligation on all electronic communications operators and others, with regard to all data of all subscribers, for the purpose of the investigation, detection and prosecution of criminal offences.

The AG reiterates the conclusion of the Tele2/Watson judgment that the fight against terrorism or similar threats to national security cannot justify generalised and indiscriminate retention of data. He suggests that data retention should be targeted and permissible only if certain criteria are satisfied, for example targeting a specific group of people or a particular geographical area (French Cases C-511/18 and C-512/18, paragraph 133). The Belgian Opinion elaborates on possible types of targeting criteria. On the question of access to retained data, he advises that access should depend on prior authorisation by a judicial or independent administrative authority, following a reasoned request by the competent authorities.

The AG furthermore concluded that real-time collection of the traffic and location data of individuals suspected of being connected to a specific terrorist threat would be permissible under EU law, so long as it does not impose on service providers an obligation to retain additional data beyond what is already required for billing or marketing services. Independent authorisation is also necessary for accessing this data (French Cases C-511/18 and C-512/18, paragraphs 142-143).

As in the UK Opinion above, the AG reaffirms the requirement, already established in the Tele2/Watson case, to inform affected parties unless this would compromise the effectiveness of the measure, and concludes that the French law is not compatible with the EU requirements (French Cases C-511/18 and C-512/18, paragraph 153).

Are the AG’s Opinions the judgments of the CJEU?

The AG’s Opinions are not binding on the CJEU. The Court will issue its judgments in the coming months.

What comes next?

Following the CJEU judgments, each case will be sent back to the national court that referred it. If the CJEU agrees with the Advocate General, the national courts will have to apply the CJEU judgments and accordingly find the domestic regimes incompatible with EU law.

This article was originally published at: https://privacyinternational.org/news-analysis/3334/advocate-generals-opinion-national-security-mass-retention-regimes-are

Indiscriminate data retention considered disproportionate, once again (15.01.2020)
https://edri.org/indiscriminate-data-retention-considered-disproportionate-once-again/

(Contribution by Caroline Wilson Palow and Ilia Siatitsa, EDRi member Privacy International)

29 Jan 2020

Data retention: “National security” is not a blank cheque

By Laureline Lemoine

On 15 January, Advocate General (AG) Campos Sánchez-Bordona of the Court of Justice of the European Union (CJEU) delivered his Opinions on four cases regarding data retention regimes in France, Belgium and the UK, in the context of these Member States’ surveillance programmes.

The AG endorsed the previous case law on data retention, confirming that a general and indiscriminate retention of personal data is disproportionate, even when such schemes are implemented for national security reasons.

An interesting aspect of his Opinions is how he challenges EU Member States that tend to treat national security as their get-out-of-jail-free card.

National security cannot be used as an escape route from EU law

One of the questions the AG had to answer concerned the applicability of the ePrivacy Directive, which Member States contested. They argued that EU law was not applicable, as the surveillance programmes were a matter of national security in the context of terrorism threats, and therefore outside the EU’s jurisdiction.

Even though the matter had already been settled in the Tele2 case, the AG, faced with determined Member States, provided a clear and, hopefully, definitive analysis of the national security argument. In all three Opinions, the AG stresses that, in these cases, national security reasons cannot prevent the applicability of EU law. For the AG, the notion of “national security” is too vague to be invoked to oppose the application of safeguards regarding the protection of personal data and the confidentiality of citizens (C-511/18 and C-512/18, para. 74).

He therefore proceeded to define this notion in light of the ePrivacy Directive. The Directive does not apply when activities related to “national security” are undertaken by the public authorities themselves, by their own means and for their own account. But as soon as states impose obligations on private actors for these same reasons, the Directive applies (C-511/18 and C-512/18, paras 79 to 85).

In these cases, telecom operators are obliged, under the law, to retain the data of their subscribers and to allow public authorities access to it. It does not matter that these obligations are imposed for national security reasons.

…and neither can the fundamental right to security

To add another layer to the “security” argument, the French case invoked the right to security under Article 6 of the Charter of Fundamental Rights of the European Union as a justification for the data retention scheme. This could have been a valid argument but, as the AG points out, the right to security protected in the Charter is the right to personal security against arbitrary arrest or detention; it does not cover public security in the sense of terrorism threats and attacks (C-511/18 and C-512/18, paras 98 and 99).

Terrorism as an excuse?

As part of the “national security” argument, France also argued that the general and indiscriminate retention of personal data was put in place to fight terrorism, in a context of serious and persistent threats to national security.

The AG, however, rightly points out that, in the French legislation, terrorism is only one of the possible justifications for such a data retention regime. Terrorism threats are part of the factual context and the pretext for imposing such a regime, while in reality the regime applies generally, for the purpose of fighting crime (C-511/18 and C-512/18, paras 119 and 120).

Moreover, the CJEU had already rejected, in the Tele2 case, the possibility of a general and indiscriminate data retention regime for anti-terrorism reasons. The AG underlines that this is not incompatible with the Court’s view that fighting terrorism is a legitimate and vital interest of the state. But the case law of the CJEU is clear: such an objective of general interest, however vital, cannot in itself justify the necessity of a general and indiscriminate retention regime.

In response to Member States arguing against anything less than general and indiscriminate retention for this purpose, the AG explains that the fight against terrorism cannot be assessed solely in terms of its effectiveness. Because of the scale of the issue and the means devoted to it, it must operate within the rule of law and must respect fundamental rights. Relying on effectiveness alone would mean ignoring other democratic concerns and could, in extreme cases, lead to harm to citizens (C-511/18 and C-512/18, para. 131).

The AG succeeds in debunking the Member States’ arguments, but stops short of preventing abuse.

The danger of “state of emergency” exceptions

Indeed, at the end of his analysis, the AG very briefly (C-511/18 and C-512/18, para. 104, and C-520/18, paras 105 and 106) explains that, regardless of the above, Member States could be allowed to impose an obligation to retain data as wide and general as needed. This could only be done in truly exceptional situations, where an imminent threat or an extraordinary risk justifies the declaration of a state of emergency in a Member State.

The only safeguard mentioned is the “limited period” for which these kinds of schemes could run. This is not enough, as we have seen how a “state of emergency” can be abused. In France, after the terrorist attacks of November 2015, l’état d’urgence (state of emergency) was declared and went on for two years. It has been shown that this scheme was used not only for anti-terrorism purposes, but also as a tool of social, security and political control, employed to conduct surveillance and arrests of, for example, climate activists labelled “extremists”.

On a global scale, the same has been demonstrated by the various electronic surveillance programmes implemented by the USA after 9/11 in the name of the “war on terror”.

The AG’s opinions are not binding but usually influence the final judgments of the CJEU, which will be issued in the upcoming months. EDRi will be following the development of these cases.

Indiscriminate data retention considered disproportionate, once again (15.01.2020)
https://edri.org/indiscriminate-data-retention-considered-disproportionate-once-again/

Preliminary Statement: Advocate General’s Opinion Advises that Mass Surveillance Regime is Unlawful (15.01.2020)
https://privacyinternational.org/press-release/3332/preliminary-statement-advocate-generals-opinion-advises-mass-surveillance-regime

AG’s Opinion: Mass retention of data incompatible with EU law (29.01.2020)
https://edri.org/ag-opinion-mass-retention-of-data-incompatible-with-eu-law

CJEU Press Release: Advocate General Campos Sánchez-Bordona: the means and methods of combating terrorism must be compatible with the requirements of the rule of law (15.01.2020)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2020-01/cp200004en.pdf

(Contribution by Laureline Lemoine, EDRi)

29 Jan 2020

#PrivacyCamp20 happened

By Andreea Belu

Privacy Camp 2020 took place on 21 January 2020 in Brussels, Belgium. EDRi had the pleasure of co-organising the 8th edition of this side event to the Computers, Privacy and Data Protection (CPDP) conference, together with VUB-LSTS, Privacy Salon vzw and the Institute for European Studies at USL-B. In the context of rising social movements and increasing civic engagement across the globe, this year’s topic centred on Technology and Activism.

Digital rights advocates, activists, academics and policy-makers from all around Europe and beyond came together for one day of panels, workshops, exhibitions and brainstorming sessions. Over 40 speakers immersed our audience in fruitful discussions around social media and political dissent, online civil disobedience, surveillance, censorship, civic participation in information policy-making, data justice, data activism, commons and peer production, citizen science and more. In addition, we co-organised the second edition of the European Data Protection Supervisor (EDPS) Civil Society Summit, on the topic of facial recognition.

The programme of this edition of Privacy Camp, with the details of each session, is available at https://privacycamp.eu/, and we will soon publish full recordings of the panels, including the speakers’ presentations and Q&A. To be among the first to receive updates about Privacy Camp (including the publication of video recordings), subscribe to the new Privacy Camp mailing list: https://mailman.edri.org/mailman/listinfo/privacycamp

One new venue. Over 40 speakers. 9 sessions. More than 250 people. More than 15 hours of chats and whispers. Over 100 litres of coffee fuel. Plenty of #PrivacyCamp20 tweets. One trending hashtag. Dear reader, this was the 8th edition of Privacy Camp.

Privacy Camp
https://privacycamp.eu/

(Contribution by Andreea Belu, EDRi)

28 Jan 2020

Activist guide to the Brussels Maze 3.0

By Heini Järvinen

Exciting news 🤩 A new edition of our popular Activist guide to the Brussels Maze is out!

The purpose of this booklet is to give activists an insight into where EU legislative and non-legislative Proposals come from, and what can be achieved at each stage of the legislative process. As the life cycle of any EU Proposal is very long, it is important to know where to target activity at any given moment. Each institution is very powerful and influential at certain moments, and very much a spectator at others. We hope that this guide will serve as a map of the Brussels maze.

Download the Activist guide to the Brussels maze 3.0 here: https://edri.org/files/Activistguide_V3_web.pdf

The guide is distributed under a Creative Commons 4.0 Licence: http://creativecommons.org/licenses/by-nc-sa/4.0/

16 Jan 2020

2020: Important consultations for your Digital Rights!

By EDRi

Public consultations are an opportunity to influence future legislation at an early stage, in the European Union and beyond. They are your opportunity to help shape a brighter future for digital rights, such as your right to a private life and data protection, or your freedom of opinion and expression.

Below you can find a list of public consultations we find important for digital rights. We will update the list on an ongoing basis, adding our responses and other information that can help you get engaged.


Online consultation – a European Strategy for data: Public consultation to help shape the future policy agenda on the EU data economy

  • Deadline: 31 May 2020

Consultation on the White Paper on Artificial Intelligence – A European Approach to excellence and trust

  • Deadline: 31 May 2020

15 Jan 2020

Your face rings a bell: Three common uses of facial recognition

By Ella Jakubowska

Not all applications of facial recognition are created equal. As we explored in the first and second instalments of this series, different uses of facial recognition pose distinct but equally complex challenges. Here we sift through the hype to analyse three increasingly common uses of facial recognition: tagging pictures on Facebook, automated border control gates, and police surveillance.

The chances are that your face has been captured by a facial recognition system, if not today, then at least in the last month. It is worryingly easy to stroll through automated passport gates at an airport, preoccupied with the thought of seeing your loved ones rather than with potential threats to your privacy. And you can quite happily walk through a public space or shop without being aware that you are being watched, let alone that your facial expressions might be used to label you a criminal. Social media platforms increasingly employ facial recognition, and governments around the world have rolled it out in public. What does this mean for our human rights? And is it too late to do something about it?

First: What the f…ace? – Asking the right questions about facial recognition!

As the use of facial recognition skyrockets, it can feel that there are more questions than answers. This does not have to be a bad thing: asking the right questions can empower you to challenge the uses that will infringe on your rights before further damage is done.

A good starting point is to look at the impacts on fundamental rights such as privacy, data protection, non-discrimination and fundamental freedoms, and at compliance with international standards of necessity, proportionality and remedy. Do you trust the owners of facial recognition systems (or indeed other types of biometric recognition and surveillance), whether public or private, to keep your data safe and to use it only for specific, legitimate and justifiable purposes? Do they provide sufficient evidence of effectiveness, beyond the vague notion of “public security”?

Going further, it is important to ask societal questions like: does being constantly watched and analysed make you feel safer, or just creeped out? Will biometric surveillance substantially improve your life and your society, or are there less invasive ways to achieve the same goals?

Looking at biometric surveillance in the wild

As explored in the second instalment of this series, many public face surveillance systems have been shown to violate rights and been deemed illegal by data protection authorities. Even consent-based, optional applications may not be as unproblematic as they first seem. This is our “starter for ten” for thinking through the potentials and risks of some increasingly common uses of facial verification and identification – we’ll be considering classification and other biometrics next time. Think we’ve missed something? Tweet us your ideas @edri using #FacialRecognition.

Automatic tagging of pictures on Facebook

Facebook uses facial recognition to tag users in pictures, as well as for other “broader” uses. Under public pressure, in September 2019 it made the feature opt-in – but this applies only to new, not existing, users.

Potentials:

  • Saves time compared to manual tagging
  • Alerts you when someone has uploaded a picture of you without your knowledge

Risks:

  • The world’s biggest ad-tech company can find you in photos or videos across the web – forever
  • Facebook will automatically scan, analyse and categorise every photo uploaded
  • You will automatically be tagged in photos you might want to avoid
  • Higher error rates for people with very light or very dark skin

Evidence:

Creepy, verging on dystopian, especially as the feature is on by default for some users (here’s how to turn it off: https://www.cnet.com/news/neons-ceo-explains-artificial-humans-to-me-and-im-more-confused-than-ever/). We’ll leave it to you to decide if the potentials outweigh the risks.

Automated border control (ePassport gates)

Automated border control (ABC) systems, sometimes known as e-gates or ePassport gates, are self-service systems that authenticate travellers against their identity documents – a form of 1:1 verification.

Potentials:

  • Suggested as a solution for congestion as air travel increases
  • Matches you to your passport, rather than a central database – so in theory your data isn’t stored

Risks:

  • Longer queues for those who cannot or do not want to use it
  • Lack of evidence that it saves time overall
  • Difficult for elderly passengers to use
  • May cause immigration issues or tax problems
  • Normalises face recognition
  • Disproportionately error-prone for people of colour, leading to unjustified interrogations
  • Supports state austerity measures

Evidence:

  • Stats vary wildly, but credible sources suggest the average border guard takes 10 seconds to process a traveller – faster than the best gates, which take 10-15 seconds (see the throughput sketch after this list)
  • Starting to be used in conjunction with other data to predict behaviour
  • High volume of human intervention needed due to user or system errors
  • Extended delays for the 5% of people falsely rejected
  • Evidence of falsely criminalising innocent people
  • Evidence of falsely accepting people with the wrong passport

Evidence of effectiveness can be contradictory, but the impacts – especially on already marginalised groups – and the ability to combine face data with other data to infer additional information about travellers carry major potential for abuse. We suspect that offline solutions, such as funding more border agents and investing in queue management, could be equally effective and less invasive.

Police surveillance

In a practice sometimes referred to as face surveillance, police forces across Europe – often in conjunction with private companies – are using surveillance cameras to perform live identification in public spaces.

Potentials:

  • Facilitates the analysis of video recordings in investigations

Risks:

  • Police hold a database of faces and are able to track and follow every individual ever scanned
  • Replaces investment in police recruitment and training
  • Can discourage use of public spaces – especially by those who have suffered disproportionate targeting
  • Chilling effect on freedom of speech and assembly, an important part of democratic participation
  • May also rely on pseudo-scientific emotion “recognition”
  • Legal ramifications for people wrongly identified
  • No ability to opt out

Evidence:

Increased public security could be achieved by measures to tackle issues such as inequality or antisocial behaviour, or by generally investing in police capability, rather than through surveillance technology.

Facing reality: towards a mass surveillance society?

Without intervention, facial recognition is on a path to omniscience. In this post, we have only scratched the surface. However, these examples identify some of the different actors that may want to collect and analyse your face data, what they gain from it, and how they may (ab)use it. They also show that the benefits of facial surveillance frequently amount to cost-cutting, rather than user benefit.

We’ve said it before: tech is not neutral. It reflects and reinforces the biases and world views of its makers. The risks are amplified when systems are deployed rapidly, without considering the big picture or the slippery slope towards authoritarianism. The motivations behind each use must be scrutinised and proper assessments carried out before deployment. As citizens, it is our right to demand this.

Your face has a significance beyond just your appearance – it is a marker of your unique identity and individuality. But with prolific facial recognition, your face becomes a collection of data points that can be leveraged against you and infringe on your ability to live your life in safety and with privacy. With companies profiting from algorithms covertly built using photos of users, faces are literally commodified and traded. This has serious repercussions for our privacy, dignity and bodily integrity.

Facial Recognition and Fundamental Rights 101 (04.12.2019)
https://edri.org/facial-recognition-and-fundamental-rights-101/

The many faces of facial recognition in the EU (18.12.2019)
https://edri.org/the-many-faces-of-facial-recognition-in-the-eu/

Stalked by your digital doppelganger? (29.01.2020)
https://edri.org/stalked-by-your-digital-doppelganger/

Data-Driven Policing: The Hardwiring of Discriminatory Policing Practices across Europe (05.11.2019)
https://www.enar-eu.org/IMG/pdf/data-driven-profiling-web-final.pdf

Facial recognition technology: fundamental rights considerations in the context of law enforcement (27.11.2019)
https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper.pdf

What the “digital welfare state” really means for human rights (08.01.2020)
https://www.openglobalrights.org/digital-welfare-state-and-what-it-means-for-human-rights/

Resist Facial Recognition
https://www.libertyhumanrights.org.uk/resist-facial-recognition

(Contribution by Ella Jakubowska, EDRi intern)

15 Jan 2020

Serbia: Complaints filed against Facebook and Google

By SHARE Foundation

EDRi member SHARE Foundation has filed complaints with the Commissioner for Information of Public Importance and Personal Data Protection of Serbia against Facebook and Google for their failure to comply with the obligation to appoint representatives in Serbia for data protection issues. In May 2019, before the new Serbian Law on Personal Data Protection started to apply, SHARE Foundation sent letters to 20 international companies calling on them to appoint representatives in Serbia, in accordance with the new legal obligations.

Appointing representatives is not a formality – it is essential for Serbian citizens to exercise the rights prescribed by law. In the current circumstances, companies like Google and Facebook treat Serbia, like many other developing countries, as a territory for the unregulated exploitation of citizens’ private data, even though Serbia harmonised its rules with the EU Digital Single Market by adopting the new Law on Personal Data Protection. These companies recognise Serbia as a relevant market: they offer their services to citizens of the Republic of Serbia and monitor their activities. In the course of doing business, they process large amounts of Serbian citizens’ data and make huge profits. The new law guarantees citizens numerous rights in relation to such data processing, but at the moment it seems that exercising these rights would face many difficulties.

Among other things, these companies do not provide clear contact points that citizens can reach – they mostly offer application forms in a foreign language. Experience has shown that such forms are not adequate, not only because they require Serbian citizens to have an advanced knowledge of a foreign language, but also because this type of communication is mostly handled by programs that send generic automated responses.

Although the fines the Commissioner may impose under the domestic Law on Personal Data Protection – in this case 100,000 Serbian dinars (around 940 USD or 850 EUR) – would not have a major impact on the budgets of these gigantic companies, SHARE believes they would show that the competent authorities of the Republic of Serbia intend to protect the country’s citizens, and would signal that these companies are not operating in accordance with domestic regulations.

SHARE Foundation
https://www.sharefoundation.info/en/

Facebook and Google asked to appoint representatives in Serbia (05.06.2019)
https://edri.org/facebook-and-google-asked-to-appoint-representatives-in-serbia/

Will Serbia adjust its data protection framework to GDPR? (24.04.2019)
https://edri.org/will-serbia-adjust-its-data-protection-framework-to-gdpr/

(Contribution by EDRi member SHARE Foundation, Serbia)

15 Jan 2020

ECtHR demands explanations on Polish intelligence agency surveillance

By Panoptykon Foundation

The European Court of Human Rights (ECtHR) has demanded that the Polish government provide an explanation of surveillance by its intelligence agencies. This is the result of complaints filed with the Strasbourg court in late 2017 and early 2018 by activists from EDRi member Panoptykon Foundation and the Helsinki Foundation for Human Rights, as well as by attorney Mikołaj Pietrzak. The attorney points out that uncontrolled surveillance by the Polish government violates not only his privacy but, most importantly, the rights and freedoms of his clients. The activists add that, as active citizens, they are at particular risk of being subjected to government surveillance.

Panoptykon has been criticising the lack of control over government surveillance for years. Without appropriate oversight, concerns and doubts persist about how intelligence agencies use their broad powers. There is no way of verifying to what extent these powers are used, because the law does not provide for access to information about whether an individual has been subject to surveillance – even once the surveillance has ended and the individual has not been charged. As citizens, we are therefore defenceless and unable to protect our rights.

The ECtHR decided that the complaints meet the formal requirements and communicated the case to the Polish government, which will have to answer whether its actions violated the right to privacy (Article 8 of the European Convention on Human Rights) and the right to an effective remedy (Article 13 of the Convention).

What’s at stake is not just the right to privacy. As attorney Mikołaj Pietrzak explains, the basis of the attorney-client relationship is trust that can only exist on condition of confidentiality. Attorneys are obliged to protect legal privilege, especially when it comes to defence in criminal cases. Current laws make it impossible. This infringes on the rights and freedoms of their clients, and in particular their right to defence.

The Polish Constitutional Court pointed out as early as July 2014 that the law should be changed. However, the so-called Surveillance Act and Counter-terrorism Act, adopted in 2016, only expanded the intelligence agencies’ powers, without introducing any mechanisms of control. Compared to other EU countries, where independent oversight of intelligence agencies surprises no one, Poland stands out in a negative way. These irregularities have been pointed out, among others, by the Venice Commission in a June 2016 Opinion. The obligation to inform data subjects that intelligence agencies have accessed their telecommunications data follows from multiple judgments of the ECtHR (e.g. Szabó and Vissy v. Hungary, Saravia v. Germany, Zakharov v. Russia) and of the Court of Justice of the European Union (CJEU) (e.g. Tele2 Sverige).

The complainants are represented by attorney Małgorzata Mączka-Pacholak.

Panoptykon Foundation
https://en.panoptykon.org/

No control over surveillance by Polish intelligence agencies. ECHR demands explanations from the government (18.12.2019)
https://en.panoptykon.org/government-surveillance-echr-complaint

(Contribution by EDRi member Panoptykon Foundation, Poland)

15 Jan 2020

Copyright stakeholder dialogues: Filters can’t understand context

By Laureline Lemoine

On 16 December 2019, the European Commission held the fourth meeting of the stakeholder dialogues on Article 17 of the Copyright Directive. During the “first phase”, meetings focused on practices in different industries, such as music, games, software, audiovisual and publishing. This meeting was the last of what the Commission called the “second phase”, in which meetings focused on technical presentations of content management technologies and existing licensing practices.

During this fourth meeting, presentations were given by platforms (Facebook, Seznam, Wattpad), providers of content management tools (Audible Magic, Ardito, Fifthfreedom, Smart protection), rightsholders (European Grouping of Societies of Authors and Composers – GESAC, Universal Music Publishing, Bundesliga) as well as by consumer group BEUC and the Lumen database.

Say it again louder for the people in the back: Filters cannot understand context

After YouTube’s Content ID presentation during the third meeting, Facebook’s presentation of its Rights Management tool reiterated what civil society has been repeating throughout the copyright debates: filtering tools cannot understand context. Content recognition technologies are only capable of matching files; they cannot recognise uses covered by copyright exceptions such as caricature or parody.

This argument has now been laid out clearly and repeatedly to the European Commission, by both civil society organisations and providers of content recognition technology. We therefore expect the Commission’s guidelines to take this into account and to recommend that filters not be used to automatically block or remove uploaded content.

A lack of trust

As these meetings usually revive old divisions between stakeholders, they also reveal new ones. Facebook’s presentation pointed out that one of the biggest issues with its Rights Management tool is misuse by rightsholders who claim rights over works they do not own. As a result, not every rightsholder gets access to the same tools: some features, such as automated actions, are limited or reserved for what the provider calls “trusted rightsholders”.

On the other side, rightsholders such as GESAC criticised the way they are treated by big platforms such as YouTube. In particular, they highlighted the categorisation made by content recognition tools, which can lead to loss of revenue. Rightsholders sometimes have no choice but to use tools created and controlled by big platforms under those platforms’ own opaque rules; they therefore emphasised the need for transparency and accuracy in the information on how platforms like YouTube handle content whose rights they own.

Transparency is key

To understand the management practices around copyright-protected content, quantitative information is crucial. Faced with the issue of filters, content recognition providers said they have been relying on redress mechanisms and human judgment. But when asked for factual information on how their practices function, they could offer no numbers or percentages. It is therefore impossible to assess the necessity, proportionality or efficiency of the use of automated content recognition tools.

According to Article 17(10) of the Copyright Directive, which provides the basis for the ongoing stakeholder dialogue, “users’ organisations shall have access to adequate information from online content-sharing service providers on the functioning of their practices with regard to paragraph 4”.

After four meetings, and still lacking such information from the companies, the civil society organisations participating in the dialogue decided to send a request for information to the European Commission. We hope that the Commission will be able to gather this factual information from platforms so that the ongoing dialogue can lead to an evidence-based outcome.

As part of these transparency needs, EDRi also signed an open letter asking the Commission to share the draft guidelines it will produce at the end of the dialogue. In the letter, we asked that the guidelines also be opened to consultation with the participants of the stakeholder dialogues and with the broader public, to seek feedback on whether the document can be further improved to ensure compliance with the Charter of Fundamental Rights of the EU.

What’s next?

The next stakeholder dialogue meeting will be held on 16 January and will open the “third phase” of the consultation, which will focus on the practical application of Article 17. The Commission has already sent out the agenda: the topics covered on 16 January will be authorisations, notices and the notion of “best efforts”, while the following session on 10 February will cover safeguards and redress mechanisms.

EU copyright dialogues: The next battleground to prevent upload filters (18.10.2019)
https://edri.org/eu-copyright-dialogues-the-next-battleground-to-prevent-upload-filters/

NGOs call to ensure fundamental rights in copyright implementation (20.05.2019)
https://edri.org/ngos-call-to-ensure-fundamental-rights-in-copyright-implementation/

Copyright: Open letter asking for transparency in implementing guidelines (15.01.2020)
https://edri.org/copyright-open-letter-asking-for-transparency-in-implementing-guidelines

(Contribution by Laureline Lemoine, EDRi)

15 Jan 2020

Our New Year’s wishes for European Commissioners

By Laureline Lemoine

EDRi wishes all readers a happy new year 2020!

In 2019, we had a number of victories in multiple fields. The European Parliament added necessary safeguards to the proposed Terrorist Content Online (TCO) Regulation to protect fundamental rights against overly broad and disproportionate censorship measures. The Court of Justice of the European Union (CJEU) ruled that clear and affirmative consent must be given before cookies can be set on our devices. Member States have been increasingly issuing fines under the General Data Protection Regulation (GDPR). Also, Google was fined for its abusive online ad practices, and new security standards for consumer Internet of Things (IoT) devices were introduced.

However, 2019 was also the year when some governments positioned themselves against encryption and started to normalise facial recognition in public spaces without adequate safeguards, public debate or fundamental rights assessment (France, Sweden, the UK). Mandatory upload filters were approved at EU level, and data breaches and privacy scandals frequently made the news.

For 2020, we need to ensure that the EU pushes forward policies that will lead to a human-centric internet rather than data exploitation models which deepen inequalities and enable surveillance capitalism. We are sending our wishes to the fresh new European Commissioners, so that they can help us defend our rights and freedoms online.

In 2020, we wish for President Ursula von der Leyen to:

  • Start implementing a human-centric vision for the internet to ensure the protection of fundamental rights online (and offline);
  • Define high privacy, security, safety and ethical standards for the new generation of technologies that will become the global norm;
  • Strengthen EU decision-making by ensuring transparency in the Council;
  • Ensure that any future measures on Artificial Intelligence (AI) lead to AI systems in Europe that are based on the principles of legality, robustness, ethics and human rights, and that current data protection and privacy laws are not circumvented but strengthened;
  • Ensure that the upcoming Digital Services Act (DSA) proposal (reforming the current e-Commerce Directive) creates legal certainty and introduces safeguards that will enable users to enjoy their rights and freedoms.

In 2020, we wish for Executive Vice President for A Europe Fit for the Digital Age Margrethe Vestager to:

  • Provide clarity on safeguards, red lines, and enforcement mechanisms to ensure that the automated decision making systems — and AI more broadly — developed and deployed in the EU respect fundamental rights;
  • Assess the fundamental rights and societal impacts of facial recognition and other biometric detection systems, and propose criteria to assess or define domains or use cases where AI-assisted technologies should not be developed;
  • Tackle exploitative business models and their violation of personal data protections through the Digital Services Act and any other necessary legislative or non-legislative initiatives;
  • Promote equality and fight discrimination in the development and use of technology;
  • Guarantee and promote the respect of fundamental rights through competition policy by investigating abuses by dominant platforms and exploring cooperation with data protection authorities.

In 2020, we wish for Commissioner for Internal Market Thierry Breton to:

  • Unlock the ePrivacy reform through discussion with the EU Council and the Member States;
  • Develop a sustainable, human-centric and rights-promoting Digital Services Act;
  • Ensure privacy by design and by default in current and future tech-related proposals;
  • Achieve digital sovereignty by ensuring the development of the necessary free and open hardware and software;
  • Ensure that the strategy on data developed as part of the EU’s approach on AI respects fundamental rights.

In 2020, we wish for Vice President and Commissioner for Values and Transparency Věra Jourová to:

  • Ensure transparency in trilogue negotiations;
  • Address the harms caused by hate speech, political disinformation and the abuse of internet controls by authoritarian states;
  • Analyse the risks of targeted political advertising and the online tracking industry;
  • Protect and promote freedom of expression online.

In 2020, we wish for Commissioner for Home Affairs Ylva Johansson to:

  • Ensure that illegal mass surveillance is not deployed, for example in any future attempts to implement data retention in Member States;
  • Review all PNR frameworks in light of the jurisprudence of the CJEU;
  • Reassess the “e-evidence” proposal and its necessity, or include meaningful human rights safeguards in it;
  • Ensure that the safeguards adopted by the European Parliament and advocated by human rights groups are part of the final TCO Regulation.

In 2020, we wish for Commissioner for Justice Didier Reynders to:

  • Ensure the full enforcement of the GDPR in Member States by ensuring that data protection authorities have the necessary funding, resources, and independence to protect our rights;
  • Promote the European approach to data protection as a global model;
  • Contribute to legislation on AI to ensure that fundamental rights are fully protected, and especially, equality for everyone, by adopting rules that mitigate the harms caused by discrimination.

The new year is a time to reflect on the past year and pledge to do better in the next. Looking for new year’s resolutions? You can do more to stay safe online or donate to EDRi, to help us continue defending your digital human rights and freedoms in 2020 and beyond.

CJEU on cookies: ‘Consent or be tracked’ is not an option (01.10.2019)
https://edri.org/cjeu-cookies-consent-or-be-tracked-not-an-option/

Light at the end of the cyber tunnel: New IoT consumer standard (27.02.2019)
https://edri.org/light-at-the-end-of-the-cyber-tunnel-new-iot-consumer-standard/

The Dangers of High-Tech Profiling, Using Big Data (07.08.2014)
https://www.nytimes.com/roomfordebate/2014/08/06/is-big-data-spreading-inequality/the-dangers-of-high-tech-profiling-using-big-data

EU Commissioners candidates spoke: State of play for digital rights (23.10.2019)
https://edri.org/eu-commissioner-candidates-spoke-state-of-play-for-digital-rights/

A Human-Centric Digital Manifesto for Europe
https://www.opensocietyfoundations.org/publications/a-human-centric-digital-manifesto-for-europe

Cross-border access to data for law enforcement: Document pool (12.04.2019)
https://edri.org/cross-border-access-to-data-for-law-enforcement-document-pool/

(Contribution by Laureline Lemoine, EDRi)
