28 Jan 2019

Austrian postal service involved in a data scandal

By Epicenter.works

Following a report by the media outlet “Addendum”, the Austrian postal service faces a public outcry over its data gathering and sales activities. The Austrian Post is known not only for its core duty of post delivery, but also for selling the addresses of Austrian residents to companies and political parties for advertising purposes. According to the report, not only addresses are being sold, but also sensitive data of 2.2 million Austrian inhabitants.


The postal service’s data sheet includes a person’s name, address, age and gender, but also more than 40 other data points, some of which are very sensitive types of personal information. One of those data points is the affinity to a political party, which is a “special category of data” and therefore requires explicit consent for processing. The postal service responded to the public outcry by stating that the data it collects on political preference is just an estimated probability, generated in a similar way to election polls.
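The profiling at issue is ordinary statistical inference: party affinity is not observed for any individual, but estimated from demographic attributes, much like a poll projects from a sample. Below is a minimal sketch of how such an estimate could be produced; the sample data, the categories and the estimate_affinity() function are hypothetical illustrations, not the Post’s actual method.

```python
# Hypothetical sketch of demographic party-affinity scoring, as described in
# the article: the "preference" is an estimated probability, not an observed fact.
from collections import Counter

# Toy survey sample: (age_group, region, stated preference). Invented data.
survey = [
    ("30-44", "urban", "party_a"),
    ("30-44", "urban", "party_b"),
    ("30-44", "urban", "party_a"),
    ("60+", "rural", "party_c"),
    ("60+", "rural", "party_c"),
]

def estimate_affinity(age_group: str, region: str) -> dict:
    """Estimate P(party | demographics) from survey frequencies."""
    matches = [p for (a, r, p) in survey if a == age_group and r == region]
    counts = Counter(matches)
    total = sum(counts.values())
    return {party: n / total for party, n in counts.items()} if total else {}

# Every resident is then tagged with the probabilities of their demographic cell.
print(estimate_affinity("30-44", "urban"))  # roughly {'party_a': 0.67, 'party_b': 0.33}
```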

Due to the lack of explicit consent, we believe this must be considered a breach of the General Data Protection Regulation (GDPR). To build public pressure, EDRi member epicenter.works provided a form for individuals to easily request access to their data. Within a week, the form was downloaded nearly 2000 times and sent to the Austrian Post’s data protection officer, which led to wide coverage by national and international news outlets.

A few days after expressing absolute confidence in the legality of this kind of data collection, the postal service reversed its position and declared that it intends to delete these records and refrain from selling them to its clients in the future.

The Austrian Data Protection Authority (DPA) needs to take action immediately on this and on other similar cases that may exist. Once the results of our data access requests are in, further actions could be started. Because this case could set a dangerous precedent related to political profiling on a massive scale, the work of the DPA in overseeing the implementation of the GDPR is crucial. If it sets a strong precedent in this case, other businesses will be discouraged from continuing or starting similar data exploitation in the future.

Epicenter.works
https://epicenter.works/

The Post tells something about everybody! (only in German, 07.01.2019)
https://epicenter.works/content/die-post-verraet-allen-was

When the Post takes sides (only in German, 07.01.2019)
https://www.addendum.org/datenhandel/parteiaffinitaet/

Austria’s Post Office under fire over sharing data on political allegiances (11.01.2019)
https://www.thelocal.at/20190111/austrias-post-office-under-fire-over-data-sharing-political

Austrian Post Office to delete customers’ political data (10.01.2019)
https://phys.org/news/2019-01-austrian-office-delete-customers-political.html

Austria’s national post office under fire over data sharing (08.01.2019)
https://economictimes.indiatimes.com/news/international/business/austrias-national-post-office-under-fire-over-data-sharing/articleshow/67444380.cms

(Contribution by Iwona Laub, EDRi member Epicenter.works, Austria)

28 Jan 2019

Panoptykon files complaints against Google and IAB

By Panoptykon Foundation

On the International Data Protection Day, 28 January 2019, EDRi member Panoptykon filed complaints against Google and the Interactive Advertising Bureau (IAB) under the General Data Protection Regulation (GDPR) with the Polish Data Protection Authority (DPA). The complaints relate to the functioning of the online behavioural advertising (OBA) ecosystem.


The complaints focus on the role of Google and IAB as organisations that set standards for other actors involved in the OBA market. According to the complaints, they should therefore be treated as data controllers responsible for GDPR infringements.

Arguments used by Panoptykon are based on complaints concerning the same issue by EDRi member Open Rights Group (ORG) and Brave, as well as on evidence provided by a report by Johnny Ryan. The key facts and observations of the complaints are:

  1. data shared by companies within the OBA ecosystem are not necessary for the purpose of serving targeted ads;
  2. companies sharing data have no control over its further use by a potentially unlimited number of other actors that have access to real-time bidding software;
  3. users have no access to their data and no tools for controlling its further use by a (potentially unlimited) number of actors;
  4. those failures are not incidental, because they result from the very design of the OBA ecosystem: the lack of transparency and the concept of the bid request, which by design leads to data “broadcasting” (a simplified bid request is sketched below).
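To illustrate what “broadcasting” means in real-time bidding: when a page with ad space loads, a bid request describing the user is sent to every bidder connected to the exchange, before any of them wins the auction. The sketch below is a rough illustration only; the field names are loosely modelled on OpenRTB-style bid requests, and the broadcast() helper and endpoints are invented for this example.

```python
# Illustrative sketch of RTB data "broadcasting"; field names loosely follow
# OpenRTB-style bid requests and are NOT an exact protocol specification.
import json

bid_request = {
    "id": "auction-4711",
    "site": {"page": "https://example.com/article-about-health"},
    "device": {"ip": "198.51.100.23", "ua": "Mozilla/5.0 (...)"},
    "user": {
        "id": "cookie-id-8f3a...",          # pseudonymous but persistent identifier
        "geo": {"lat": 52.23, "lon": 21.01},
    },
}

# Hypothetical list of demand-side platforms subscribed to the exchange.
bidders = ["https://dsp-one.example/bid", "https://dsp-two.example/bid"]

def broadcast(request: dict, endpoints: list[str]) -> None:
    """Send the same user profile to every bidder, win or lose.

    This is the complaints' core point: every endpoint receives the data,
    and the user has no visibility into, or control over, what happens next.
    """
    payload = json.dumps(request)
    for url in endpoints:
        print(f"POST {url}: {payload[:60]}...")  # stand-in for an HTTP call

broadcast(bid_request, bidders)
```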

Prior to making these complaints, Panoptykon carried out its own investigation of the OBA ecosystem in Poland, which confirmed the allegations made by ORG and Brave in their complaints, as well as Johnny Ryan’s testimony. Between May and December 2018, Panoptykon sent a number of data access requests to various actors involved in the OBA ecosystem (including Google and leading data brokers) in order to check whether users are able to verify and correct their marketing profiles.

In most cases, companies refused to provide personal data to users, citing alleged difficulty with their identification. This argument – made by key players in the OBA ecosystem – confirms that it has been designed to be obscure. Key identifiers used by data brokers to single out users and target ads are not revealed to the data subjects concerned. It is a “catch-22” situation that cannot be reconciled with GDPR requirements (in particular the principle of transparency).

Along with its complaints, Panoptykon published a report summarising its investigation of the OBA ecosystem, which included interviews with key actors operating on the Polish market, and evidence collected by sending data access requests.

Panoptykon Foundation
https://en.panoptykon.org/

Panoptykon files complaints against Google and IAB Europe (28.01.2019)
https://en.panoptykon.org/complaints-Google-IAB

(Contribution by EDRi member Panoptykon Foundation, Poland)

25 Jan 2019

Terrorist Content: LIBE Rapporteur’s Draft Report lacks ambition

By Yannic Blaschke

On 23 January 2019, the Rapporteur for the European Parliament Committee on Civil Liberties (LIBE), Daniel Dalton (ECR), published his Draft Report on the proposal for a Regulation on preventing the dissemination of terrorist content online. This Report by the lead Committee on the dossier follows the publication of the Draft Opinions by the two other European Parliament Committees involved in the debate: the Committee on Internal Market and Consumer Protection (IMCO) and the Committee on Culture and Education (CULT).

Overall, LIBE’s Draft Report addressed only some of the many pressing issues of the Regulation which present serious risks for fundamental rights. Unfortunately, the Report therefore falls somewhat short of the ambitions to which a Committee dealing with civil liberties should aspire. This is even more disappointing after the comprehensive stance taken in the IMCO Draft Opinion, which includes more than twice as many amendments as the LIBE Draft Report.

LIBE’s Draft Report contains, in summary, the following positive points:
– it limits the scope of the Regulation to services that are available to the public (excluding, for example, file lockers from the scope)
– it addresses the need for reporting obligations from competent authorities

However, the Draft Report:
– does not tackle the manifest flaws of the measure of referrals from governments to companies for “voluntary consideration”, which would make Big Tech companies the Internet Police
– does not drastically modify or delete the problematic “proactive measures”, which can only lead to upload filters and other very strict content moderation measures, even though it reminds the legislator about the existing prohibition of general monitoring obligations in the EU
– does not address the problems caused by a lack of alignment of the definition of terrorist content with the Terrorism Directive

On a positive note, the scope of the Terrorist Content Regulation is more narrowly defined in the LIBE Draft Report, being limited now to services which are available to the public. On reporting obligations, it is a welcome addition that the Report foresees an evaluation of the Regulation’s impact on the freedom of expression and information in the Union after a maximum of three years following the implementation of the legislation. Regarding the possibility for national authorities to impose proactive measures on online companies, the Draft Report furthermore contains some mitigating clauses, such as a consideration of a platform’s “non-incidental” exposure to terrorist content, or the reminder of the prohibition in EU law of general monitoring obligations for hosting providers. Finally, the Draft Report proposes some adjustments regarding remedies and safeguards. It gives a two-week deadline for answering complaints by citizens whose content was removed or to which access was denied. The Draft Report also insists that the private complaint mechanisms of internet platforms do not preclude citizens from seeking legal redress before Member States’ courts.

However, Dalton MEP has disappointingly chosen not to address the referrals of content to platforms for their “voluntary consideration”. These referrals could give national authorities an “escape route” from their human rights obligations, by merely suggesting the blocking of content which a given government might find unpleasant, but which is not illegal and thus not suitable for a removal order. Furthermore, the Rapporteur did not tackle the urgent need to reform the definition of “terrorist content”, which three United Nations (UN) Special Rapporteurs had previously flagged as a key concern. The vagueness of the definition in the Commission proposal thus persists and could threaten the work of journalists and NGOs documenting terrorist crimes. Finally, the “proactive measures” have not received the attention and intensive modification they need, and could still lead to de facto general monitoring obligations.

To summarise, the LIBE Draft Report lacks the ambition that would be expected from the Civil Liberties Committee and falls short of the much more comprehensive reworks delivered by the IMCO and CULT Committees. All involved Members of the European Parliament should cooperate to significantly strengthen the negligent and rushed Commission proposal, in particular with regard to the highly dangerous measures of referrals and proactive measures. Serious problems require serious legislation.

Terrorist Content Regulation: Document pool
https://edri.org/terrorist-content-regulation-document-pool/

CULT: Fundamental rights missing in the Terrorist Content Regulation (21.01.2019)
https://edri.org/cult-fundamental-rights-missing-in-the-terrorist-content-regulation/

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

EU’s flawed arguments on terrorist content give big tech more power (24.10.2018)
https://edri.org/eus-flawed-arguments-on-terrorist-content-give-big-tech-more-power/

Joint Press Release: EU Terrorism Regulation – an EU election tactic (12.9.2018)
https://edri.org/press-release-eu-terrorism-regulation-an-eu-election-tactic/

(Contribution by Yannic Blaschke and Diego Naranjo)

23 Jan 2019

EDRi’s Kirsten Fiedler wins Privacy Award

By EDRi

On 22 January, Kirsten Fiedler, current Senior Policy and Campaigns Manager and former Managing Director of European Digital Rights, received the distinguished Felipe Rodriguez Award in celebration of her remarkable contribution to our right to privacy in the digital age.

“Why should we defend digital rights and freedoms when there are really pressing and often life-threatening issues out there to fight for? The reason is that the internet and digital communications are seeping into every part of our lives, so our rights online are the basis for everything else we do,” said Fiedler. “I’d like to accept this award on behalf of the entire EDRi team and network. Our strength is in collective, collaborative actions.”

Fiedler’s relentless efforts have been crucial in transforming the EDRi Brussels office from a one-person entity into the current professional organisation with eight staff members, and she has been the engine of the office’s growth during the past years. In addition, she played an instrumental role in EDRi’s campaigns against ACTA and against privatised law enforcement.

The Felipe Rodriguez Award is part of the Dutch Big Brother Awards, organised by the EDRi member Bits of Freedom. Previous winners include Kashmir Hill, Open Whisper Systems, Max Schrems, and Edward Snowden. The award ceremony took place on 22 January 2019 in Amsterdam.

Photo: Jason Krüger

Bits of Freedom announces winner of privacy award (09.01.2019)
https://edri.org/bits-of-freedom-announces-winner-of-privacy-award/

21 Jan 2019

Copyright negotiations begin to derail

By EDRi

The negotiations on the EU’s highly controversial Copyright Directive proposal continue. The last trilogue meeting between the Commission, the Council and the Parliament was originally scheduled for today, 21 January 2019. The meeting was, however, called off late on Friday evening, 18 January, by the Romanian Presidency of the EU Council.

It has become increasingly clear that the manifest problems with the text make it hard to find an acceptable compromise on the future of platforms’ and search engines’ liability regimes. A blocking minority formed by Germany, Poland, Belgium, Italy, Sweden, Finland, Slovenia, Hungary and the Netherlands did not approve the Presidency’s revised Council mandate.

This makes it less likely that the EU institutions will find a common position on the deeply flawed Article 13 of the proposal, which would, directly or indirectly, require online companies to implement highly error-prone upload filters to search user uploads for copyrighted material. The divisions in the Council are yet another sign of the high degree of polarisation and the increasing lack of support for the proposal, highlighted also by the fact that even the creative industries called for a halt to negotiations on Article 13 in a joint letter. More than 70 internet luminaries, the UN Special Rapporteur on Freedom of Expression, civil society organisations, programmers, and a plethora of academics have been highly critical of the proposal from the start.

The suspension of trilogue negotiations does not, however, mean that the fight against upload filters and for the freedom of expression has been decided: in fact, it is now more crucial than ever to get in touch with your local Members of the European Parliament (MEPs) and national ministries, and ask them to oppose Article 13.

EDRi continues to follow the negotiations closely and calls on all citizens and civil society to act and defend their digital rights through the #SaveYourInternet campaign.

Copyright: Compulsory filtering instead of obligatory filtering – a compromise? (04.09.2018)
https://edri.org/copyright-compulsory-filtering-instead-of-obligatory-filtering-a-compromise/

How the EU copyright proposal will hurt the web and Wikipedia (02.07.2018)
https://edri.org/how-the-eu-copyright-proposal-will-hurt-the-web-and-wikipedia/

EU Censorship Machine: Legislation as propaganda? (11.06.2018)
https://edri.org/eu-censorship-machine-legislation-as-propaganda/

21 Jan 2019

CULT: Fundamental rights missing in the Terrorist Content Regulation

By Diego Naranjo

The European Parliament (EP) Committee on Culture and Education (CULT) published on 16 January its Draft Opinion on the proposal for a Regulation preventing the dissemination of terrorist content online. Member of the European Parliament (MEP) Julie Ward, the Rapporteur for the Opinion, has joined the Rapporteur for the IMCO Committee, Julia Reda MEP, and civil rights groups in criticising many aspects of the Commission’s original proposal. The Rapporteur expresses her concerns regarding threats to “fundamental rights, such as freedom of expression and access to information, as well as media pluralism.”

In the Draft Opinion, CULT proposes a number of changes:

  • Definition of terrorist content: The Opinion suggests aligning the definition of terrorist content with the Terrorism Directive 2017/541/EU and carving out educational, journalistic or research material.
  • Definition of hosting service providers: The CULT Committee acknowledges that the definition of these services is “too broad and legally unclear”, and that many services which are not the target of this Regulation would be unnecessarily covered. The Rapporteur suggests covering only those hosting service providers that make the content available to the general public.
  • Removal orders: According to the Opinion, the only authorities competent to issue removal orders should be judicial authorities, since they are the ones with the “sufficient expertise”. Furthermore, the “one hour” time frame to respond to the removal orders is replaced by “without undue delay”. This would allow for more flexibility for smaller service providers.
  • Pro-active measures: The obligation of pro-activity (in practice, to implement upload filters in hosting services) is deleted from the proposal.
  • Finally, the Rapporteur suggests removing the financial penalties in order to avoid smaller providers being overburdened, as well as to prevent the likely scenario “where companies may overly block and remove content in order to protect themselves against possible financial penalties.”

This constitutes, on a general level, a very welcome improvement on the dangerous pitfalls of the Commission’s original proposal. Of particular relevance are the Rapporteur’s assessment that an imposition of proactive measures would amount to a breach of Article 15 of the e-Commerce Directive (which contains the prohibition of general monitoring obligations), as well as the proposed deletion of pro-active measures (upload filters). However, it is unclear how the Rapporteur’s addition in Art. 3 (2), saying that hosting service providers “shall not store terrorist content”, could be implemented without upload filters, even if, as a safeguard, the Rapporteur asks for those measures to be “appropriate”.

Another shortcoming of the Draft Opinion is the lack of concern about the highly unaccountable instrument of providing referral capacities to national authorities. For some reason, the Rapporteur has decided not to address this trojan horse, which would directly implement privatised law enforcement in the European Union. Referrals from national authorities, even though intended to be just for “voluntary consideration” by private companies, are likely to become the way in which intrusive governments outsource decisions on freedom of expression to unaccountable private companies, which are outside the scope of the Charter of Fundamental Rights.

Even though the Rapporteur has not addressed all of the key issues, there are many positive suggestions in the Draft Opinion. Some of them are in line with the IMCO Committee Draft Opinion, which provided an even more comprehensive proposal for improvement. Given the criticism from both Committees, three UN Special Rapporteurs and a large number of civil society groups, the lead committee, the Civil Liberties (LIBE) Committee, is expected to take all of this criticism on board and comprehensively amend the Regulation.

Draft Opinion of the Committee on Culture and Education on the proposal for a regulation on preventing the dissemination of terrorist content online (16.01.2019)
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&reference=PE-632.087&format=PDF&language=EN&secondRef=01

Terrorist Content Regulation: document pool
https://edri.org/terrorist-content-regulation-document-pool

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

Terrorist Content Regulation: Warnings from the UN and the CoE (19.12.2018)
https://edri.org/terrorist-content-regulation-warnings-from-the-un-and-the-coe/

The EU Council’s general approach on Terrorist Content Online proposal: A step towards pre-emptive censorship (11.12.2018)
https://edri.org/the-eu-councils-general-approach-on-terrorist-content-online-proposal-a-step-towards-pre-emptive-censorship/

Terrorist Content Regulation: Civil rights groups raise major concerns (05.12.2018)
https://edri.org/terrorist-content-regulation-civil-rights-groups-raise-major-concerns/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

(Contribution by Diego Naranjo, EDRi)

21 Jan 2019

Terrorist Content Regulation: Document Pool

By EDRi

Terrorist networks have increasingly turned to the internet to spread their propaganda and recruit followers in recent years. Although the general public’s fear of terrorist attacks certainly puts considerable pressure on policy makers, politicians also strategically use the climate of diffuse anxieties to increase the securitisation of the internet and to present themselves as capable, tough leaders. The latest example of such election-motivated policy making is the proposal for a Regulation on preventing the dissemination of terrorist content online, with which the European Commission continues its trend of producing a flood of “solutions” to terrorist propaganda on the internet.

The proposal contains three main measures to address alleged “terrorist” content:

  1. First, it creates orders issued by (undefined) national authorities to remove or disable access to illegal terrorist content within an hour.
  2. Second, competent authorities can choose to refer terrorist-related content that potentially breaches companies’ terms of service, subject to the voluntary consideration of the companies themselves.
  3. Third, it legislates on (undefined) proactive measures that can lead to an authority requesting a general monitoring obligation.

A major concern for the functioning and freedom of the internet is the extension to terrorist content of the upload filter regime the EU is currently about to introduce for copyright. Requiring internet companies to monitor everything we say on the web not only has grave implications for the freedom of speech, but also follows a dangerous path of outsourcing and privatising law enforcement.
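To make concrete why such “proactive measures” amount to general monitoring: a filter can only block matching uploads by inspecting every single upload. Below is a minimal sketch of hash-based matching against a blocklist; the blocklist and helper names are hypothetical, and real deployments rely on fuzzy perceptual hashing, which matches near-duplicates and is therefore far more error-prone.

```python
# Minimal, purely illustrative sketch of a hash-based upload filter. Real
# systems use perceptual hashes (e.g. for images/video), which also match
# near-duplicates and therefore mislabel lawful content such as news reports.
import hashlib

# Hypothetical blocklist of hashes flagged as "terrorist content".
BLOCKLIST = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # sha256(b"test")
}

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def handle_upload(data: bytes) -> str:
    """Every single upload must be scanned -- this is the 'general monitoring'
    that Article 15 of the e-Commerce Directive prohibits."""
    if sha256_hex(data) in BLOCKLIST:
        return "blocked"
    return "published"

print(handle_upload(b"test"))         # blocked
print(handle_upload(b"news report"))  # published
```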

EDRi will follow the developments of the Terrorist Content Regulation closely and critically in the coming months, and will provide crucial input to policy makers to ensure that human rights are fully respected in the proposal.

EDRi’s analysis and recommendations
Legislative documents
EDRi’s blogposts and press releases
Other
Key Policy Makers


EDRi’s analysis and recommendations:

Legislative documents:


EDRi’s blogposts and press releases:


Other:


Key Policy Makers:

Opinion Committees:

Key Dates*:

*(note that these dates are TBC and subject to change):

      • LIBE Committee (Lead Committee)
        • Deadline for Amendments: 15 February
        • Consideration of AMs: 7 March
        • Shadow meetings: 5-6 March or 11-12 March
        • Vote in LIBE Committee of the Report: 21 March
        • Vote in Plenary (1st reading): 25-28 March or 15-18 April
      • CULT Committee
        • Consideration of Draft Opinion: 4 February
        • Deadline for amendments: 6 February
        • Vote of the Opinion: 4 March
      • IMCO Committee
        • Consideration of Draft Report: 21 January
        • Deadline for amendments: 23 January (noon)
        • Consideration of amendments: 20 February
        • Vote of the Opinion: 4 March

18 Jan 2019

Terrorist Content: IMCO draft Opinion sets the stage right for EP

By Yannic Blaschke

On 16 January 2019, the European Parliament Committee on Internal Market and Consumer Protection (IMCO) published its draft Opinion on the Regulation to prevent the dissemination of terrorist content online. The Opinion challenges many of the issues in the original Commission proposal. The Opinion from IMCO should “inform” the main Report prepared by the Civil Liberties Committee (LIBE).

IMCO’s draft Opinion addresses many of the high risks of a detrimental impact on the freedom of expression. In a nutshell, it:

  • deletes referrals and “proactive” measures
  • points out the need to refer to “illegal” terrorist content
  • re-defines the services covered and excludes some
  • clarifies that the competent authorities deciding on the measures implemented by the Regulation need to be judicial authorities
  • implements new wording on transparency and more reporting obligations for Law Enforcement Agencies

The original Commission proposal had previously been criticised by three United Nations Special Rapporteurs and a great number of civil society and human rights organisations.

The draft Opinion states that “terrorist content” must relate to “offences committed intentionally” and be “illegal”. While this seems obvious at first, such wording is crucial for excluding works that merely document or report on terrorist crimes, such as publications by journalists or human rights defenders, from the scope of the Regulation. The Opinion further clarifies that only publicly available information should be covered by the legislation and that electronic communication services, blogs and data stored in cloud systems must be excluded.

With regard to the new competencies the legislation is supposed to give to national authorities, the draft Opinion makes clear that only judicial authorities should be able to issue removal orders. This is a significant improvement to ensure due process, and much preferable to the vague reference to “competent authorities” in the original Commission text.

The Rapporteur Julia Reda MEP has also taken a strong stance on the highly sensitive measures of referrals and proactive measures. Referrals are the practice of forwarding a piece of content (which may or may not be illegal) to a hosting service provider for its “voluntary consideration”; proactive measures are obligations for companies to have measures in place to find and disable access to “terrorist content”. Both of these instruments have deeply problematic implications: there is, for instance, a substantial lack of accountability for public authorities when content they have referred is unlawfully deleted; in addition, the possibility to impose proactive measures (upload filters) on companies would amount to a general monitoring obligation, which is prohibited in EU law. In the IMCO draft Opinion, both the referrals and the “proactive measures” are deleted from the text.

Finally, the draft Opinion highlights the need for extensive documentation: the IMCO Rapporteur proposes to collect information on the number of removals that led to the successful detection, investigation and prosecution of terrorist offences. Currently, the Commission states that it has no information about the number of investigations that were initiated after referrals made by Europol under its mandate. It is therefore reasonable that, when introducing similar capacities for national law enforcement, the effectiveness and proportionality of measures against “terrorist content” in supporting the investigation of terrorist acts need to be critically evaluated.

The IMCO Opinion, as proposed by the Rapporteur, brings many positive changes that should be taken into consideration by the LIBE Committee in its Report, for which Daniel Dalton MEP (ECR) is the Rapporteur. The Parliament is well advised to take the proposals in this draft Opinion into consideration, because of the improvements on aspects such as rule of law principles, predictability, legality and fundamental rights safeguards.

Draft Opinion of the Committee on the Internal Market and Consumer Protection on the proposal for a regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online (COM(2018)0640 – C8-0405/2018 – 2018/0331(COD)) (13.12.2018)
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&reference=PE-632.028&format=PDF&language=EN&secondRef=01

Terrorist Content Regulation: Warnings from the UN and the CoE (19.12.2018)
https://edri.org/terrorist-content-regulation-warnings-from-the-un-and-the-coe/

The EU Council’s general approach on Terrorist Content Online proposal: A step towards pre-emptive censorship (11.12.2018)
https://edri.org/the-eu-councils-general-approach-on-terrorist-content-online-proposal-a-step-towards-pre-emptive-censorship/

Terrorist Content Regulation: Civil rights groups raise major concerns (05.12.2018)
https://edri.org/terrorist-content-regulation-civil-rights-groups-raise-major-concerns/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

EU Parliament’s anti-terrorism draft Report raises major concerns (10.10.2018)
https://edri.org/eus-flawed-arguments-on-terrorist-content-give-big-tech-more-power/

16 Jan 2019

Digital rights as a security objective: Abuses and loss of trust

By Yannic Blaschke

Violations of human rights online can pose a real threat to our societies, from election security to societal polarisation. In this series of blogposts, we explain how and why digital rights must be treated as a security objective. In this third and final blogpost, we discuss how digital rights violations can exacerbate breaches of the rule of law in EU Member States and risk undermining the already fragile project of the EU, including its security aspects.


In our previous blogpost, we outlined how an unjustified reliance on algorithms can lead to unintentional censorship and new attack vectors for malicious actors. However, the upload filters that feature in the ongoing discussions on the Copyright Directive and the Terrorist Content Regulation also have a big potential for abuse by public authorities.

There are a number of examples of state authorities misusing copyright to attack the freedom of expression and the right to information: chilling examples include Ecuador, where critics of president Correa were flooded with copyright notices, and the recent attempt of the German government to curb quotes from an internal military report by claiming it as copyrighted material. In the context of counter-terrorism legislation, the situation looks even more severe, with the Commissioner for Human Rights of the Council of Europe recently decrying that “the misuse of anti-terrorism legislation has become one of the most widespread threats to freedom of expression, including media freedom, in Europe”. Laws that have given rise to this alarming assessment include the Spanish “gag law”, which has been severely criticised by international human rights organisations, and the French counter-terrorism laws. The dangerous logic of prosecuting offences vaguely framed as “glorification of terrorism” has led to numerous arbitrary convictions of citizens for controversial, yet by no means terrorist, opinions and ideas.

What will happen once such disputes about legitimate forms of expression no longer go before courts, because filter technologies prevent the content from ever appearing in public debate in the first place? What if EU governments start to abuse the competences given to them to censor political journalists, human rights defenders, opponents or ideas they do not like, for instance by calling opposition parties or activists terrorists? What would such abuse mean for the Member States’ eroding trust in each other’s capacity to uphold the rule of law?

In the context of terrorism, the European Commission has proposed that law enforcement should have the competence to demand that platforms introduce automated filtering technologies if they regard the companies’ own “proactive” measures of content moderation as not extensive enough. Furthermore, there shall be a possibility for the authorities to refer specific pieces of content to internet companies for their “voluntary consideration”, with a high chance of such content being taken down due to the hosting providers’ fear of being held liable for content stored on their servers. Such vague and imprecise measures not only undermine the rule of law and freedom of expression – they are also bound to be misused.

If national authorities with an established record of public interference with citizens’ digital rights start using the additional suppressive instruments provided by the EU in the same disproportionate way they do with their national measures, it will not take long until the courts in other Member States begin to question the extent to which the authorities in their jurisdiction can still cooperate with their abusive counterparts. This has already happened in other contexts: for instance, the CJEU’s decision that extraditions to Poland may be halted. In combination with very different interpretations of what constitutes an offence against the public (for instance, the case of Spanish rapper Valtonyc), cases in which the newly created tools are deployed to censor voices that are seen as illegal in one Member State but perfectly legal in others can and will further divide the cohesion and integrity of the common area of freedom, security and justice. EU-wide public security can only be reached through trust among European judiciaries and law enforcement that, throughout cooperative cross-border actions, fundamental rights are respected in all Member States. Giving all EU Member State authorities new censorship powers will achieve nothing but the contrary of such trust – and will thus damage, not improve, our security.

Despite some major steps ahead in digital freedoms, such as the adoption of the General Data Protection Regulation (GDPR), we are still far from recognising that digital rights are not just fundamental civil liberties, but also a prerequisite for the security and pluralism of our societies. If we want disinformation to stop ravaging public debate, we should not allow individuals to be forced to consent to tracking cookies just because this gives publishers and the tracking industry some income. If we want to close the vulnerabilities of our public debate forums on the internet, we cannot impose new gateways for disinformation attacks on online platforms. If we want to prevent the new authoritarianism, we cannot give it more tools of censorship through copyright and the silent erosion of civil liberties in counter-terrorism pursuits.

EU citizens’ digital rights are first and foremost, but not only, to the benefit of individuals: they must also be regarded as fundamental to the security of our democratic systems and societal cohesion, both within and across European Union countries. To keep our societies open, free and safe, we must place the rights of the individual at the heart of our internet policies.

Digital rights as a security objective: New gateways for attacks (19.12.2018)
https://edri.org/digital-rights-as-a-security-objective-new-gateways-for-attacks/

Digital rights as a security objective: Fighting disinformation (05.12.2018)
https://edri.org/digital-rights-as-a-security-objective-fighting-disinformation/

(Contribution by Yannic Blaschke, EDRi intern)

16 Jan 2019

Advocate General issues two Opinions on “right to be forgotten”

By Yannic Blaschke

On 10 January 2019, the Advocate General (AG) Maciej Szpunar delivered two Opinions to the Court of Justice of the European Union (CJEU) that could have far-reaching implications for the “right to be forgotten”, which aims at enabling individuals to lead an autonomous life without stigmatisation from their past actions.


A geographical limit to the “right to be forgotten”

In his first Opinion, in the case Google v CNIL (C-507/17), AG Szpunar recommends that the CJEU limit the scope of application of search-engine de-referencing obligations to the territory of the EU. The case at hand was referred to the CJEU after a dispute between the search engine operator Google and the French Data Protection Authority CNIL. The CNIL had imposed a 100 000 euro fine on Google after the company refused to remove web pages relating to a natural person from all domains listed in its search engine (rather than just EU Member State domains).

In his Opinion, AG Szpunar held that the “right to be forgotten” must be balanced against other fundamental rights, such as the right to data protection and the right to privacy, as well as the legitimate public interest in accessing the information sought. The AG noted that, if worldwide de-referencing were permitted, the EU authorities would not be able to define and determine a right to receive information, especially since the public interest in accessing information necessarily varies from one third State to another, depending on geographic location. There would thus be a risk that persons in third States would be prevented from accessing information and, in turn, that third States would prevent persons in the EU Member States from accessing information. The AG did not, however, rule out the possibility in principle of cases in which worldwide de-referencing would be justified. He recommended that the CJEU rule that, upon receiving a request for de-referencing, search engine providers should not be obliged to implement such measures on all their listed domains. Nevertheless, they should be obliged to implement all possible measures, including geo-blocking, to enforce effective de-referencing for all IP addresses located in the EU, regardless of the domain used.
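In practice, de-referencing “for all IP addresses located in the EU, regardless of the domain used” implies gating search results on the requester’s IP geolocation rather than on the domain suffix. Below is a minimal sketch of that logic, under the assumption of a simplistic IP-range lookup; the tiny network table, the de-referencing list and the function names are all hypothetical stand-ins for a real IP-geolocation database.

```python
# Illustrative sketch of geo-based de-referencing; the lookup table and
# helper names are hypothetical stand-ins for a maintained IP-geolocation DB.
from ipaddress import ip_address, ip_network

# Hypothetical: IP ranges assumed to be assigned to EU Member States.
EU_NETWORKS = [ip_network("198.51.100.0/24"), ip_network("203.0.113.0/24")]

# Hypothetical: URLs de-referenced for a given query after a granted request.
DEREFERENCED = {"old court report": {"https://example.com/old-report"}}

def is_eu_ip(client_ip: str) -> bool:
    return any(ip_address(client_ip) in net for net in EU_NETWORKS)

def search(query: str, results: list[str], client_ip: str) -> list[str]:
    """Filter results by requester location, not by domain (.fr, .com, ...)."""
    if is_eu_ip(client_ip):
        removed = DEREFERENCED.get(query, set())
        return [url for url in results if url not in removed]
    return results  # outside the EU, results are left untouched

hits = ["https://example.com/old-report", "https://example.com/other"]
print(search("old court report", hits, "198.51.100.7"))  # EU client: link removed
print(search("old court report", hits, "192.0.2.55"))    # non-EU client: full results
```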

Search engine operator’s processing of sensitive data

The second Opinion of the AG, in the case G.C. and Others v CNIL (C-136/17), concerned the de-referencing obligations of search engine providers in regard to sensitive categories of data. Following a dispute between the French Data Protection Authority CNIL and the search engine operator Google, Szpunar argued that the prohibitions and restrictions regarding special categories of data (under the previous Data Protection Directive 95/46/EC) cannot apply to the operator of a search engine as if it had itself placed sensitive data on the web pages concerned. Since the activity of a search engine logically takes place only after (sensitive) data have been placed online, those prohibitions and restrictions can, in his opinion, apply to a search engine only by reason of that referencing and, thus, through subsequent verification, when a request for de-referencing is made by the person concerned. Szpunar held, however, that where referencing of sources that store sensitive data occurs, search engine providers have an obligation to react to de-referencing requests after carefully balancing the right to respect for private life and the right to protection of data with the right of the public to access the information concerned and the right to freedom of expression of the person who provided the information.

Opinions of the Advocates General are not legally binding, but often considerably influence the final verdict of the CJEU. The judgements in both preliminary rulings will be given at a later stage.

Advocate General Szpunar proposes that the Court should limit the scope of the dereferencing that search engine operators are required to carry out to the EU (10.01.2019)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-01/cp190002en.pdf

Advocate General Szpunar proposes that the Court should hold that the operator of a search engine must, as a matter of course, accede to a request for the dereferencing of sensitive data (10.01.2019)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-01/cp190001en.pdf

Google’s forgetful approach to the “right to be forgotten” (14.12.2016)
https://edri.org/googles-forgetful-approach-right-forgotten/

More “right to be forgotten” confusion (15.09.2015)
https://edri.org/more-right-to-be-forgotten-confusion/

Google now supports AND opposes the “right to be forgotten” (27.08.2014)
https://edri.org/google-now-supports-and-opposes-right-forgotten/

Google and the right to be forgotten – the truth is out there (02.07.2014)
https://edri.org/google-right-forgotten-truth/

Google’s right to be forgotten – industrial scale misinformation? (09.06.2014)
https://edri.org/forgotten/

(Contribution by Yannic Blaschke, EDRi intern)
