05 Feb 2019

Open letter on the Terrorism Database

By Diego Naranjo

On 4 February 2019, EDRi joined dozens of organisations and academics in signing an open letter. The letter criticises the blind faith placed, in the Terrorist Content Regulation debate, in a database used to flag “terrorist content”.

Among our concerns are:

  • The lack of transparency about how the database works, and about its effectiveness, proportionality and appropriateness for achieving the goals of the Terrorist Content Regulation.
  • Filters are unable to understand context and are therefore error-prone.
  • Even if filters become accurate in the future, pervasive online monitoring would still fall disproportionately on disadvantaged and marginalised individuals.

Read the letter below or download it here (pdf):


Civil Society Letter on the Terrorist Content Database
Open letter

4 February 2019
via email

Dear Members of the European Parliament,

The undersigned organizations write to share our concerns about the EU’s proposed Regulation on Preventing the Dissemination of Terrorist Content Online, and in particular the Regulation’s call for Internet hosts to use “proactive measures” to detect terrorist content. We are concerned that if this Regulation is adopted, it will almost certainly lead platforms to adopt poorly understood tools, such as the Hash Database referenced in the Explanatory Memorandum to the Regulation and currently overseen by the Global Internet Forum to Counter Terrorism. Countering terrorist violence is a shared priority, and our point is not to question the good intentions of the Database operators. But lawmakers and the public have no meaningful information about how well the Database or any other existing filtering tool serves this goal, and at what cost to democratic values and individual human rights. We urge you to reject proactive filtering obligations; provide sound, peer-reviewed research data supporting policy recommendations and legal mandates around counter-terrorism; and refrain from enacting laws that will drive Internet platforms to adopt untested and poorly understood technologies to restrict online expression.

The Database was initially developed by Facebook, YouTube, Microsoft, and Twitter as a voluntary measure, and announced to the public in 2016. It contains digital hash “fingerprints” of images and videos that platforms have identified as “extreme” terrorist material, based not on the law but on their own Community Guidelines or Terms of Service. The platforms can use automated filtering tools to identify and remove duplicates of the hashed images or videos. As of 2018, the Database was said to contain hashes representing over 80,000 images or videos. At least thirteen companies now use the Database, and some seventy companies have reportedly discussed adopting it.
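The hash-and-match workflow described above can be sketched roughly as follows. This is a simplified illustration only, using an exact cryptographic hash for clarity; the real Database reportedly relies on perceptual fingerprints (PhotoDNA-style) that tolerate re-encoding, which an exact hash does not:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex 'fingerprint' of an uploaded file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

class HashDatabase:
    """Toy model of a shared hash database: platforms exchange
    fingerprints of flagged content, never the content itself."""

    def __init__(self):
        self._hashes = set()

    def add(self, data: bytes) -> None:
        # One platform flags content and contributes its hash.
        self._hashes.add(fingerprint(data))

    def is_flagged(self, data: bytes) -> bool:
        # Other platforms check new uploads against the shared hashes.
        return fingerprint(data) in self._hashes

db = HashDatabase()
db.add(b"flagged-video-bytes")
print(db.is_flagged(b"flagged-video-bytes"))    # True: exact duplicate matched
print(db.is_flagged(b"re-encoded video bytes")) # False: any change evades an exact hash
```

The second lookup illustrates why real deployments use fuzzier perceptual matches instead, and why the error rates of those matches, which the letter says are unpublished, matter so much.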

Almost nothing is publicly known about the specific content that platforms block using the Database, or about companies’ internal processes or error rates, and there is insufficient clarity around the participating companies’ definitions of “terrorist content.” Furthermore, there are no reports about how many legal processes or investigations were opened after the content was blocked. This data would be crucial to understand to what extent the measures are effective and necessary in a democratic society, which are some of the sine qua non requisites for restrictions of fundamental rights. We do know, however, of conspicuous problems that seemingly result from content filtering gone awry. The Syrian Archive, a civil society organization preserving evidence of human rights abuses in Syria, for example, reports that YouTube deleted over 100,000 of its videos. Videos and other content which may be used in one context to advocate terrorist violence may be essential elsewhere for news reporting, combating terrorist recruitment online, or scholarship. Technical filters are blind to these contextual differences. As three United Nations special rapporteurs noted in a December 2018 letter, this problem raises serious concerns about free expression rights under the proposed Regulation. It is far from clear whether major platforms like YouTube or Facebook adequately correct for this through employees’ review of filtering decisions—and it seems highly unlikely that smaller platforms could even attempt to do so, if required to use the Database or other filtering tools.

Failures of this sort seriously threaten Internet users’ rights to seek and impart information. The pervasive monitoring that platforms carry out in order to filter users’ communications also threatens privacy and data protection rights. Moreover, these harms do not appear to be equally distributed, but instead disproportionately disadvantage individual Internet users based on their ethnic background, religion, language, or location—in other words, harms fall on users who might already be marginalized. More extensive use of the Database and other automated filtering tools will amplify the risk of harms to users whose messages and communications about matters of urgent public concern may be wrongly removed by platforms. The United Nations Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism has expressed concern about this lack of clarity, and said that Facebook’s rules for classifying organizations as terrorist are “at odds with international humanitarian law”.

Due to the opacity of the Database’s operations, it is impossible to assess the consequences of its nearly two years of operation. The European public is being asked to rely on claims by platforms or vendors about the efficacy of the Database and similar tools—or else to assume that any current problems will be solved by hypothetical future technologies or untested, post-removal appeal mechanisms. Such optimistic assumptions cannot be justified given the serious problems researchers have found with the few filtering tools available for independent review. Requiring all platforms to use black-box tools like the Database would be a gamble with European Internet users’ rights to privacy and data protection, freedom of expression and information, and non-discrimination and equality before the law. That gamble is neither necessary nor proportionate as an exercise of state power.

EU institutions’ embrace of the database and other filtering tools will also have serious consequences for Internet users all over the world, including in countries where various of the undersigned organizations work to protect human rights. For one thing, when platforms filter a video or image in response to a European authority’s request, it will likely disappear for users everywhere—even if it is part of critical news reporting or political discourse in other parts of the world. For another, encoding proactive measures to filter and remove content in an EU regulation gives authoritarian and authoritarian-leaning regimes the cover they need to justify their own vaguely worded and arbitrarily applied anti-terrorism legislation. Platforms that have already developed content filtering capabilities in order to comply with EU laws will find it difficult to resist demands to use them in other regions and under other laws, to the detriment of vulnerable Internet users around the globe. Your decisions in this area will have global consequences.

Signatories:
Access Now
Africa Freedom of Information Centre
Agustina Del Campo, in an individual capacity (Center for Studies on Freedom of Expression CELE)
American Civil Liberties Union (ACLU)
ApTI Romania
Article 19
Bits of Freedom
Brennan Center for Justice
Catalina Botero Marino, in an individual capacity (Former Special Rapporteur of Freedom of Expression of the Organization of American States)
Center for Democracy & Technology (CDT)
Centre for Internet and Society
Chinmayi Arun, in an individual capacity
Damian Loreti, in an individual capacity
Daphne Keller, in an individual capacity (Stanford CIS)
Derechos Digitales · América Latina
Digital Rights Watch
Electronic Frontier Finland
Electronic Frontier Foundation (EFF)
Electronic Frontier Norway
Elena Sherstoboeva, in an individual capacity (Higher School of Economics)
European Digital Rights (EDRi)
Hermes Center
Hiperderecho
Homo Digitalis
IT-Pol
Joan Barata, in an individual capacity (Stanford CIS)
Krisztina Rozgonyi, in an individual capacity (University of Vienna)
Open Rights Group
Open Technology Institute at New America
Ossigeno
Pacific Islands News Association (PINA)
People Over Politics
Prostasia Foundation
R3D: Red en Defensa de los Derechos Digitales
Sarah T. Roberts, Ph.D., in an individual capacity
Southeast Asian Press Alliance
Social Media Exchange (SMEX, Lebanon)
WITNESS
Xnet


Open letter on the Terrorist Content Database (05.02.2019)
https://edri.org/files/counterterrorism/20190205-Civil-Society-Letter-to-EP-Terrorism-Database.pdf

Terrorist Content Regulation: Document Pool
https://edri.org/terrorist-content-regulation-document-pool/

29 Jan 2019

Copyright: Open Letter calling for the deletion of Articles 11 and 13

By EDRi

On 29 January 2019, EDRi, along with a large stakeholder coalition of 87 organisations, sent a letter to the Council’s Working Party on Intellectual Property, European Commission Vice-President Andrus Ansip and the European Parliament trilogue negotiators, asking for the deletion of the controversial Articles 11 and 13 of the Copyright Directive proposal. The letter comes at a crucial moment, since the negotiations have stalled after a revised mandate for the Council failed to be adopted on 18 January.

Signatories express the view that a compromise on Article 13 seems more difficult to achieve now that, after criticism from 70 Internet luminaries, the UN Special Rapporteur on Freedom of Expression, civil society organisations, programmers and academics, even large parts of the creative industries are calling for a halt to negotiations on Article 13. Similar criticism has been raised about the ancillary copyright proposal in Article 11, which has led to Google threatening to leave the EU market.

Despite two years of negotiations, European policy makers have not managed to find the right balance in the text. The letter therefore calls for the complete deletion of both Articles 11 and 13 from the proposal, in order to allow for a swift continuation of the discussions.

Read the letter below or in pdf format here.


Open Letter calling for the deletion of Articles 11 and 13 in the copyright Directive proposal

Your Excellency Deputy Ambassador,
Dear European Commission Vice-President Andrus Ansip,
Dear MEPs Voss, Adinolfi, Boutonnet, Cavada, Dzhambazki, Geringer de Oedenberg, Joulaud, Maštálka, Reda, Stihler,

We are writing you on behalf of business organisations, civil society organisations, creators, academics, universities, public libraries, research organisations and libraries, startups, software developers, EU online platforms, and Internet Service Providers.

Taking note of the failure of the Council to find a majority for a revised negotiation mandate on Friday 18 January, we want to reiterate our position that the manifest flaws in Articles 11 and 13 of the proposal for a Copyright Directive in the Digital Single Market constitute insurmountable stumbling blocks to finding a balanced compromise on the future of Copyright in the European Union. Despite more than two years of negotiations, it has not been possible for EU policy makers to take the serious concerns of industry, civil society, academics, and international observers such as the UN special rapporteur on freedom of expression into account, as the premises both Articles are built on are fundamentally wrong.

In light of the deadlock of the negotiations on Articles 11 and 13, as well as taking into consideration the cautious stance of large parts of the creative industries, we ask you to delete Articles 11 and 13 from the proposal. This would allow for a swift continuation of the negotiations, while the issues that were originally intended to be addressed by Articles 11 and 13 could be tackled in more appropriate legal frameworks than this Copyright Directive.

We hope that you will take our suggestion on board when finalising the negotiations and put forward a balanced copyright review that benefits from wide stakeholder support in the European Union.

Yours sincerely,

Undersigned organisations:

Europe
1. European Digital Rights (EDRi)

2. Allied for Startups

3. Civil Liberties Union for Europe (Liberties)

4. Copyright for Creativity (C4C)

5. Create.Refresh

6. European Bureau of Library, Information and Documentation Associations (EBLIDA)

7. European Internet Services Providers Association (EuroISPA)

8. European Network for Copyright in Support of Education and Science (ENCES)

9. European University Association (EUA)

10. Ligue des Bibliothèques Européennes de Recherche – Association of European Research Libraries (LIBER)

11. Open State Foundation

12. Scholarly Publishing and Academic Resources Coalition Europe (SPARC Europe)

Austria
13. epicenter.works – for digital rights

14. Digital Society

15. Initiative für Netzfreiheit (IfNf)

16. Internet Service Providers Austria (ISPA Austria)

Belgium
17. FusionDirectory

18. Opensides

19. SA&S – Samenwerkingsverband Auteursrecht & Samenleving (Partnership Copyright & Society)

Bulgaria
20. BlueLink Foundation

Czech Republic
21. Iuridicum Remedium (IuRe)

22. Seznam.cz

Denmark
23. IT-Political Association of Denmark

Estonia
24. Wikimedia Eesti

Finland
25. Electronic Frontier Finland (EFFI)

26. Finnish Federation for Communications and Teleinformatics (FiCom)

France
27. April

28. Conseil National du Logiciel Libre (CNLL)

29. NeoDiffusion

30. Renaissance Numérique

31. Uni-Deal

32. Wikimédia France

Germany
33. Bundesverband Deutsche Startups

34. Chaos Computer Club

35. Deutscher Bibliotheksverband e.V. (dbv)

36. Digitalcourage e.V.

37. Digitale Gesellschaft e.V.

38. eco – Association of the Internet Industry

39. Factory Berlin

40. Förderverein Informationstechnik und Gesellschaft (FITUG e.V.)

41. Initiative gegen ein Leistungsschutzrecht (IGEL)

42. Silicon Allee

43. Wikimedia Deutschland

Greece
44. Open Technologies Alliance – GFOSS (Greek Free Open Source Software Society)

45. Homo Digitalis

Italy
46. Hermes Center for Transparency and Digital Human Rights

47. Roma Startup

48. Associazione per la Libertà nella Comunicazione Elettronica Interattiva (ALCEI)

Luxembourg
49. Frënn vun der Ënn

Netherlands
50. Bits of Freedom (BoF)

51. Dutch Association of Public Libraries (VOB)

52. Vrijschrift

Poland
53. Centrum Cyfrowe Foundation

54. ePaństwo Foundation

55. Startup Poland

56. ZIPSEE Digital Poland

Portugal
57. Associação D3 – Defesa dos Direitos Digitais (D³)

58. Associação Nacional para o Software Livre (ANSOL)

Romania
59. APADOR-CH (Romanian Helsinki Committee)

60. Association for Technology and Internet (ApTI)

Slovakia
61. Sapie.sk

Slovenia
62. Digitas Institute

63. Forum za digitalno družbo (Digital Society Forum)

Spain
64. Asociación de Internautas

65. Grupo 17 de Marzo

66. MaadiX

67. Platform in Defence of Freedom of Information (PDL) (added on 31 January 2019)

68. Rights International Spain

69. Xnet

Sweden
70. Dataskydd.net

71. Föreningen för Digitala Fri- och Rättigheter (DFRI)

United Kingdom
72. Coalition for a Digital Economy (COADEC)

73. Open Rights Group (ORG)

International
74. Alternatif Bilişim Derneği (Alternatif Bilişim) (Turkey)

75. ARTICLE 19

76. Association for Progressive Communications (APC)

77. Center for Democracy & Technology (CDT)

78. COMMUNIA Association

79. Derechos Digitales (Latin America)

80. Electronic Frontier Foundation (EFF)

81. Electronic Information for Libraries (EIFL)

82. Index on Censorship

83. International Federation of Library Associations and Institutions (IFLA)

84. Israel Growth Forum (Israel)

85. My Private Network

86. Open Knowledge International

87. OpenMedia

88. SHARE Foundation (Serbia)

89. SumOfUs

90. World Wide Web Foundation

 

EDRi continues to follow the negotiations closely and calls all citizens and civil society to act and defend their digital rights through the #SaveYourInternet campaign.

Copyright: Compulsory filtering instead of obligatory filtering – a compromise? (04.09.2018)
https://edri.org/copyright-compulsory-filtering-instead-of-obligatory-filtering-a-compromise/

How the EU copyright proposal will hurt the web and Wikipedia (02.07.2018)
https://edri.org/how-the-eu-copyright-proposal-will-hurt-the-web-and-wikipedia/

EU Censorship Machine: Legislation as propaganda? (11.06.2018)
https://edri.org/eu-censorship-machine-legislation-as-propaganda/

28 Jan 2019

noyb files eight strategic complaints on “right to access”

By noyb

A test by EDRi member noyb, a European non-profit organisation for privacy enforcement, reveals structural violations of the GDPR by most streaming services. In more than ten test cases, noyb was able to identify violations of Article 15 of the General Data Protection Regulation (GDPR) in many shapes and forms by companies such as Amazon, Apple, DAZN, Spotify and Netflix. On 18 January 2019, noyb filed a wave of ten strategic complaints against eight companies.


Under the new GDPR, users enjoy a “right to access”. Users are granted a right to get a copy of all raw data that a company holds about them, as well as additional information about the sources and recipients of the data, the purpose for which the data is processed or information about the countries in which the data is stored and how long it’s stored. This “right to access” is enshrined in Article 15 of the GDPR and Article 8(2) of the Charter of Fundamental Rights of the European Union.

noyb put eight online streaming services from eight countries to the test – but no service fully complied. In eight out of eight cases, noyb filed formal complaints with the relevant data protection authorities.

“All major providers even engaged in a ‘structural violation’ of the law,” said Max Schrems, Director of noyb.

While many smaller companies manually respond to GDPR requests, larger services like YouTube, Apple, Spotify or Amazon have built automated systems that claim to provide the relevant information. When tested, none of these systems provided the user with all relevant data.

“Many services set up automated systems to respond to access requests, but they often don’t even remotely provide the data to which every user has a right. In most cases, users only got the raw data, but, for example, no information about who this data was shared with. This leads to structural violations of users’ rights, as these systems are built to withhold the relevant information,” said Schrems.

While all the other streaming services at least provided some response to users’ requests to access their data, the United Kingdom sports streaming service DAZN and the German music streaming service SoundCloud simply ignored the requests. The responses that were received, however, lacked background information, such as the sources and recipients of the data, or how long the data is actually stored (the “retention period”). In many cases, the raw data was provided in cryptic formats that made it extremely hard or even impossible for an average user to understand the information. In many cases, certain types of raw data were also missing.

noyb has filed complaints with the Austrian Data Protection Authority (dsb.gv.at) against eight companies, on behalf of ten users. The Austrian authority will have to cooperate with the relevant authorities at the main establishment of each streaming service. As the GDPR foresees penalties of up to 20 million euro or 4% of worldwide turnover, whichever is higher, the theoretical maximum penalty across the ten complaints could be 18.8 billion euro.
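The fine arithmetic behind that figure follows Article 83(5) GDPR: the maximum is the greater of 20 million euro or 4% of total worldwide annual turnover. A minimal sketch of the formula (the turnover figures below are hypothetical placeholders, not the companies’ real accounts):

```python
# GDPR Article 83(5): the maximum administrative fine is the GREATER of
# EUR 20 million or 4% of total worldwide annual turnover.
MAX_FLAT = 20_000_000  # EUR

def max_fine(worldwide_turnover_eur: float) -> float:
    """Theoretical maximum GDPR fine for one infringement."""
    return max(MAX_FLAT, 0.04 * worldwide_turnover_eur)

# Hypothetical examples, not real company figures:
small = max_fine(100_000_000)      # 4% would be 4M, so the flat 20M cap applies
large = max_fine(10_000_000_000)   # 4% of 10bn = 400M, which exceeds the flat cap
print(small, large)
```

Summing each respondent’s per-company maximum across the ten complaints is presumably how the 18.8 billion euro total was reached.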

The right to access is a cornerstone of the data protection framework. Only when users can get an idea of how and why their data is stored or shared can they realistically uncover violations of the GDPR and take action. Every user has the right to get a copy of his or her data and to receive additional information. Usually, users can fill out a form or send an email to most services. noyb has collected the links and forms for major streaming services on its webpage for everyone to use.

Article 80 of the GDPR foresees that data subjects can be represented by a non-profit association, as individual users are usually unable to file the relevant legal complaints. In this case all ten users are represented by the non-profit organisation noyb.

“noyb is meant to reasonably enforce the new rules, so that the benefits actually reach the users,” Schrems said.

noyb.eu is funded by over 3100 individual supporting members and sponsors. In order to finance the fight against data breaches in the long term, the association is looking for more supporting members. “In 1995 the EU already passed data protection laws, but they were simply ignored by the big players. We now have to make sure this does not happen again with GDPR – so far many companies only seem to be superficially compliant,” said Schrems.

noyb
https://noyb.eu/

Press release: Structural Violation of “Right to Access” and GDPR Complaints against Netflix, Amazon, Spotify, YouTube and Apple filed (18.01.2019)
https://noyb.eu/wp-content/uploads/2019/01/PA_st_EN.pdf

Netflix, Spotify & YouTube: Eight Strategic Complaints filed on “Right to Access” (18.01.2019)
https://noyb.eu/access_streaming/

(Contribution by EDRi member noyb, Austria)

28 Jan 2019

Period tracker apps – where does your data end up?

By Bits of Freedom

More and more women use a period tracker: an app that keeps track of your menstrual cycle. However, these apps do not always treat the intimate data that you share with them carefully.


An app that notifies you when to expect your period or when you are fertile can be useful, for example to predict when you can expect the side effects that, for many women, come with their period. In itself, keeping track of your cycle is nothing new: putting marks in your diary or on your calendar has always been an easy way to take your cycle into account. But sharing data on the workings of your body with an app is riskier.

There seems to be quite a large market for period tracker apps. From “Ladytimer Maandstonden Cyclus Kalender” to “Magic Teen Girl Period Tracker”, from “Vrouwenkalender” to “Flo” – all neatly lined up in different shades of pink in the app store. “Femtech” is seen as a growing market, with different startups having raised billions in investments over the last couple of years. Are these apps made to provide women with more insight into the workings of their bodies, or to monetise that need?

It’s interesting to look at the kind of data these apps collect. The app usually opens with a calendar overview. In the overview you can input the date of your last period. In addition, you can keep a daily record of how you feel (happy, unhappy, annoyed) and whether you experience blood loss. But for most of these apps it doesn’t end there. Have you had sex? And if so, with or without protection? With yourself or with another person? How would you grade the orgasm? Did you have a stomach ache? Were your bowel movements normal? Did you feel like having sex? Sensitive breasts? An acne problem? Did you drink alcohol? Exercise? Did you eat healthy?

For a number of these questions it is understandable why answering them might be useful, if the app wants to learn to predict what stage of your cycle you are in. But a lot of these questions are quite intimate. And all this sensitive data often seems to end up in the possession of the company behind the app. The logical question then is: what exactly does a company do with all this data you hand over? Do you have any say in that? Do they treat it carefully? Is the data shared with other parties?

After digging through a number of privacy statements, it appears that one of the most-used apps in the Netherlands, “Menstruatie Kalender”, gives Facebook permission to show in-app advertisements. It is not clear what information Facebook gathers about you from the app in order to show you advertisements. For example, does Facebook get information on when you are having your period?

Another frequently used app in the Netherlands is “Clue”. It’s the only one we found that has a comprehensive and easily readable privacy statement. You can use the app without creating an account, in which case data is stored solely locally on your phone. If you do choose to create an account, you give explicit consent to share your data with the company. In that case it is stored on secure servers. With your consent, it will also be used for academic research into women’s health.

This cannot be said of many other apps. Their privacy statements are often long and difficult to read, and require good reading-between-the-lines skills to understand that data is being shared with “partners”. The sensitivity of your breasts may in itself not be very interesting to an advertiser, but by keeping track of your cycle, the apps automatically acquire information on the possible start of one of the most interesting periods of your life for marketers: motherhood.

The most extreme example is Glow, the company behind the period tracker app “Eve”. Their app is focused on the potential desire to have children. The company’s tagline is as straightforward as they come: “Women are 40% more likely to conceive when using Glow as a fertility tracker”. Besides Eve, Glow has three other apps: an ovulation and fertility tracker, a baby tracker and a pregnancy tracker. The apps link to the Glow-community, a network of forums where hundreds of women share their experiences and give each other tips.

But that’s not the only thing that Glow offers. You can’t use a Glow webpage or app without being shown the “Fertility Program”. For 1200-7000 euro, you can enroll in different fertility programs. Too expensive? You are able to take out a cheap loan through a partnership with a bank. And in the end, freezing your eggs, if you are in your early thirties, is the most economically viable option, according to the website.

It turns out that Glow is a company selling fertility products. It has built a number of apps to subtly (and sometimes not so subtly) attract more female customers. As a consumer you think you are using an app to keep track of your cycle, but in the meantime you are constantly notified of all the possibilities of freezing your eggs, the costs of pregnancy at a higher age, and your limited fertile years. Before you know it, you are lying awake at age 30, wondering whether it would be more “economical” to freeze your eggs.

These apps shed light on what seems to be a contract to which we are forced to consent more and more often. In exchange for the use of an app that makes our lives a little bit easier, we have to give away a lot of personal information, without knowing exactly what happens with it. The fact that these apps deal with intimate information doesn’t mean that the creators treat it more carefully. To the contrary: it increases the market value of that data.

So before you download one of these apps, or advise your daughter to download one, think again. Take your time to read an app’s privacy statement, to know exactly what the company does with your data. But there is also a responsibility for the regulatory body, such as the Autoriteit Persoonsgegevens in the Netherlands, to ensure companies don’t abuse your intimate data.

Are you using one of these apps and do you want to know which data the company has gathered on you, or do you want to have that data erased? You can easily draw up a request which you can send by mail or email using My Data Done Right.

Bits of Freedom
https://www.bitsoffreedom.nl/

Who profits from period trackers? (25.01.2019)
https://www.bitsoffreedom.nl/2019/01/25/who-profits-from-the-period-tracker/

Who benefits from cycle trackers? (only in Dutch, 03.12.2018)
https://www.bitsoffreedom.nl/2018/12/03/wie-profiteert-van-de-cyclustracker/

(Contribution by EDRi member Bits of Freedom; translated from Dutch by volunteer Axel Leering)

28 Jan 2019

Austrian postal service involved in a data scandal

By Epicenter.works

After a report by the media outlet “Addendum”, the Austrian postal service faces a public outcry over its data gathering and sales activities. The Austrian Post is known not only for exercising its main duty of post delivery, but also for selling addresses of Austrian residents to companies and political parties for advertising purposes. The media report revealed that not only addresses are being sold, but also sensitive data of 2.2 million Austrian inhabitants.


The postal service’s data sheet includes a person’s name, address, age and gender, but also more than 40 other data points, some of which are very sensitive types of personal information. One of those data points is a person’s presumed affinity to a political party, which is a “special category of data” and therefore requires explicit consent for processing. The postal service responded to the public outcry by stating that the data it collects on political preference is just an estimated probability, generated in a similar way to election polls.

Due to the lack of explicit consent, we believe this must be considered a breach of the General Data Protection Regulation (GDPR). To build public pressure, EDRi member epicenter.works provided a form for individuals to easily request access to their data. Within a week, the form was downloaded nearly 2000 times and sent to the Austrian Post’s data protection officer, which led to wide media coverage by national and international news outlets.

A few days after declaring absolute confidence in the legality of this kind of data collection, the postal service changed its strategy to the opposite, and declared that it intends to delete these records and refrain from selling them to its clients.

The Austrian Data Protection Authority (DPA) needs to investigate and take action immediately on this and any similar cases that may exist. Once the results of our data access requests are in, further actions could be started. Because of the dangerous precedent this case could set for political profiling on a massive scale, the work of the DPA in overseeing the implementation of the GDPR is crucial. If it sets a strong precedent in this case, other businesses will be discouraged from continuing or starting similar data exploitation practices in the future.

Epicenter.works
https://epicenter.works/

The post tells something to everybody! (only in German, 07.01.2019)
https://epicenter.works/content/die-post-verraet-allen-was

When the Post takes sides (only in German, 07.01.2019)
https://www.addendum.org/datenhandel/parteiaffinitaet/

Austria’s Post Office under fire over sharing data on political allegiances (11.01.2019)
https://www.thelocal.at/20190111/austrias-post-office-under-fire-over-data-sharing-political

Austrian Post Office to delete customers’ political data (10.01.2019)
https://phys.org/news/2019-01-austrian-office-delete-customers-political.html

Austria’s national post office under fire over data sharing (08.01.2019)
https://economictimes.indiatimes.com/news/international/business/austrias-national-post-office-under-fire-over-data-sharing/articleshow/67444380.cms

(Contribution by Iwona Laub, EDRi member Epicenter.works, Austria)

28 Jan 2019

Panoptykon files complaints against Google and IAB

By Panoptykon Foundation

On International Data Protection Day, 28 January 2019, EDRi member Panoptykon filed complaints against Google and the Interactive Advertising Bureau (IAB) with the Polish Data Protection Authority (DPA) under the General Data Protection Regulation (GDPR). The complaints relate to the functioning of the online behavioural advertising (OBA) ecosystem.


The complaints focus on the role of Google and IAB as organisations that set standards for other actors in the OBA market, and argue that they should therefore be treated as data controllers responsible for GDPR infringements.

Arguments used by Panoptykon are based on complaints concerning the same issue by EDRi member Open Rights Group (ORG) and Brave, as well as on evidence provided by a report by Johnny Ryan. The key facts and observations of the complaints are:

  1. data shared by companies within the OBA ecosystem are not necessary for the purpose of serving targeted ads;
  2. companies sharing data have no control over its further use by a potentially unlimited number of other actors that have access to real-time bidding software;
  3. users have no access to their data and no tools for controlling its further use by a (potentially unlimited) number of actors;
  4. those failures are not incidental because they result from the very design of the OBA ecosystem – lack of transparency and the concept of bid request, which, by definition, leads to data “broadcasting”.

Prior to making these complaints, Panoptykon carried out its own investigation of the OBA ecosystem in Poland, which confirmed the allegations made by ORG and Brave in their complaints, as well as Johnny Ryan’s testimony. Between May and December 2018, Panoptykon sent a number of data access requests to various actors involved in the OBA ecosystem (including Google and leading data brokers) in order to check whether users are able to verify and correct their marketing profiles.

In most cases, companies refused to provide personal data to users, citing alleged difficulties with their identification. This argument – made by key players in the OBA ecosystem – confirms that it has been designed to be obscure. Key identifiers used by data brokers to single out users and target ads are not revealed to the data subjects concerned. It is a catch-22 situation that cannot be reconciled with GDPR requirements (in particular the principle of transparency).

Along with its complaints, Panoptykon published a report summarising its investigation of the OBA ecosystem, which included interviews with key actors operating on the Polish market, and evidence collected by sending data access requests.

Panoptykon Foundation
https://en.panoptykon.org/

Panoptykon files complaints against Google and IAB Europe (28.01.2019)
https://en.panoptykon.org/complaints-Google-IAB

(Contribution by EDRi member Panoptykon Foundation, Poland)

25 Jan 2019

Terrorist Content: LIBE Rapporteur’s Draft Report lacks ambition

By Yannic Blaschke

On 23 January 2019, the Rapporteur for the European Parliament Committee on Civil Liberties (LIBE), Daniel Dalton (ECR), published his Draft Report on the proposal for a Regulation on preventing the dissemination of terrorist content online. This Report by the lead Committee on the dossier follows the publication of the Draft Opinions by the two other European Parliament Committees involved in the debate: the Committee on Internal Market and Consumer Protection (IMCO) and the Committee on Culture and Education (CULT).

Overall, LIBE’s Draft Report addressed only some of the many pressing issues of the Regulation which present serious risks for fundamental rights. Unfortunately, the Report therefore falls somewhat short of the ambitions to which a Committee dealing with civil liberties should aspire. This is even more disappointing after the comprehensive stance taken in the IMCO Draft Opinion, which includes more than twice as many amendments as the LIBE Draft Report.

LIBE’s Draft Report contains, in summary, the following positive points:
– it limits the scope of the Regulation to services that are available to the public (excluding, for example, file lockers from the scope)
– it addresses the need for reporting obligations from competent authorities

However, the Draft Report:
– does not tackle the manifest flaws of the measure of referrals from governments to companies for “voluntary consideration”, which would effectively turn Big Tech companies into the internet police
– does not drastically modify or delete the problematic “proactive measures”, which can only lead to upload filters and other very strict content moderation measures, even though it reminds the legislator about the existing prohibition of general monitoring obligations in the EU
– does not address the problems caused by a lack of alignment of the definition of terrorist content with the Terrorism Directive

On a positive note, the scope of the Terrorist Content Regulation is more narrowly defined in the LIBE Draft Report, being limited now to services which are available to the public. On reporting obligations, it is a welcome addition that the Report foresees an evaluation of the Regulation’s impact on the freedom of expression and information in the Union no later than three years after the implementation of the legislation. Regarding the possibility for national authorities to impose proactive measures on online companies, the Draft Report furthermore contains some mitigating clauses, such as a consideration of a platform’s “non-incidental” exposure to terrorist content, and a reminder of the prohibition in EU law of general monitoring obligations for hosting providers. Finally, the Draft Report proposes some adjustments regarding remedies and safeguards. It sets a two-week deadline for responding to complaints by citizens whose content was removed or blocked. The Draft Report also insists that the private complaint mechanisms of internet platforms do not preclude citizens from seeking legal redress before Member States’ courts.

However, Dalton MEP has disappointingly chosen not to address the referrals of content to platforms for their “voluntary consideration”. These referrals could give national authorities an “escape route” from their human rights obligations: they allow a government to merely suggest the blocking of content that it finds unpleasant but that is not illegal, and thus could not be subject to a removal order. Furthermore, the Rapporteur did not tackle the urgent need to reform the definition of “terrorist content”, which three United Nations (UN) Special Rapporteurs had previously flagged as a key concern. The vagueness of the definition in the Commission proposal thus persists and could threaten the work of journalists and NGOs documenting terrorist crimes. Finally, the “proactive measures” have not received the attention and intensive modification they need, and could still lead to de facto general monitoring obligations.

To summarise, the LIBE Draft Report lacks the ambition that would be expected from the Civil Liberties Committee and falls short of the much more comprehensive reworks delivered by the IMCO and CULT Committees. All Members of the European Parliament involved should cooperate to significantly improve the negligent and rushed Commission proposal, in particular with regard to the highly dangerous measures of referrals and proactive measures. Serious problems require serious legislation.

Terrorist Content Regulation: Document pool
https://edri.org/terrorist-content-regulation-document-pool/

CULT: Fundamental rights missing in the Terrorist Content Regulation (21.01.2019)
https://edri.org/cult-fundamental-rights-missing-in-the-terrorist-content-regulation/

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

EU’s flawed arguments on terrorist content give big tech more power (24.10.2018)
https://edri.org/press-release-eu-terrorism-regulation-an-eu-election-tactic/

Joint Press Release: EU Terrorism Regulation – an EU election tactic (12.09.2018)
https://edri.org/press-release-eu-terrorism-regulation-an-eu-election-tactic/

(Contribution by Yannic Blaschke and Diego Naranjo)

23 Jan 2019

EDRi’s Kirsten Fiedler wins Privacy Award

By EDRi

On 22 January, Kirsten Fiedler, current Senior Policy and Campaigns Manager and former Managing Director of European Digital Rights, received the distinguished Felipe Rodriguez Award in celebration of her remarkable contribution to our right to privacy in the digital age.

Why should we defend digital rights and freedoms when there are really pressing and often life-threatening issues out there to fight for? The reason is that the internet and digital communications are seeping into every part of our lives, so our rights online are the basis for everything else we do.

said Fiedler.

I’d like to accept this award on behalf of the entire EDRi team and network. Our strength is in collective, collaborative actions.

Fiedler’s relentless efforts have been crucial in transforming the EDRi Brussels office from a one-person entity into the current professional organisation with eight staff members. In addition, she played an instrumental role in EDRi’s campaigns against ACTA and privatised law enforcement, and has been the engine behind the Brussels office’s growth over the past years.

The Felipe Rodriguez Award is part of the Dutch Big Brother Awards, organised by the EDRi member Bits of Freedom. Previous winners include Kashmir Hill, Open Whisper Systems, Max Schrems, and Edward Snowden. The award ceremony took place on 22 January 2019 in Amsterdam.

Photo: Jason Krüger

Bits of Freedom announces winner of privacy award (09.01.2019)
https://edri.org/bits-of-freedom-announces-winner-of-privacy-award/

21 Jan 2019

Copyright negotiations begin to derail

By EDRi

The negotiations on the EU’s highly controversial Copyright Directive proposal continue. The last trilogue meeting between the Commission, the Council and the Parliament was originally scheduled for today, 21 January 2019. However, it was called off late on Friday evening, 18 January, by the Romanian Presidency of the EU Council.

It has become increasingly clear that the manifest problems with the text make it hard to find an acceptable compromise on the future of platforms’ and search engines’ liability regimes. A blocking minority formed by Germany, Poland, Belgium, Italy, Sweden, Finland, Slovenia, Hungary and the Netherlands did not approve the Presidency’s revised Council mandate.

This makes it less likely that the EU institutions will find a common position on the deeply flawed Article 13 of the proposal, which would either directly or indirectly require online companies to implement highly error-prone upload filters to search user uploads for copyrighted material. The divisions in the Council are yet another sign of the high degree of polarisation and the dwindling support for the proposal, highlighted by the fact that even the creative industries called for a halt to negotiations on Article 13 in a joint letter. More than 70 internet luminaries, the UN Special Rapporteur on Freedom of Expression, civil society organisations, programmers, and a plethora of academics have been highly critical of the proposal from the start.

The suspension of trilogue negotiations does not, however, mean that the fight against upload filters and for freedom of expression has been decided: in fact, it is now more crucial than ever to get in touch with your local Members of the European Parliament (MEPs) and national ministries, and to ask them to oppose Article 13.

EDRi continues to follow the negotiations closely and calls on all citizens and civil society organisations to act and defend their digital rights through the #SaveYourInternet campaign.

Copyright: Compulsory filtering instead of obligatory filtering – a compromise? (04.09.2018)
https://edri.org/copyright-compulsory-filtering-instead-of-obligatory-filtering-a-compromise/

How the EU copyright proposal will hurt the web and Wikipedia (02.07.2018)
https://edri.org/how-the-eu-copyright-proposal-will-hurt-the-web-and-wikipedia/

EU Censorship Machine: Legislation as propaganda? (11.06.2018)
https://edri.org/eu-censorship-machine-legislation-as-propaganda/

21 Jan 2019

CULT: Fundamental rights missing in the Terrorist Content Regulation

By Diego Naranjo

On 16 January, the European Parliament (EP) Committee on Culture and Education (CULT) published its Draft Opinion on the proposal for a Regulation preventing the dissemination of terrorist content online. Member of the European Parliament (MEP) Julie Ward, the Rapporteur for the Opinion, has joined the Rapporteur for the IMCO Committee, Julia Reda MEP, and civil rights groups in criticising many aspects of the Commission’s original proposal. The Rapporteur expresses her concerns regarding threats to “fundamental rights, such as freedom of expression and access to information, as well as media pluralism.”

In the Draft Opinion, CULT proposes a number of changes:

  • Definition of terrorist content: The Opinion suggests aligning the definition of terrorist content with the Terrorism Directive 2017/541/EU and carving out educational, journalistic or research material.
  • Definition of hosting service providers: The CULT Committee acknowledges that the definition of these services is “too broad and legally unclear”, and that many services which are not the target of this Regulation would be unnecessarily covered. The Rapporteur suggests covering only those hosting service providers that make the content available to the general public.
  • Removal orders: According to the Opinion, the only authorities competent to issue removal orders should be judicial authorities, since they are the ones with the “sufficient expertise”. Furthermore, the “one hour” time frame to respond to the removal orders is replaced by “without undue delay”. This would allow for more flexibility for smaller service providers.
  • Pro-active measures: The obligation of pro-activity (in practice, to implement upload filters in hosting services) is deleted from the proposal.
  • Finally, the Rapporteur suggests removing the financial penalties in order to avoid smaller providers being overburdened, as well as to prevent the likely scenario “where companies may overly block and remove content in order to protect themselves against possible financial penalties.”

This constitutes, on a general level, a very welcome improvement on the dangerous pitfalls of the Commission’s original proposal. Of particular relevance are the Rapporteur’s assessment that an imposition of proactive measures would amount to a breach of Article 15 of the e-Commerce Directive (which contains the prohibition of general monitoring obligations), as well as the proposed deletion of pro-active measures (upload filters). However, it is unclear how the Rapporteur’s addition in Article 3(2), saying that hosting service providers “shall not store terrorist content”, could be put in place without upload filters, even if, as a safeguard, the Rapporteur requires those measures to be “appropriate”.

Another shortcoming of the Draft Opinion is its lack of concern about the highly unaccountable instrument of providing referral capacities to national authorities. For some reason, the Rapporteur has decided not to address this trojan horse, which would directly implement privatised law enforcement in the European Union. Referrals from national authorities, even though intended to be merely for “voluntary consideration” by private companies, are likely to become the way that overreaching governments outsource the protection of freedom of expression to unaccountable private companies, which are outside the scope of the Charter of Fundamental Rights.

Even though the Rapporteur has not addressed all of the key issues, there are many positive suggestions in the Draft Opinion. Some of them are in line with the IMCO Committee Draft Opinion, which provided an even more comprehensive proposal for improvement. Given the criticism from both Committees, three UN Special Rapporteurs and a large number of civil society groups, the lead committee, the Civil Liberties (LIBE) Committee, is expected to take all of this criticism on board and comprehensively amend the Regulation.

Draft Opinion of the Committee on Culture and Education on the proposal for a regulation on preventing the dissemination of terrorist content online (16.01.2019)
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&reference=PE-632.087&format=PDF&language=EN&secondRef=01

Terrorist Content Regulation: document pool
https://edri.org/terrorist-content-regulation-document-pool

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

Terrorist Content Regulation: Warnings from the UN and the CoE (19.12.2018)
https://edri.org/terrorist-content-regulation-warnings-from-the-un-and-the-coe/

The EU Council’s general approach on Terrorist Content Online proposal: A step towards pre-emptive censorship (11.12.2018)
https://edri.org/the-eu-councils-general-approach-on-terrorist-content-online-proposal-a-step-towards-pre-emptive-censorship/

Terrorist Content Regulation: Civil rights groups raise major concerns (05.12.2018)
https://edri.org/terrorist-content-regulation-civil-rights-groups-raise-major-concerns/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

(Contribution by Diego Naranjo, EDRi)
