12 Nov 2018

Job alert: EDRi is looking for a Senior Policy Advisor

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 39 digital human rights organisations from across Europe and beyond. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, personal data protection, freedom of expression, and access to information.

EDRi is looking for a talented and dedicated Senior Policy Advisor to join EDRi’s team in Brussels. This is a unique opportunity to be part of a growing and well-respected NGO that is making a real difference in the defence and promotion of online rights and freedoms in Europe and beyond. The deadline to apply is 2 December 2018. This full-time, permanent position is to be filled as soon as possible.

Key responsibilities:

As a Senior Policy Advisor, your main tasks will be to:

  • Monitor, analyse and report on the human rights implications of EU digital policy developments;
  • Advocate for the protection of digital rights, particularly but not exclusively in the areas of platform regulation, surveillance and law enforcement, telecommunications and digital trade;
  • Provide policy-makers with expert, timely and accurate input;
  • Draft policy documents, such as briefings, position papers, amendments, advocacy one-pagers, letters, blogposts and EDRi-gram articles;
  • Provide EDRi members with information about relevant EU legislative processes, coordinate working groups, help develop campaign messages, and inform the public about relevant EU legislative processes and EDRi’s activities;
  • Represent EDRi at European and global events;
  • Organise and participate in expert meetings;
  • Maintain good relationships with policy-makers, stakeholders and the press;
  • Support and work closely with other staff members including policy, communications and campaigns colleagues and report to the Executive Director;
  • Contribute to the policy strategy of the organisation;

Desired qualifications and experience:

  • Minimum 3 years of relevant experience in a similar role or EU institution;
  • A university degree in law, EU affairs, policy, human rights or related field or equivalent experience;
  • Demonstrable knowledge of, and interest in, privacy, net neutrality, digital trade, surveillance and law enforcement, freedom of expression, as well as other internet policy issues;
  • Knowledge and understanding of the EU, its institutions and its role in digital rights policies;
  • Experience in leading advocacy efforts and creating networks of influence;
  • Exceptional written and oral communications skills;
  • IT skills;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Experience of organising events and/or workshops;
  • Ability to work in English and French. Other European languages an advantage.

What EDRi offers:

  • A permanent, full-time contract;
  • A dynamic, multicultural and enthusiastic team of experts based in Brussels;
  • A competitive NGO salary with benefits;
  • A high degree of autonomy and flexibility;
  • An international and diverse network;
  • Internal career growth;
  • Networking opportunities.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in .pdf format to julien.bencze(at)edri.org by 2 December 2018.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive to have a diverse and inclusive working environment. We encourage individual members of groups at risk of racism or other forms of discrimination to apply for this post.

Please note that only shortlisted candidates will be contacted.

07 Nov 2018

NGOs urge Austrian Council Presidency to finalise e-Privacy reform

By Epicenter.works

EDRi member epicenter.works, together with 20 NGOs, is urging the Austrian Presidency of the Council of the European Union to take action towards ensuring the finalisation of the e-Privacy reform. The group, which includes some of the biggest civil society organisations in Austria, such as Amnesty International, as well as two labour unions, demands in an open letter sent on 6 November 2018 an end to the seemingly never-ending deliberations between the EU member states.

It is today 666 days since the European Commission launched its proposal. The e-Privacy Regulation is an essential element of the future of Europe’s digital strategy and a necessity for the protection of modern democracies from ubiquitous surveillance networks. Echoing European citizens’ rightful demands for the protection of their online privacy, the organisations ask the Austrian Presidency to lead the way into a new privacy era by concluding the e-Privacy dossier by 2019.

The letter comes at a time when a parliamentary inquiry by the Austrian Social Democratic Party is trying to shed light on the Austrian government’s lobby connections regarding the hampering of secure communications for its citizens. Right now, the Austrian government’s position is closely aligned with the interests of internet giants like Facebook and Google, big telecom companies and the advertising industry.

The Austrian government has recently fast-tracked negotiations on the controversial e-evidence proposal, which would weaken the rule of law and foster further surveillance of citizens’ online behaviour. This stands in stark contrast to the meagre effort Austrian representatives have put into negotiations on legislative proposals that aim to protect the fundamental right to privacy – a topic missing from the Austrian Council Presidency’s agenda.

In order to ensure that e-Privacy laws will not be used as an excuse for the establishment of new repressive instruments, epicenter.works demands a clear commitment to the prohibition of data retention. Data retention has been found unconstitutional in several European countries, and epicenter.works was a plaintiff in the 2014 proceedings of the European Court of Justice (ECJ) that annulled the Data Retention Directive. A circumvention of the ECJ’s ban through the e-Privacy Regulation could expose EU citizens to indiscriminate mass surveillance and severely undermine trust in EU institutions.

Open Letter sent to Austrian Government (in German only, 06.11.2018)
https://epicenter.works/content/offener-brief-wir-brauchen-eprivacy

Parliamentary inquiry from the Austrian Social Democratic Party (in German only, 29.10.2018)
https://www.parlament.gv.at/PAKT/VHG/XXVI/J/J_02174/index.shtml

Council continues limbo dance with the ePrivacy standards (24.10.2018)
https://edri.org/council-continues-limbo-dance-with-the-eprivacy-standards/

ePrivacy: Public benefit or private surveillance? (24.10.2018)
https://edri.org/eprivacy-public-benefit-or-private-surveillance/

ECJ: Data retention directive contravenes European law (09.04.2014)
https://edri.org/ecj-data-retention-directive-contravenes-european-law/

(Contribution by Thomas Lohninger, EDRi member epicenter.works)

07 Nov 2018

UN Special Rapporteur analyses AI’s impact on human rights

By Chloé Berthélémy

In October 2018, the United Nations (UN) Special Rapporteur for the promotion and protection of the right to freedom of opinion and expression, David Kaye, released his report on the implications of artificial intelligence (AI) technologies for human rights. The report was submitted to the UN General Assembly on 29 August 2018 but has only been published recently. The text focuses in particular on freedom of expression and opinion, privacy and non-discrimination. In the report, the UN Special Rapporteur David Kaye first clarifies what he understands by artificial intelligence and what using AI entails for the current digital environment, debunking several myths. He then provides an overview of all potential human rights affected by relevant technological developments, before laying down a framework for a human rights-based approach to these new technologies.

1. Artificial intelligence is not a neutral technology

David Kaye defines artificial intelligence as a “constellation of processes and technologies enabling computers to complement or replace specific tasks otherwise performed by humans” through “computer code […] carrying instructions to translate data into conclusions, information or outputs.” He states that AI is still highly dependent on human intervention, as humans need to design the systems, define their objectives and organise the datasets for the algorithms to function properly. The report points out that AI is therefore not a neutral technology, as the use of its outputs remains in the hands of humans.

Current forms of AI systems are far from flawless, as they demand human scrutiny and sometimes even correction. The report considers that the automated character of AI systems, the quality of data analysis and the systems’ adaptability are all sources of bias. Automated decisions may produce discriminatory effects, as they rely exclusively on specific criteria without necessarily balancing them, and they undermine scrutiny and transparency over the outcomes. AI systems also rely on huge amounts of data of questionable origin and accuracy. Furthermore, AI can identify correlations that can be mistaken for causation. David Kaye points to the main problem of adaptability once human supervision is lost: it poses challenges to ensuring transparency and accountability.

2. Current uses of artificial intelligence interfere with human rights

David Kaye describes three main applications of AI technology that pose important threats to several human rights.

The first problem raised is AI’s effect on freedom of expression and opinion. On the one hand, “artificial intelligence shapes the world of information in a way that is opaque to the user” and conceals its role in determining what the user sees and consumes. On the other, the personalisation of information display has been shown to reinforce biases and “incentivize the promotion and recommendation of inflammatory content or disinformation in order to sustain users’ online engagement”. These practices affect individuals’ self-determination and autonomy to form and develop personal opinions based on factual and varied information, thereby threatening freedom of expression and opinion.

Secondly, similar concerns can be raised in relation to our right to privacy, in particular with regard to AI-enabled micro-targeting for advertising purposes. As David Kaye states, profiling and targeting users foster the mass collection of personal data and lead to inferring “sensitive information about people that they have not provided or confirmed”. The limited possibilities to control personal data collected and generated by AI systems call the respect of privacy into question.

Third, the Special Rapporteur highlights AI as an important threat to our rights to freedom of expression and non-discrimination, due to the role increasingly allocated to AI in the moderation and filtering of online content. Despite some companies’ claims that artificial intelligence can compensate for the limits of human capacity, the report sees the recourse to automated moderation as impeding the exercise of human rights. In fact, artificial intelligence is unable to resist discriminatory assumptions or to grasp sarcasm and the cultural context of each piece of content published. As a result, freedom of expression and our right not to be discriminated against can be severely hampered by delegating complex censorship exercises to AI and private actors.

3. A set of recommendations for both companies and States

Recalling that “ethics” is not a cover for companies and public authorities to neglect binding and enforceable human rights-based regulation, the UN Special Rapporteur recommends that “any efforts to develop State policy or regulation in the field of artificial intelligence should ensure consideration of human rights concerns”.

David Kaye suggests that human rights should guide the development of business practices, AI design and deployment, and calls for enhanced transparency, disclosure obligations and robust data protection legislation – including effective means of remedy. Online service providers should make clear which decisions are made with human review and which by artificial intelligence systems alone. This information should be accompanied by explanations of the decision-making logic used by the algorithms. Further, the “existence, purpose, constitution and impact” of AI systems should be disclosed in an effort to improve individual users’ education on this topic. The report also recommends making available and publicising data on the “frequency at which AI systems are subject to complaints and requests for remedies, as well as the types and effectiveness of remedies available”.

States are identified as key actors responsible for creating a legislative framework that is hospitable to a pluralistic information landscape, prevents technology monopolies, and supports network and device neutrality.

Lastly, the Special Rapporteur provides useful tools to oversee AI development:

  1. human rights impact assessments performed prior, during and after the use of AI systems;
  2. external audits and consultations with human rights organisations;
  3. individual choice enabled through notice and consent;
  4. effective remedy processes to end human rights violations.

UN Special Rapporteur on Freedom of Expression and Opinion Report on AI and Freedom of Expression (29.08.2018)
https://freedex.org/wp-content/blogs.dir/2015/files/2018/10/AI-and-FOE-GA.pdf

Civil society calls for evidence-based solutions to disinformation (19.10.2018)
https://edri.org/civil-society-calls-for-evidence-based-solutions-to-disinformation/

(Contribution by Chloé Berthélémy, EDRi intern)

07 Nov 2018

My Data Done Right launched: check your data!

By Bits of Freedom

On 25 October 2018 EDRi member Bits of Freedom launched My Data Done Right – a website that gives you more control over your data. From now on you can easily ask organisations what data they have about you, and ask them to correct, delete or transfer your data.

Use your rights

On 25 May 2018, new privacy rules entered into force in Europe. Based on these rules you have several rights that help you to get more control over your data. However, these rights can only have effect if people can easily exercise them. That is why Bits of Freedom developed My Data Done Right.

Generate and keep track of your requests

With My Data Done Right, you can easily create an access, correction or removal request, or a request to transfer your data. You no longer have to search for contact details in privacy statements: this information has already been collected on the website for more than 1000 organisations. You don’t have to draft the request yourself either; it is automatically generated based on your input. You only have to send it.

My Data Done Right also contains a few other useful options. You can receive a reminder about your request by email or in your calendar, so that you don’t forget the request you’ve sent. At the moment, you can generate requests in English and Dutch. Soon there will also be an option to share your experiences with us through a short questionnaire.
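
Conceptually, generating a request is simple template filling: the service looks up the organisation’s contact details and merges them with the details you provide. The Python sketch below is purely illustrative and is not the actual My Data Done Right code; the organisation entry, the template text and the one-month reminder calculation are all assumptions made for the example.

from datetime import date, timedelta
from string import Template

# Hypothetical excerpt of an organisation database; the real service
# collects contact details for more than 1000 organisations.
ORGANISATIONS = {
    "ExampleCorp": {"email": "privacy@example.com"},
}

# Hypothetical template for a GDPR Article 15 access request.
ACCESS_REQUEST = Template(
    "To: $org ($email)\n"
    "Date: $today\n\n"
    "Dear data protection officer,\n\n"
    "Under Article 15 of the General Data Protection Regulation (GDPR), I request\n"
    "access to all personal data you process about me, the purposes of the\n"
    "processing, and the recipients to whom the data have been disclosed.\n\n"
    "Please respond within one month, as required by Article 12(3) GDPR.\n\n"
    "Kind regards,\n$name"
)

def generate_request(org_name: str, requester_name: str) -> str:
    """Fill the template with the organisation's contact details and the user's input."""
    org = ORGANISATIONS[org_name]
    return ACCESS_REQUEST.substitute(
        org=org_name,
        email=org["email"],
        today=date.today().isoformat(),
        name=requester_name,
    )

def reminder_date(sent_on: date) -> date:
    """Suggest a follow-up date roughly one month after the request was sent."""
    return sent_on + timedelta(days=31)

print(generate_request("ExampleCorp", "Jane Doe"))
print("Follow up on:", reminder_date(date.today()))

A real implementation would of course also cover correction, removal and portability requests and multiple languages, as the website does.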

Cooperation across Europe

The launch is a starting point for the further development of My Data Done Right. We plan to continue expanding the database with organisations, but also to make My Data Done Right available for all people in the European Union.

Together with other digital rights organisations and volunteers, Bits of Freedom will work on versions of My Data Done Right for other EU countries and grow our database to include many more organisations to which you can address your requests. Do you want to help? Please contact Bits of Freedom!

My Data Done Right
https://www.mydatadoneright.eu/

GDPR Today: Stats, news and tools to make data protection a reality (25.10.2018)
https://edri.org/the-gdpr-today-stats-news-and-tools-to-make-data-protection-a-reality/

Press Release: GDPR: A new philosophy of respect (24.05.2018)
https://edri.org/press-release-gdpr-philosophy-respect/

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)
https://edri.org/a-guide-individuals-rights-under-gdpr/

(Contribution by David Korteweg, EDRi member Bits of Freedom, the Netherlands)

07 Nov 2018

Facebook fails political ads tests several times

By Chloé Berthélémy

On 28 June 2018, Facebook announced that it had introduced a compulsory “Paid for by” feature, limiting anonymity by requiring advertisers to submit a valid ID and proof of residence. The feature was a reaction to a series of election interference incidents over the past year carried out through foreign political advertising on social media platforms. The tool was supposed to help reduce “bad ads” and to fight election manipulation and online disinformation. Since then, several experiments have been conducted to see whether ad manipulation is still possible on Facebook. Facebook failed all of these tests.

On 25 and 30 October 2018, VICE News – an online media service publishing news documentaries and reports, owned by the North American digital media and broadcasting company Vice Media – revealed that it had conducted experiments on Facebook’s advertising system. VICE News reported that it had managed to pose as US Vice President Mike Pence, Democratic National Committee Chairman Tom Perez, the Islamic State and 100 US senators when posting and sponsoring political ads on Facebook. The low quality of Facebook’s screening is further illustrated by the absurdity of the pages approved to share the ads, such as ‘Cookies for Political Transparency’ and ‘Ninja Turtles PAC’. As VICE News explained, tricking the system was not difficult, nor did it require any specific knowledge or skills.

On 31 October 2018, Business Insider revealed that it had also managed to set up a fake NGO page and run ads “paid for by Cambridge Analytica”.

VICE News and Business Insider’s investigations show that anyone can lie about the sponsoring of political ads. As a joint civil society report published by EDRi, Access Now and the Civil Liberties Union for Europe shows, unless the deeper problem of the business model that feeds these concerns is addressed, neither online disinformation nor the underlying issue of online manipulation will be resolved.

Informing the “disinformation” debate by EDRi, Access Now and Civil Liberties (18.10.2018)
https://edri.org/files/online_disinformation.pdf

Facebook’s political ad tool let us buy ads “paid for” by Mike Pence and ISIS by William Turton, VICE News (25.10.2018)
https://news.vice.com/en_us/article/wj9mny/facebooks-political-ad-tool-let-us-buy-ads-paid-for-by-mike-pence-and-isis

We posed as 100 Senators to run ads on Facebook. Facebook approved all of them by William Turton, VICE News (30.10.2018)
https://news.vice.com/en_us/article/xw9n3q/we-posed-as-100-senators-to-run-ads-on-facebook-facebook-approved-all-of-them

We ran 2 fake ads pretending to be Cambridge Analytica — and Facebook failed to catch that they were frauds by Shona Ghosh, Business Insider (31.10.2018)
https://www.businessinsider.nl/facebook-approved-political-ads-paid-for-by-cambridge-analytica-2018-10/

(Contribution by Chloé Berthélémy, EDRi intern)

07 Nov 2018

Brussels up close – Experiences from the EDRi exchange programme

By Epicenter.works

Learning and knowing abstractly how the EU works is one thing; seeing it up close and doing advocacy work right there is quite another! I am a Policy Advisor for the Austrian EDRi member organisation “epicenter.works – for digital rights” and, in October 2018, I spent two weeks with the EDRi office in Brussels. My aim was to get a better understanding of EU law-making and advocacy.

Having a background in the field of criminology and law, I was excited to start right away with a rather new dossier on cross-border access to data in police investigations: the e-Evidence Regulation. We will probably have to work on e-Evidence for a long time, and I am glad to have had the opportunity to familiarise myself early on with this dossier and to discuss its many flaws – several of which are quite intricate – with the EDRi policy team.

Working in Brussels in person has been a big step forward for me in understanding EDRi’s policy work and will enable me to make a better contribution to it in the future. I took part in developing a strategy for proposed amendments, which I started working on during my stay. As part of EDRi’s e-Evidence working group, I will continue this work at both the national and EU level.

In what at times felt like quite a meeting marathon, I had the chance to accompany Maryant Fernandez Perez and Chloé Berthélémy to meetings on e-Evidence with several national Permanent Representations to the EU. I also attended an event on the topic organised by the German region of North Rhine-Westphalia with German European Commission officials, the European Parliament, the German bar association and police forces.

As if all that was not enough, I was also briefed on and familiarised myself with the Terrorism Regulation and the current state of plans for an ePrivacy Regulation. On these dossiers, I joined Joe McNamee, Diego Naranjo, Yannic Blaschke and Estelle Massé (Access Now) in meetings with companies and stakeholders. It was an important experience to see how differently such meetings can be shaped, depending on the strategy (our own or that of our interlocutors), the culture of the country, company or institution, the knowledge of the dossier, the extent of agreement, and so on.

It was also instructive to experience EDRi’s coordination with its member organisations from the other side, and to see the planning and communication that goes into the adoption and execution of joint strategies among EDRi and its many members. I want to thank the entire EDRi team for welcoming me warmly and for making the exchange programme a truly great experience. Many thanks also to the Digital Rights Fund for the financial support of my travel!

Finally, I want to recommend this exchange to any EDRi member. Sometimes it is the small things that matter, like hearing Maryant say in her introductory words at a meeting, “We are here to represent 39 member organisations all over Europe”, and seeing and experiencing what those words mean in practice. It is a fact I knew, of course, but its effect can seem elusive at times when working on the front lines of national politics, with Brussels only in the back of one’s mind. Therefore, I recommend going to Brussels and seeing from up close how EU politics are made, and experiencing what EDRi is and how it works. Spoiler: EDRi is important, and it works best in close cooperation with its members.

epicenter.works
https://epicenter.works/

12 days of digital rights in Brussels. Was it Christmas?
https://edri.org/12-days-of-digital-rights-in-brussels-was-it-christmas/

EDRi’s “Brussels Exchange Programme” – turning theory into practice (07.02.2018)
https://edri.org/edris-brussels-exchange-programme-turning-theory-into-practice/

(Contribution by Angelika Adensamer, EDRi member epicenter.works, Austria)

06 Nov 2018

Welcoming our new Executive Director Claire Fernandez!

By EDRi

EDRi is happy to announce that we have found a new Executive Director! Claire Fernandez will join the organisation on 19 November 2018 and will be in charge of the leadership, mission and strategy of the organisation, its financial sustainability and oversight, and the daily management of operations. Claire joins the organisation as part of a wider leadership change and transition in our Brussels office team.

Since February 2013, Claire has worked as the Deputy Director of the European Network Against Racism (ENAR). EDRi and ENAR partnered up earlier this year to draw up some core principles in the fight against illegal content online.

Prior to her role at ENAR, she worked as an independent human rights consultant, leading the Open Society Foundations’ campaign on the reform of the European Court of Human Rights and revising the Council of Europe Commissioner for Human Rights’ report on the human rights of Roma. Previously, Claire was an adviser to the Council of Europe Commissioner for Human Rights. From 2008 to 2010, she represented the Organization for Security and Co-operation in Europe (OSCE) in Bosnia and Kosovo, advising local authorities on good governance and minority rights. She holds a Master’s degree in Human Rights from the Robert Schuman University in Strasbourg, France.

“I am grateful for the opportunity to work with this impressive network and staff on digital rights, which are now increasingly recognised as the cornerstone of human rights, rule of law and democracy,” said Claire.

The Brussels office staff, the EDRi board and the EDRi members warmly welcome Claire. We all look forward to working with her!

Read more:

Upcoming EDRi leadership change: A message from Joe and Kirsten (29.03.2018)
https://edri.org/upcoming-edri-leadership-change-message-joe-kirsten/

25 Oct 2018

The GDPR Today – Stats, news and tools to make data protection a reality

By EDRi

25 October 2018 marks the launch of GDPR Today – your online hub for staying up-to-date with the (real) life of the new EU data protection law, the General Data Protection Regulation (GDPR). The project will monitor the implementation of the law across Europe by publishing statistics and sharing relevant news around key subjects.

“GDPR Today, led by several EDRi member organisations, aims to complement our association’s past support for the data protection reform,” said Katarzyna Szymielewicz, vice-president of EDRi and co-founder and president of Panoptykon Foundation.

The initiative will prioritise building knowledge around legal guidelines and decisions, data breaches, new codes of conduct, tools facilitating individuals’ exercise of rights, important business developments and governmental support for data protection authorities. The GDPR Today is an instrument aimed at data protection experts, activists, journalists, lawyers, and anyone interested in the protection of personal data.

“Our goal with GDPR Today is to present facts to the public on the implementation of the law, so that those interested can follow how the GDPR is both shaping the EU digital market and helping people regain control over their personal data,” said Estelle Massé, Senior Policy Analyst and Global Data Protection Lead at Access Now.

The GDPR has so far often been portrayed as a burden, and the focus has been on so-called non-functional elements, which remain untested and have often created misunderstanding around the functional ones. The GDPR Today will put facts on the implementation of the law at the centre of the debate.

Read the first edition of the GDPR Today here: https://www.gdprtoday.org/

24 Oct 2018

ENDitorial: YouTube puts uploaders, viewers & itself in a tough position

By Bits of Freedom

A pattern is emerging. After blocking a controversial video, YouTube nonpologises for doing so, and reinstates the video… just to block it again a few months later. The procedures around content moderation need to improve, but that’s not all: more needs to change.

In June 2018, EDRi member Bits of Freedom reported that YouTube had already taken down the account of the Dutch pro-choice NGO Women on Waves three times in 2018, each time without proper justification. As if that wasn’t ridiculous enough, the account was taken down a fourth time just as the organisation was being interviewed by the Dutch television programme Nieuwsuur about the previous takedowns, again without notice and without a satisfactory explanation. YouTube subsequently did what it has done many times before: the company issued a nonpology and reinstated the account. Based on experience, it is a question of when, not if, it gets removed again.

It’s odd that an account can be wrongfully blocked several times over the course of just a few months. One would expect that, after an account has been wrongfully blocked once or, at worst, twice, moderators would receive a warning that triggers a process in which a(n additional) person is involved as soon as the account is recommended for blocking. However, at best, this would only prevent the most obvious mistakes. Whether there’s a properly functioning process in place to block videos or accounts or not, there will always be controversies. The company will not be able to prevent the occasional moderation error from happening.

YouTube is in a near-monopoly position when it comes to uploading and watching videos, and it has a huge reach. Every decision YouTube makes about whether a video can be accessed through its platform can have an enormous impact. This becomes especially clear with videos that deal with controversial topics. Nieuwsuur gives a few examples: bodily integrity, sexual freedom, and cannabis. Of course, you’ll always be able to find someone somewhere in the world who has a problem with these topics, which is probably the reason YouTube bans certain videos about them upfront, and quickly removes other videos as soon as someone complains. Videos and accounts disappear if one or more viewers report them as offensive, or if YouTube’s computers detect certain images or combinations of words.
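
To illustrate why purely automated detection based on “combinations of words” is so blunt, here is a deliberately naive, hypothetical keyword flagger (an illustration only, not YouTube’s actual system): it cannot distinguish an educational video about a controversial topic from content that genuinely violates a policy.

# Deliberately naive keyword-based flagging, for illustration only.
# Real systems are more sophisticated, but remain largely context-blind.
BLOCKLIST = {"abortion", "cannabis"}

def flag_video(title: str, description: str) -> bool:
    """Return True if any blocklisted word appears in the title or description."""
    text = f"{title} {description}".lower()
    return any(word in text for word in BLOCKLIST)

# An educational health video triggers the same flag as a policy-violating one,
# because a keyword match carries no context.
print(flag_video("Safe abortion with pills", "Medical information video"))  # True
print(flag_video("How to cook pasta", "A simple recipe"))                   # False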

This puts everyone in a tough position: the creator, the viewer and the platform itself. Creators see their videos fall off the internet from time to time and can’t do anything about it. Viewers can’t watch the videos they want to watch, regardless of their feelings about certain topics. Platforms will never be able to please everyone; opinions will continue to differ. Moreover, due to public and political pressure, a company can no longer decide for itself how to run its platform.

The only solution to all this lies in ensuring that everyone – the uploader, the viewer and the platform – has options to choose from. The only way to do that is to ensure that multiple platforms exist side by side, each with its own interests, considerations and audience. This enables creators to choose the platform that fits them best. As a viewer, you can choose a platform that is as open-minded as you are. And the platform can go back to making its own decisions about what it deems acceptable and what not.

And the beauty of it all: in this scenario the procedures for moderating content become less crucial. If a platform handles complaints in a very sloppy way, then users can simply choose a better-functioning alternative, because they aren’t dependent on that particular platform.

YouTube puts uploaders, viewers and itself in a tough position (25.10.2018)
https://www.bitsoffreedom.nl/2018/10/24/youtube-puts-uploaders-viewers-and-itself-in-a-tough-position/

Women on Waves’ three YouTube suspensions this year show yet again that we can’t let internet companies police our speech (28.06.2018)
https://www.bitsoffreedom.nl/2018/06/28/women-on-waves-three-youtube-suspensions-this-year-show-yet-again-that-we-cant-let-internet-companies-police-our-speech/

YouTube censors Dutch organizations’ videos (only in Dutch)
https://nos.nl/nieuwsuur/artikel/2244146-youtube-censureert-video-s-nederlandse-organisaties-kanaal-weer-op-zwart.html

(Contribution by Rejo Zenger, EDRi member Bits of Freedom, the Netherlands)

24 Oct 2018

Council continues limbo dance with the ePrivacy standards

By Yannic Blaschke

It has been 652 days since the European Commission launched its proposal for an ePrivacy Regulation. The European Parliament took a strong stance on the proposal when it adopted its position a year ago, but the Council of the European Union is still only taking baby steps towards finding its own.

In its latest proposal, the Austrian Presidency of the Council unfortunately continues the trend of presenting the Council with suggestions that lower the privacy protections proposed by the Commission and strengthened by the Parliament. The latest working document, published on 19 October 2018, makes it apparent that we are far from having reached the bottom of what the Council considers acceptable in treating our personal data as a commodity.

Probably the gravest change to the text is to allow the storage of tracking technologies on the individual’s computer without consent for websites that partly or wholly finance themselves through advertising, provided they have informed the user of the existence and use of such processing and the user “has accepted this use” (Recital 21). Such “acceptance” of identifiers by the user is far from the informed consent that the General Data Protection Regulation (GDPR) established as a standard in the EU. The Austrian Presidency text would put cookies that are necessary for regular use (such as language preferences and the contents of a shopping basket) on the same level as the highly invasive tracking technologies pushed by the Google/Facebook duopoly in the current commercial surveillance framework. This opens Pandora’s box for ever more sharing, merging and reselling of citizens’ data in huge online commercial surveillance networks, and for micro-targeting them with commercial and political manipulation, without the knowledge of the person whose private information is being shared with a large number of unknown third parties.

One of the great added values of the ePrivacy Regulation (which was originally intended to enter into force at the same time as the GDPR) is that it is supposed to raise the bar for companies and other actors who want to track citizens’ behaviour on the internet by placing tracking technologies on users’ computers. Currently, such an accumulation of potentially highly sensitive data about an individual mostly happens without the individual’s real knowledge, often through coerced (not freely given) consent, and the data is shared and resold extensively within opaque advertising networks and data-broker services. A strong and future-proof ePrivacy Regulation therefore needs to tightly regulate the collection and processing of such behavioural data and base it on the informed consent of the individual – an approach that is now increasingly jeopardised as the Council grows more favourable to tracking technologies.

The detrimental change to Recital 21 is only one of the bad ideas through which the Austrian Presidency seeks to strike a consensus: others include the undermining of the protections around “compatible further processing” (itself already a bad idea introduced by the Council) in Article 6 2aa (c), and the watering down of the requirements for regulatory authorities in Article 18, which causes significant friction with the GDPR. With one disappointing “compromise” after another, the ePrivacy Regulation is in ever greater danger of falling short of its ambition to end the unwanted stalking of individuals on the internet.

EDRi will continue to observe the development of the legislation closely and calls on everyone in favour of a solid EU privacy regime that protects citizens’ rights and competition to voice their demands to their member states.

Five Reasons to be concerned about the Council ePrivacy draft (26.09.2018)
https://edri.org/five-reasons-to-be-concerned-about-the-council-eprivacy-draft/

EU Council considers undermining ePrivacy (25.07.2018)
https://edri.org/eu-council-considers-undermining-eprivacy/

Your ePrivacy is nobody else’s business (30.05.2018)
https://edri.org/your-eprivacy-is-nobody-elses-business/

e-Privacy revision: Document pool (10.01.2017)
https://edri.org/eprivacy-directive-document-pool/

(Contribution by Yannic Blaschke, EDRi intern)
