Copyright

In the digital era, copyright should be implemented in a way which benefits creators and society. It should support cultural work and facilitate access to knowledge. Copyright should not be used to lock away cultural goods, damaging rather than benefitting access to our cultural heritage. Copyright should be a catalyst of creation and innovation. In the digital environment, citizens face disproportionate enforcement measures from states, arbitrary privatised enforcement measures from companies and a lack of innovative offers, all of which reinforces the impression of a failed and illegitimate legal framework that undermines the relationship between creators and the society they live in. Copyright needs to be fundamentally reformed to be fit for purpose, predictable for creators, flexible and credible.

22 May 2019

Google-Huawei case highlights the importance of free software

By Free Software Foundation Europe - FSFE

Google denies the Chinese IT giant Huawei access to Google’s proprietary components of the Android mobile operating system, which threatens IT security. This highlights the importance of free software for technology users, public bodies, and businesses.

Following the US administration’s decision to effectively ban American companies from trading with the Chinese company Huawei, Google suspended all business with the company. This affects all software which is not covered by free software licences. In practice, Huawei’s upcoming and potentially also current phones will no longer get support and updates for the Android operating system. They will also not have access to proprietary Google apps and services like Gmail and Google Play. The latter in particular puts Huawei users at risk: without access to the default app store on most Android phones, they will miss important security updates for the apps installed through it.

Google offers only a base version of Android under a free software licence, but bundles it together with proprietary apps and services. The non-free components of most stock Android devices have numerous downsides for users, as EDRi member Free Software Foundation Europe (FSFE) has documented since 2012. The current case demonstrates that even tech giants like Huawei face similar dependencies and vendor lock-in effects as any individual users if they rely on proprietary software.

The following lessons can be drawn from this case:

  1. Users should favour free software operating systems and applications on their computing devices. With proprietary software, they are on the receiving end only, and vendors may deny them access to crucial security updates. Free software enables control of technology, and the more important that technology becomes in our daily lives, the more relevant free software becomes for users. For Android, the FSFE helps users to regain more control with its Free Your Android initiative.
  2. Governments and especially the European Union should invest more resources in free software to gain independence from large enterprises and other states. The current case highlights the lack of influence the EU has on outside technology providers. Instead of waiting for a future European IT monopolist to enter the stage, the EU and its Member States should invest in free software development and focus on supporting local free software organisations as well as businesses. This would effectively foster the inner-European market and enable independence for European citizens and the EU economy. This step is essential for avoiding exposing European infrastructure to shutdowns controlled by external factors.
  3. Companies should use as much free software as possible in their supply chains. Proprietary software makes a company dependent on its vendor and this vendor’s government. The current case shows that the US was able to force Google to stop delivery of its proprietary products – but could not stop delivery of the free software components of Android. Had Huawei invested more resources in free software apps and services, the US strategy would not have hit them as hard. Although the current events are linked to the scrutiny the Chinese company is under right now, it is obvious that this could happen to any other company based in any other country as well.

The earlier allegations against Huawei already showed that code for all critical infrastructure should be published under a free software licence. The latest episode of the Huawei affair illustrates that the same applies to apps and services. Just days before the European elections, this should be a wake-up call for the next constituent Parliament to ask the European Commission for European Directives that foster independence of European technical infrastructure and that build on free software, starting with the demand to release publicly funded software as public code.

Free Software Foundation Europe (FSFE)
https://fsfe.org/

Three conclusions to draw from Google denying Huawei access to software (20.05.2019)
https://fsfe.org/news/2019/news-20190520-01.en.html

Free Your Android!
https://freeyourandroid.org

Public Money, Public Code
https://publiccode.eu

Huawei case demonstrates importance of Free Software for security (05.02.2019)
https://fsfe.org/news/2019/news-20190205-01.en.html

(Contribution by EDRi member Free Software Foundation Europe – FSFE, Europe)

22 May 2019

ePrivacy: Private data retention through the back door

By Digitalcourage

Blanket data retention has been prohibited in several court decisions by the European Court of Justice (ECJ) and the German Federal Constitutional Court (BVerfG). In spite of this, some of the EU Member States want to reintroduce it for the use by law enforcement authorities – through a back door in the ePrivacy Regulation.

The ePrivacy Regulation

The ePrivacy Regulation, which is currently under negotiation, is aimed at ensuring the privacy and confidentiality of electronic communications by complementing and particularising the matters covered by the General Data Protection Regulation (GDPR). Confidentiality of communications is currently covered by the ePrivacy Directive, which dates back to 2002. A review of this piece of legislation is long overdue, but Member States keep delaying the process, thereby failing to update necessary protections for online privacy in the EU.

Ever since 2017, the EU Ministers of Justice and Interior have been “deliberating” the Tele2 verdict by the European Court of Justice. The Court had declared the blanket retention of telecommunications metadata inadmissible. Yet the EU Member States are unwilling to accept this ruling. During an informal discussion in Valletta on 26 and 27 January 2017, the Justice and Interior Ministers expressed their wish for “a common reflection process at EU level on data retention in light of the recent judgments of the Court of Justice of the European Union” (Ref. EU Council 6713/17) to implement EU-wide data retention. This process was set in motion in March 2019 by the Presidency of the Council of the European Union. A sub-group of the Council’s Working Party on Information Exchange and Data Protection (DAPIX) was put in charge. From the very beginning, this reflection process has mainly served the purpose of finding opportunities to implement yet another instance of data retention on the EU level. This has been proven by documents published by EDRi member Statewatch.

Instead of complying with the clear rulings by the European Court of Justice (Tele2 and Digital Rights Ireland), the responsible ministers are doing everything they can to “resurrect” data retention, potentially using ePrivacy as a basis for a new era of data retention. In a working document (WK 11127/17), the Presidency of the EU Council in 2017 concluded that, in addition to specific data retention legislation, it would be desirable for the ePrivacy Regulation to allow citizens’ communications data (metadata) to be retained so that companies can use it for commercial purposes. The logic behind this is, presumably, to circumvent CJEU case law: no obligation is imposed on companies, yet the data remains available whenever law enforcement needs it, thanks to ePrivacy.

Private data retention

In plain words, this means: If the courts will not allow mass data retention, service providers will simply be given incentives to retain data of their own accord. That is why the ePrivacy Regulation is being watered down by Member States in order to give service providers manifold permissions to store data for a wide variety of reasons (see Article 6 of the draft ePrivacy Regulation). Those responsible are relying on the assumption that the providers’ appetite for data will be sufficient even without an explicit obligation to retain it.

The immediate problem with this type of private data retention is the fact that it weakens the protection of all users’ personal data against data hungry corporations whose main interest is making profit. What’s even worse is that, once again, a governmental function is being outsourced to private corporations. These corporations are not subject to democratic scrutiny, and they are given ever more power over the countries concerned.

In Germany, the hurdles for criminal investigators to get access to data are already very low. The e-mail provider Posteo, for example, had to pay a fine because it was unable to provide criminal investigators with the IP addresses from which a certain e-mail account had been accessed. Posteo simply had not stored those data: they were erased as soon as they were received. The court declared the fine to be justified. This decision could easily lead private companies to err on the side of caution and store even more data, just to avoid such fines.

The draft ePrivacy Regulation as proposed by the European Commission in 2017 placed relatively strict data protection duties on service providers. For example, they were obliged to either erase or anonymise all data that was no longer needed. This is diametrically opposed to the goal of private data retention, and the DAPIX task force noticed it, too. As the Presidency of the EU Council stated, service providers will be given the freedom to use and store data in order to prevent “fraudulent use or abuse” – and these data could then be picked up by law enforcement conducting criminal investigations.

No data retention through the back door!

EDRi member Digitalcourage wanted to know how the German government argued with respect to the data retention issue, and submitted a request for the disclosure of documents related to it. Unfortunately, the request was largely denied by the Council of the European Union, long after the legal deadline had passed. The secretariat declared that a disclosure would be a threat to public safety – the risk to the relationship of trust between the Member States and Eurojust, the EU agency dealing with judicial cooperation in criminal matters among agencies of the Member States, would be too severe. Furthermore, such a disclosure would threaten ongoing criminal investigations or judicial procedures. No further details were given. Digitalcourage lodged an appeal against this dismissal but, apart from being asked for patience, has not received an answer from the European Commission. Several requests pursuant to the Freedom of Information Act have also been submitted to German ministries.

It is unbelievable to imagine policy makers contemplating existing and potential new surveillance laws that would clearly be illegal. However, this is exactly what the DAPIX task force is doing, and it is doing so behind closed doors. The changes it proposes can be found in the current draft ePrivacy Regulation. Digitalcourage will continue to request documents from the EU and the German government. As soon as the trilogue negotiations between the EU Council, Commission and Parliament begin, Digitalcourage will voice its concerns and its demand: No data retention through the back door!

This article was first published at https://digitalcourage.de/blog/2019/eprivacy-private-data-retention-through-the-back-door

Digitalcourage
https://digitalcourage.de/en

ePrivacy: Private data retention through the back door (in German, 18.04.2019)
https://digitalcourage.de/blog/2019/eprivacy-private-vorratsdatenspeicherung-durch-hintertuer

(Contribution by EDRi member Digitalcourage, Germany)

22 May 2019

Christchurch call − pseudo-counter-terrorism at the cost of human rights?

By Claire Fernandez

The Prime Minister of New Zealand Jacinda Ardern showed compassionate and empathetic leadership in her response to the Christchurch terrorist attack on two mosques in her country on 15 March 2019. On 16 May in Paris, Ardern and French President Emmanuel Macron co-launched the Christchurch Call to Action to Eliminate Terrorist and Violent Extremist Content Online.

The day before, EDRi joined a meeting the New Zealand government held with civil society and academics. The purpose of the meeting was to present the call and to hear recommendations moving forward on the call implementation and joint work to combat terrorism and white supremacy.

While the approach of the New Zealand government is sensible, and the final text of the call to action does include human rights safeguards for a free and open internet, the initiative is naïve, as it relies on questionable company and government practices; it is inefficient in combating terrorism, and it opens the door to serious human rights breaches.

A “sacrificed process”

In the words of Ardern herself, civil society consultations were “sacrificed” to allow for a swift process and for the call to be launched on the occasion of the Tech for Good conference and the G7 Digital Ministers meeting. NGOs and other stakeholders such as journalists, academics and the technical community did not get a chance to submit contributions before the finalisation of the call. The rushed timeline was an obstacle to any meaningful participation in the process. The lack of anti-racism organisations or organisations from the Global South in the consultative meeting in Paris is a major gap for an initiative purporting to address “violent extremism” globally.

Failure to address social media business model

The call to action refrains from criticising and questioning the business model of Google, Amazon, Facebook, and Apple in order to get them to sign the initiative. However, as long as profit is made mainly from behavioural advertising revenue which increases by showing polarising, violent or illegal content, the entire system will continue to promote such content and lead people to share it. Human nature and all of its addictions are encouraged and amplified by opaque artificial intelligence.

Human rights concerns

As state authorities are unable to call out big tech for these larger issues, the Christchurch call places its emphasis on the removal and filtering of broad and ill-defined categories of content. What counts as “terrorist and violent extremist” content is left to the discretion of law enforcement authorities and companies, which opens risks of arbitrary action against legitimate dissent from groups at risk of racism, human rights defenders, civil society organisations or political activists. Solutions such as upload filters or rapid removal of content can be turned into censorship and are error-prone, as the European Commission acknowledges by stating that “biases and inherent errors and discrimination can lead to erroneous decisions”. Invaluable and unique evidence of human rights abuses committed by groups or governments can also disappear, as examples from the war in Syria show. The UN Special Rapporteur on human rights and counter-terrorism estimates that around 67% of people affected by counter-terrorism or security policies are human rights defenders.

Handing policing powers and the regulation of freedom of expression over to the private sector, with no accountability or possibility of redress, is highly problematic for the rule of law. Companies’ terms of service do not replace laws when it comes to assessing what is legal and what is not. In addition to this problem, there should be redress mechanisms to review whether in fact only illegal terrorist content has been removed − otherwise human rights will be at risk in countries that do not have the same respect for the rule of law as New Zealand.

Algorithms used to prevent uploads or to delete content are not transparent, and do not allow for accountability or redress mechanisms. Therefore, unaccountable removal of content and incentives for over-removal of content must be explicitly rejected. Likewise, law enforcement authorities must be held accountable by being obliged to submit transparency reports regarding their requests to remove content, including the number of investigations and criminal cases opened as a result of these requests. There are many initiatives addressing the broad range of “harmful online content”, such as the upcoming G7 Biarritz Summit, France’s and the UK’s online harms/platform duties proposals, and the EU Regulation on Terrorist Content Online. The overall impact of initiatives that risk limiting freedom of expression needs to be evaluated based on evidence. This is currently not the case.

Broader societal efforts are needed to effectively combat terrorism – online and offline. These include education, social inclusion, questioning the impact of austerity, accountability for politicians using hate speech and stigmatising rhetoric, and real community involvement.

What the YouTube and Facebook statistics aren’t telling us (24.04.2019)
https://edri.org/what-the-youtube-and-facebook-statistics-arent-telling-us/

Commission working document – Impact assessment accompanying the Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online (12.09.2018)
https://ec.europa.eu/commission/sites/beta-political/files/soteu2018-preventing-terrorist-content-online-swd-408_en.pdf

Report of the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism on the role of measures to address terrorism and violent extremism on closing civic space and violating the rights of civil society actors and human rights defenders (18.02.2019)
https://www.ohchr.org/Documents/Issues/Terrorism/SR/A_HRC_40_52_EN.pdf

(Contribution by Claire Fernandez, EDRi)

22 May 2019

Why should we vote in the EU elections?

By EDRi

What are your plans for the coming days? We have a suggestion: The European elections will take place – and it’s absolutely crucial to go and vote!

In the past, the EU has often defended our digital rights and freedoms. This was possible because the Members of the European Parliament (MEPs) – who we, the EU citizens, elected to represent us in the EU decision-making – are open to hearing our concerns.

So, what exactly has the EU done for our digital rights?

Privacy

The EU has possibly the best protection for citizens’ personal data: the General Data Protection Regulation (GDPR). This law was adopted thanks to some very dedicated European parliamentarians, and it enhances everyone’s rights, regardless of nationality, gender, economic status and so on. Since the GDPR came into effect, we now have, for example, the right to access the personal data a company or an organisation holds on us, the right to explanation and human intervention regarding automated decisions, and the right to object to profiling measures.

You can read more about your rights under the GDPR here: https://edri.org/a-guide-individuals-rights-under-gdpr/

Net neutrality

Europe has become a global standard-setter in the defence of the open, competitive and neutral internet. After a very long battle, and with the support of half a million people that responded to a public consultation, the principles that make the internet an open platform for change, freedom, and prosperity are upheld in the EU.

In June 2015, negotiations between the three European Union institutions led to new rules to safeguard net neutrality – the principle according to which everyone can communicate with everyone on the internet without discrimination. This principle was put at risk by the ambiguous, unbalanced EU Commission proposal, which would have undermined the way in which the internet functions. In 2016, the Body of European Regulators for Electronic Communications (BEREC) was tasked with publishing guidelines to provide a common approach to implementing the Regulation in the EU Member States. In June 2016, BEREC published the draft guidelines that confirm strong protections for net neutrality and open internet.

ACTA

In 2012, the MEPs voted against an international trade agreement called the Anti-Counterfeiting Trade Agreement (ACTA), which, if concluded, would likely have resulted in online censorship. It would have had major implications for freedom of expression, access to culture and privacy, and it would have harmed international trade and stifled innovation. People therefore decided to demonstrate, and there were protests against the draft agreement in over 200 European cities calling for its rejection. In the end, the Parliament listened to the concerns of the people and voted against ACTA.

Protecting whistleblowers

Whistleblowers fight for transparency, democracy and the rule of law, reporting unlawful or improper conduct that undermines the public interest and our rights and freedoms. In 2017, the European Parliament called for legislation to protect whistleblowers, making a clear statement recognising the essential role of whistleblowers in our society. This Resolution started the process of putting in place effective protections for whistleblowers throughout the EU. In April 2019, the Parliament adopted the new Directive, which still has to be approved by the EU Council.

Your vote matters for digital rights

On many occasions, the EU Parliamentarians have stood up for our rights and freedoms. It’s important that the new EU Parliament also be a strong defender of our digital rights – because there are so many important fights coming up.

The European elections are one of the rare occasions where we can take our future and the future of Europe into our own hands. Your vote matters. Please go and vote for digital rights on 23-27 May!

You can find more information about the elections online, for example at https://www.european-elections.eu, https://www.thistimeimvoting.eu/ and https://www.howtovote.eu/.

22 May 2019

Facebook lies to Dutch Parliament about election manipulation

By Bits of Freedom

On 15 May 2019, Facebook’s Head of Public Policy for the Netherlands spoke at a round table in the House of Representatives about data and democracy. The Facebook employee reassured members of parliament that Facebook has implemented measures to prevent election manipulation. He stated: “You can now only advertise political messages in a country, if you’re a resident of that country.” Nothing seems to be further from the truth.

Dutch EDRi member Bits of Freedom wanted to know whether it was possible to target Dutch voters from a foreign country, using the type of post and method of advertising that were employed in, among others, the “Leave” campaign in the UK. From Germany, they logged in to a German Facebook account, created a new page, and uploaded a well-known Dutch political meme. They then paid to have it shown to Dutch voters and settled the bill using a German bank account. Contrary to what Facebook led members of parliament to believe, nothing stood in the way of their doing so.

The other way around was just as easy. Facebook failed to stop Bits of Freedom from targeting German voters interested in the German political parties Christian Democratic Union of Germany (CDU) and Alternative for Germany (AfD) with a CDU/AfD meme, even though they were using a Dutch Facebook account, had signed in from the Netherlands, and paid for the ad with a Dutch bank account. Better yet, Facebook suggested adding people with the additional interests “nationalism” and “military” to their target demographic. Thanks, Facebook!

We’re not dealing with a company that occasionally messes up. Facebook has time and time again exhibited a complete disregard for our democracy, freedom of expression, and privacy. Therefore, Bits of Freedom called on the House of Representatives to take action. On 20 May, on the Dutch current affairs television program “Nieuwsuur”, Labour Party (PvdA) leader Lodewijk Asscher responded: “Facebook promises to do better, and time and time again their promises prove worthless. Facebook says all the right things but in reality is a threat to democracy.” Liberal MP Kees Verhoeven (D66) added: “As far as I’m concerned, now is the time we stop relying on self-regulation and trusting companies’ promises, and start regulating.”

This article was first published at https://www.bitsoffreedom.nl/2019/05/21/facebook-lies-to-dutch-parliament-about-election-manipulation/

Bits of Freedom
https://www.bitsoffreedom.nl/

Nieuwsuur: Facebook lies about political advertising (only in Dutch, 20.05.2019)
https://www.npostart.nl/nieuwsuur/20-05-2019/VPWON_1297109

Steps you can take to minimise the political ads you see online (19.05.2019)
https://privacyinternational.org/long-read/2913/steps-you-can-take-minimise-political-ads-you-see-online

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands)

20 May 2019

NGOs call to ensure fundamental rights in copyright implementation

By EDRi

Today, on 20 May 2019, EDRi and 41 other organisations sent an open letter to the European Commission. The letter is calling for the inclusion of civil society in the implementation process of the newly adopted Copyright Directive through the upcoming stakeholder dialogue.

The stakeholder dialogue is a consultation process mandated by the Copyright Directive. It will serve as an opportunity for relevant stakeholders to discuss the transposition and implementation of the infamous Article 13 (Article 17 in the final text) of the Copyright Directive.

The signatories of the letter have on numerous occasions throughout the legislative debate on the copyright reform expressed their explicit concerns about the fundamental rights questions that will arise during the implementation of the Directive.

The letter highlights that the participation of organisations representing internet users in the consultation process is crucial for ensuring that fundamental rights are properly considered, especially in cases where the Directive requires internet platforms to disable access to or remove user-uploaded content. A diverse working group can ensure that the fears around automated upload filters are not realised. It can assist in creating guidelines under which both content-sharing service providers and rightsholders respect the Charter of Fundamental Rights of the European Union.

You can read the letter below, or download it here (pdf).


20 May 2019

Dear President Juncker,
Dear First Vice-President Timmermans,
Dear Vice-President Ansip,
Dear Commissioner Gabriel,
Dear Director General Roberto Viola,

The undersigned stakeholders represent fundamental rights organizations, the knowledge community (in particular libraries), free and open source software developers, and communities from across the European Union.

The new Directive on Copyright in the Digital Single Market has been adopted and, as soon as it is published in the Official Journal, Member States will have two years to implement the new rules. Article 17, on ‘certain uses of protected content by online services’, foresees that the European Commission will issue guidance on the application of this Article.

The undersigned organisations have, on numerous occasions throughout the legislative debate on the copyright reform, expressed their very explicit concerns (1) about the fundamental and human rights questions that will appear in the implementation of the obligations laid down on online content-sharing service providers by Article 17. These concerns have also been shared by a wide variety of other stakeholders, the broad academic community of intellectual property scholars, as well as Members of the European Parliament and individual Member States. (2)

We consider that, in order to mitigate these concerns, it is of utmost importance that the European Commission and Member States engage in a constructive transposition and implementation to ensure that the fears around automated upload filters are not realized.

We believe that the stakeholder dialogues and consultation process foreseen in Article 17(10) to provide input on the drafting of guidance around the implementation of this Article should be as inclusive as possible. The undersigned organisations represent consumers and work to enshrine fundamental rights into EU law and national-level legislation.

These organisations are stakeholders in this process, and we call upon the European Commission to ensure the participation of human rights and digital rights organisations, as well as the knowledge community (in particular libraries), free and open source software developers, and communities in all of its efforts around the transposition and implementation of Article 17. This would include the planned Working Group, as well as other stakeholder dialogues, or any other initiatives at consultation level and beyond.

Such broad and inclusive participation is crucial for ensuring that the national implementations of Article 17 and the day-to-day cooperation between online content-sharing service providers and rightholders respects the Charter of Fundamental Rights by safeguarding citizens’ and creators’ freedom of expression and information, whilst also protecting their privacy. These should be the guiding principles for a harmonized implementation of Article 17 throughout the Digital Single Market.

Yours sincerely,
Balázs Dénes
Executive Director
Civil Liberties Union for Europe (Liberties)

Association for Progressive Communications
APADOR-CH
ApTi Romania
Article 19
Associação D3 – Defesa dos Direitos Digitais
Associação Nacional para o Software Livre – Portugal
Bits of Freedom
BlueLink Foundation
Center for Media & Democracy
Centrum Cyfrowe Foundation
Civil Liberties Union for Europe
Coalizione Italiana Libertà e Diritti civili
COMMUNIA association for the Public Domain
Creative Commons
Digitalcourage
Digitale Gesellschaft e. V.
Electronic Frontier Finland
Electronic Frontiers Foundation
Elektronisk Forpost Norge
epicenter.works
European Digital Rights (EDRi)
Fitug e.v.
Hermes Center
Hivos
Homo Digitalis
Human Rights Monitoring Institute
Hungarian Civil Liberties Union
Index on Censorship
International Federation of Library Associations and Institutions (IFLA)
Irish Council for Civil Liberties
IT-Pol Denmark
La Quadrature du Net
Metamorphosis Foundation
Nederlands Juristen Comité voor de Mensenrechten (NJCM)
Open Rights Group
Peace Institute
Privacy First
Rights International Spain
Vrijschrift
Wikimedia Deutschland e. V.
Wikimedia Foundation
Xnet

1 Human rights and digital rights organisations: https://www.liberties.eu/en/news/delete-article-thirteen-open-letter/13194
2 Academics from the leading European research centres: https://www.create.ac.uk/blog/2019/03/24/the-copyright-directive-articles-11-and-13-must-go-statement-from-european-academics-in-advance-of-the-plenary-vote-on-26-march-2019/
Max Planck Institute: https://www.ip.mpg.de/fileadmin/ipmpg/content/stellungnahmen/Answers_Article_13_2017_Hilty_Mosconrev-18_9.pdf
Universities: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3054967
Researchers: https://www.southampton.ac.uk/assets/imported/transforms/content-lock/UsefulDownloads_Download/A6F51035708E4D9EA3582EE9A5CC4C36/Open%20Letter.pdf
UN special rapporteur on the promotion and protection of the right to freedom of opinion and expression: https://www.ohchr.org/Documents/Issues/Opinion/Legislation/OL-OTH-41-2018.pdf

Background

After two and a half years of inter-institutional negotiations, the European Parliament adopted the Directive on Copyright in the Digital Single Market on 26 March 2019. The Directive obliges the European Commission to hold stakeholder dialogues and a consultation process after it is published in the Official Journal of the European Union. Member states will then have two years to transpose the Directive into national law. Regarding Article 17, the European Commission will issue guidance on its application in order to assist the national implementation processes.

Read more

Open letter on the Copyright Directive stakeholder dialogue (20.05.2019)
https://edri.org/files/copyright/20190517-EDRI_copyright_open_letter.pdf

EU Member States give green light for copyright censorship (15.04.2019)
https://edri.org/eu-member-states-give-green-light-for-copyright-censorship/

Filters Incorporated (19.04.2019)
https://edri.org/filters-inc/

Censorship machine takes over internet (26.03.2019)
https://edri.org/censorship-machine-takes-over-eu-internet/

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

close
15 May 2019

EDRi strategic planning: A collective journey towards a strengthened network

By Claire Fernandez

On 6 and 7 April, the European Digital Rights (EDRi) network held its General Assembly in London. EDRi members elected three new Board members and a new President, Anna Fielder, a long-standing privacy expert, who succeeds Andreas Krisch – EDRi’s President of ten years, who led the network through years of achievements and milestones.

This General Assembly also marked the kick-off of our strategic planning process, which will lead to the adoption of EDRi’s strategy for the next five years in 2020. This is the beginning of a year-long journey of consultations and reflection to develop a new plan for continuing to work together efficiently as the leading European network of NGOs successfully defending human rights in the digital environment.

We are committed to a transparent and inclusive process to strengthen the network’s theory of change. Looking back at our past achievements, we will examine long-standing and emerging topics, tactics, and network and organisational development. At the heart of the strategy, we will put efforts to strengthen the network of organisations, and to equip the individuals working for them to deal with rising challenges and to feel safe and empowered to be actors of change.

A new digital deal

In a changing environment, EDRi must adjust to stay relevant and offer redress to people affected by human rights violations online.

EDRi’s recent achievements on data protection, privacy and freedom of expression make our network more relevant than ever. Since our last strategy was adopted in 2015, not a day has gone by without data exploitation being uncovered, states attempting to curb freedom of expression online, or digitalisation increasing both opportunities and risks for people. While more actors embrace this change, only a few continue to prioritise human rights as their key concern.

The weight of corporate lobbying makes it hard for civil society organisations to make a meaningful contribution. “Alone you go faster, but together you go further,” says an African proverb. New connections with the broader human rights movement are needed to make our voice heard, while EDRi’s expertise and unique perspective remain intact and essential.

What will we do?

As a network of NGOs, we will look at some of the following questions to provide strategic guidance, whether for our collective or individual work:

  • What have we learned since the last strategy in terms of organisation and work areas and how does this inform our strategic process?
  • How can we better define EDRi’s vision and mission?
  • What is the future of digital rights in Europe?
  • What are the changes we want to see happening in the field of digital rights in Europe to ensure protection of human rights in the digital environment for all?
  • Who should we work with inside and outside the “digital rights bubble”?
  • How is the change most likely to happen? What are some methods and decision-making processes we need to get there?
  • What kind of resources do we need in order to achieve our goals?

Next steps on our journey

Based on a first survey of members and observers, we held a fruitful workshop with the participants of our General Assembly. An advisory group of members and Board members has been established to provide expert input, and we are currently processing the data collected to date. At the end of May, the Brussels office staff and Board will meet for a day to work further on the draft strategy, which will be followed by a consultation within the membership and with external advisors. The second half of the year will be dedicated to formal reviews of the draft strategy.

We would like to thank our UK-based members Open Rights Group, Article 19 and Privacy International for hosting the General Assembly in London, as well as FabRiders and Aspirations for the skillful facilitation and invaluable advice. We welcome an ongoing dialogue with you on how we can aspire to a strategy that provides value for EDRi members, observers, and the broader movement. If you want to contribute to this collective reflection, please get in touch!

close
15 May 2019

NGOs and academics warn against Deep Packet Inspection

By Jan Penfrat

Today, on 15 May 2019, European Digital Rights, together with 45 NGOs, academics and companies from 15 countries sent an open letter to European policymakers and regulators warning against the widespread use of privacy-invasive Deep Packet Inspection (DPI) technology in the EU. The letter addresses the ongoing negotiations of Europe’s new net neutrality rules, in which some telecom regulators appear to be pushing for the legalisation of DPI technology.

Deep Packet Inspection allows telecom companies to examine the content of our communications. Information about which apps we use, which videos we watch, and which news articles we read should be off limits for the telecom industry. Yet, with the proliferation of zero-rating in all but two European countries, the industry has started to deploy DPI equipment on a large scale in order to charge certain data packages differently, or to throttle services and cram more internet subscribers into a network already running over capacity.
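
The difference between ordinary routing and DPI can be sketched in a few lines: header-only inspection sees addresses and ports, while a DPI middlebox parses the application-layer payload to identify the specific service a user is contacting. The following is a minimal illustration only; the hostnames and the HTTP-based classification are hypothetical examples, not a description of any real operator’s equipment:

```python
# Sketch of the difference between header-only inspection and DPI.
# A DPI middlebox reads application-layer content -- here, the HTTP Host
# header -- to classify traffic per service, e.g. for zero-rating or throttling.

def shallow_inspect(dst_port):
    """Header-only inspection: only addresses and ports are visible."""
    return "web" if dst_port in (80, 443) else "other"

def deep_inspect(payload):
    """DPI: parse the application layer to identify the actual service."""
    for line in payload.split(b"\r\n"):
        if line.lower().startswith(b"host:"):
            return line.split(b":", 1)[1].strip().decode()
    return None

packet = b"GET /watch HTTP/1.1\r\nHost: video.example.com\r\n\r\n"

# Header inspection only learns that this is "web" traffic on port 80;
# DPI learns exactly which service the subscriber is contacting.
```

Even with encrypted traffic, comparable classification is possible in practice (for example via the unencrypted TLS Server Name Indication), which is why the letter’s signatories consider the technology privacy-invasive by design.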

EDRi and its members have for many years advocated in favour of strong net neutrality rules that protect people’s privacy and prevent the discrimination of selected types of internet traffic.

Europe’s current net neutrality rules do indeed ban DPI technology that examines specific user information for the purpose of treating traffic differently. Yet a mapping of zero-rating offers in Europe conducted by EDRi member Epicenter.works identified 186 telecom services which potentially make use of DPI technology. Most regulators have so far turned a blind eye to these net neutrality violations. Instead of fulfilling their enforcement duties, they now seem to be aiming to water down the rules that prohibit DPI.

The negotiations of Europe’s new net neutrality rules are expected to continue behind closed doors and will be followed by a public consultation in autumn 2019. The final rules are then expected to be decided in March 2020.

EDRi and its member organisations will continue to fight for strong net neutrality rules in Europe that protect people’s privacy and prevent the discrimination of selected types of internet traffic.

You can download the letter here (pdf).

Read more:

Net neutrality wins in Europe! (29.08.2016)
https://edri.org/net-neutrality-wins-europe/

Zero rating: Why it is dangerous for our rights and freedoms (22.06.2016)
https://edri.org/zero-rating-why-dangerous-for-our-rights-freedoms/

A study evaluates the net neutrality situation in the EU (13.02.2019)
https://edri.org/a-study-evaluates-the-net-neutrality-situation-in-the-eu/

close
08 May 2019

It starts with free Pokémon Go, it ends with Bolsonaro

By Bits of Freedom

Chile was the first country in the world to have a net neutrality law, but it is not enforced at all. A simple search across mobile internet providers shows a wide range of “free” data offers for platforms such as Facebook, Twitter, Instagram, Spotify, or Pokémon Go. This is called “zero-rating” and means people don’t have to pay for the data used by certain services the way they would for others. It’s a violation of net neutrality.
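
In billing terms, zero-rating simply means that traffic to favoured services is excluded from the user’s metered data. A minimal sketch makes the incentive visible; the service names and traffic figures below are hypothetical:

```python
# Hypothetical illustration of how zero-rating skews a prepaid data cap:
# traffic to zero-rated hosts does not count against the metered total.

ZERO_RATED = {"facebook.com", "instagram.com", "spotify.com"}  # hypothetical list

def billed_megabytes(sessions):
    """Sum only the traffic that counts against the user's data cap."""
    return sum(mb for host, mb in sessions if host not in ZERO_RATED)

sessions = [
    ("facebook.com", 300),      # free under zero-rating
    ("news-site.example", 50),  # billed
    ("spotify.com", 500),       # free under zero-rating
    ("wikipedia.org", 20),      # billed
]

# Of 870 MB consumed, only 70 MB count against the cap -- a strong
# economic push toward the zero-rated platforms.
```

For a prepaid user optimising a small top-up, the rational choice is to stay inside the zero-rated services, which is exactly the class divide described below.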

These perks are crucial in the decisions of millions of prepaid phone users who need to optimise their top-ups. This has led to a class divide where those with the economic means have access to the unlimited options of the internet, while those who need to be mindful of their expenses are constrained to the services of big tech corporations.

Class is central in the discussion about net neutrality. Supporters of net neutrality claim that without this regulatory framework users would have differentiated packages according to their economic means, and consequently there would be a first- and a second-class internet. Those in favour of zero-rating – and against net neutrality – refer to the same class divide, but now as an argument towards mitigating the cost of data plans for those in economic need.

Nobody wants to be the villain who opposes free Pokémon Go.

But the dystopia of corporations that are permitted to offer their services zero-rated doesn’t end there. The profound social infiltration of services owned by Mark Zuckerberg has led to scenarios in which entire communities rely on Whatsapp groups or Facebook fan pages as sources of information. Do you see where I’m going? Cambridge Analytica, anyone?

On 1 January 2019, Jair Bolsonaro became the president of Brazil. He is a right-wing politician who is in favour of torture, of the destruction of the Amazon rainforests, and of the criminalisation of homosexuality. The Guardian prepared a piece on how Whatsapp, a service used by 120 million Brazilians, proved to be a very effective tool to mobilise support for Bolsonaro. Whatsapp was used to promote his fascist promises, harass users who questioned these proposals and, of course, to send out big shipments of fake news.

According to the information that circulated in these Whatsapp groups, Bolsonaro’s opponent wanted to legalise pedophilia and incest, and his rival party was preparing a mandatory “gay kit” for 6-year-olds in Brazil’s public schools. It is very easy to dismiss this as fake news if you have unlimited internet access to fact-check, or if you have a support system of informed people who will tell you the truth. But what happens when you’re restricted to a single-platform ecosystem where those calling out fake news are harassed, and where support for it is amplified by likes and social acceptance?

The relationship between zero-rating and the spread of fascism might at first sight seem very distant. However, a closer look at the human motivations and interactions that take place in these virtual “free” spaces, and at the economic interests of tech businesses, reveals a systematic information attack on the most vulnerable users of mobile internet, who are forced to inhabit these environments of digital garbage.

Advocates and policymakers, who are well-versed in internet topics and hold the privilege of accessing secure and legitimate communications and information channels, can choose to blame the users of these services. They can choose to expect people to ignore fake news and spend their limited megabytes on the interactive visualisations of the New York Times instead of on Facebook with the people they know. However, it will be much more fruitful to work on strategies that guarantee a strict enforcement of net neutrality – including a ban on zero rating – through an interdisciplinary approach that includes community, tech and regulatory work.

It is important to fight for a vision where, at least in theory, all of us, regardless of our economic situation, are able to access and participate in an ecosystem of truthful information and open collaboration. We cannot abandon those with fewer means to the digital junk content that promotes fascism and generates toxic revenue for the big internet platforms.

This article was first published at https://www.bitsoffreedom.nl/2019/04/29/it-starts-with-free-pokemon-go-it-ends-with-bolsonaro/.

Zero rating: Why it is dangerous for our rights and freedoms (22.06.2016)
https://edri.org/zero-rating-why-dangerous-for-our-rights-freedoms/

Two years of net neutrality in Europe – 31 NGOs urge to guarantee non-discriminatory treatment of communications (30.04.2019)
https://edri.org/two-years-of-net-neutrality-in-europe-29-ngos-urge-to-guarantee-non-discriminatory-treatment-of-communications/

(Contribution by Danae Tapia, Mozilla fellow at EDRi member Bits of Freedom, the Netherlands)

close
08 May 2019

Austria: New “responsibility” law will lead to self-censorship

By Epicenter.works

Shortly after the EU gave the green light to upload filters, two laws were proposed in Austria with the alleged goal of tackling online hate speech – and they ring alarm bells.

The law “on care and responsibility on the net” forces media platforms with forums to store detailed data about their users in order to hand it over, in case of a possible offence, not only to police authorities but also to other users who want to take legal action against another forum user. Looking at the law in detail, it is obvious that it contains so many problematic passages that its intended purpose is completely undermined.

According to the Minister of Media, Gernot Blümel, harmless software will handle the personal data processing. One of the risks of such a system is the potential for abuse by public authorities or individuals who request a person’s name and address from a platform provider under the pretext of wanting to investigate or sue them − and then use the information for entirely different purposes. Rather than improving safety online, the resulting “chilling effect” will lead individuals to avoid sharing their most controversial opinions on a forum that holds their detailed personal data. In essence, this is a way of imposing self-censorship on individuals. The proposed laws would affect only a handful of platforms in Austria. The aim is quite clear: to diminish public democratic discourse and to intimidate those who think differently politically.

This law is not alone in restricting online freedoms. During its EU Council Presidency, Austria decided not to clearly reject unlawful data retention − a concept that the European Court of Justice has already ruled several times to be contrary to fundamental rights. Furthermore, back at home, the Austrian Federal Government is trying to collect as much data as possible about citizens, with new surveillance laws and porn filters based on the British model. The goal is clear: to monitor people’s every step on the internet and limit their leeway.

Austria’s Chancellor Sebastian Kurz argues that the internet is not a legal vacuum and that laws apply to the online world too. He’s right. The Charter of Fundamental Rights of the European Union also applies to the online world. We must not allow this creeping undermining and abolition of privacy, data protection, freedom of expression and participation in political discourse. It is time to stand up for these fundamental rights. Let us demand a transparent state instead of a transparent citizen!

Epicenter.works
https://epicenter.works/

EU Member States give green light for copyright censorship (15.04.2019)
https://edri.org/eu-member-states-give-green-light-for-copyright-censorship/

New EU data retention at Austria’s initiative
https://fm4.orf.at/stories/2975759/

(Contribution by Iwona Laub, EDRi member Epicenter.works, Austria)

close