11 Jul 2018

Danish High Court ruling on data retention use and file sharing cases

By IT-Pol

On 7 May 2018, the Eastern High Court in Denmark delivered a ruling that internet service providers (ISPs) are not required to disclose subscriber information in file sharing cases. This represents a major change from previous legal practice in Denmark, where rightsholders were routinely granted access to subscriber information for alleged file sharers, even when the identification required access to data retained under mandatory data retention rules.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Two Danish law firms have specialised in legal action against file sharers in cooperation with the German file sharing monitoring company MaverickEye UG, which is well known from similar activities in other European countries. MaverickEye monitors BitTorrent file sharing networks and collects IP addresses of the participants in the BitTorrent swarm. In order to verify that the copyrighted work is made available from the IP address in question, a small piece of the file is downloaded using a modified BitTorrent client. For each copyrighted work, MaverickEye provides information about IP addresses and timestamps to the relevant rightsholders or, in most cases, the specialised law firms that represent them. The next step is to seek a court order requiring the ISPs to identify the actual subscribers who have used the IP addresses at the specific time. This is the critical step, and legal practice varies among EU Member States.

If the subscriber names and addresses can be obtained from the ISPs, the law firm can either file a lawsuit demanding compensation for copyright infringement or send a letter to the subscriber with a proposed settlement for the case. The latter is generally the preferred option since lawsuits are expensive, and the alleged copyright infringement using BitTorrent is often limited to a single film or TV-series episode/season. In Denmark, the settlement offer from the law firm has typically been a payment of 200 to 300 euros for a single film. Only a handful of lawsuits have been filed with Danish courts, so most claims have either been settled or dropped if the subscriber denies having taken part in the alleged file sharing activity.

Based on Danish case law from three file sharing cases at the two High Courts around 2008, the subscriber does not automatically become legally responsible for file sharing from the IP address. The rightsholder must prove who has committed the file sharing act in order to obtain compensation. This burden of proof can be very difficult to meet if the subscriber, for instance, has an open WiFi network, has allowed guests to use his/her internet connection, or if there are several persons in the household. Most subscribers are probably not aware of this, so it is quite likely that many cases have been settled by paying the offered settlement amount of 200 to 300 euros.

The current wave of legal action started in 2014, and according to information from the recent High Court ruling of 7 May 2018, the two Danish law firms have obtained subscriber information for some 200,000 IP addresses. This shows the massive scale of the monitoring operation of file sharing networks by MaverickEye. Access to subscriber information for a large number of IP addresses has also been reported in Sweden by TorrentFreak, incidentally involving the same Danish law firm as in the present case.

Each court application for subscriber identification consists of a large number of IP addresses, for example 4,000 IP addresses in the case ruled on by the High Court on 7 May 2018. Because of the Danish data retention law, ISPs hold information about the assignment of dynamic IP addresses for 12 months, so there is no urgent need for the law firm to quickly seek a court order for subscriber identification when information about the file sharing activity has been received from MaverickEye. A large batch of IP addresses from the same ISP can be collected before seeking the court order for subscriber identification from that ISP.
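The batching step described above can be sketched in a few lines of code. This is a purely illustrative sketch: the record fields, the ISP address ranges, and the function names are all assumptions made for the example, not MaverickEye's or any ISP's actual systems. It groups swarm observations by the ISP operating each IP range, so that one court application can cover a large batch of addresses from the same ISP.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from ipaddress import ip_address, ip_network
from collections import defaultdict

# Illustrative record: what a swarm observation would minimally need to
# contain to support a subscriber-identification request (field names
# are assumptions, not an actual evidence format).
@dataclass(frozen=True)
class SwarmObservation:
    infohash: str        # identifies the torrent of the shared work
    ip: str              # peer IP address seen in the BitTorrent swarm
    seen_at: datetime    # timestamp; must fall inside the retention window

# Hypothetical ISP address allocations. A real request would rely on
# WHOIS/RIR data to map each IP address to the operating ISP.
ISP_RANGES = {
    "ISP-A": ip_network("192.0.2.0/24"),
    "ISP-B": ip_network("198.51.100.0/24"),
}

def batch_by_isp(observations):
    """Group observations per ISP, so one court application per ISP can
    cover a large batch of IP addresses, as described in the article."""
    batches = defaultdict(list)
    for obs in observations:
        addr = ip_address(obs.ip)
        for isp, net in ISP_RANGES.items():
            if addr in net:
                batches[isp].append(obs)
                break
    return dict(batches)

observations = [
    SwarmObservation("abc123", "192.0.2.17", datetime(2018, 1, 5, tzinfo=timezone.utc)),
    SwarmObservation("abc123", "198.51.100.4", datetime(2018, 1, 6, tzinfo=timezone.utc)),
    SwarmObservation("abc123", "192.0.2.99", datetime(2018, 1, 7, tzinfo=timezone.utc)),
]
batches = batch_by_isp(observations)
print({isp: len(obs_list) for isp, obs_list in batches.items()})  # {'ISP-A': 2, 'ISP-B': 1}
```

With a 12-month retention period there is no time pressure, so the law firm can accumulate observations until each per-ISP batch is large before filing a single application.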

Until recently, this assembly-line strategy by the two law firms to send letters to alleged file sharers did not meet any legal challenges. In most cases, Danish ISPs do not object to a court application for subscriber information, and there is no court hearing for the application. The sole purpose of the court order, which is granted without any objections, is to provide a legal basis for the ISP to disclose the personal data (subscriber information) to the rightsholder.

However, between 2016 and 2017, the large Danish ISPs finally changed their response strategy and started to object to the court applications for subscriber information. Besides the administrative cost of handling the large number of requests for subscriber information, and the increasing news media reporting of ISP customers complaining about file sharing allegations based on information obtained by law firms from their own ISP, the Tele2 data retention judgment (joined cases C-203/15 and C-698/15) of the Court of Justice of the European Union (CJEU) also played a major role.

According to the Tele2 judgment, general and undifferentiated (blanket) data retention is illegal under EU law. Moreover, access to the retained data, whether from (illegal) blanket data retention or targeted data retention, must be limited to what is strictly necessary. For criminal offences, the Tele2 judgment specifically states that access can only be granted for serious crime. Paragraph 115 of the Tele2 judgment does not completely rule out that access to the retained data can be granted for civil claims, as there is an indirect reference to the Promusicae case C-275/06. However, when access to the retained data for criminal offences is strictly limited to serious crime, it does not seem proportionate to grant access to the retained data in civil proceedings involving only a minor copyright infringement, such as file sharing of a single film or TV series.

In a case involving Telenor and TeliaSonera, the District Court of Frederiksberg considered the data protection issues (noting that it was unclear whether this had been done in previous cases), but on 24 October 2017 followed the established practice of ruling in favour of the rightsholder, that is, ordering the disclosure of subscriber information. The ISPs appealed the decision to the Eastern High Court. The ruling from the High Court on 7 May 2018, which reverses the ruling from the District Court and blocks disclosure of the subscriber information, is mainly based on an interpretation of the e-Privacy Directive 2002/58/EC and the case law of the CJEU in Tele2, Promusicae and Bonnier C-461/10.

The e-Privacy Directive imposes an obligation of confidentiality on ISPs with respect to the subscribers’ use of the internet. ISPs must delete traffic data, such as the assignment of dynamic IP addresses, when it is no longer needed for the purpose of the transmission of a communication. According to statements to the High Court given by Telenor, TeliaSonera and a third ISP not involved in the case (TDC), information about the assignment of dynamic IP addresses to individual subscribers is retained for at most 3-4 weeks for operational purposes. Therefore, the necessary information is only available in a special system for law enforcement access because of the Danish data retention law, which has a mandatory 12-month retention period.

The High Court then considers the case law of Promusicae and Bonnier, and notes that the e-Privacy Directive does not preclude national legislation which requires disclosure of subscriber information in civil proceedings on copyright infringement, but that it must be possible to consider the opposing interests in an application for disclosure.

In the present case, the High Court finds that there are compelling reasons against disclosure. The information needed to identify the subscribers is only available because of the data retention obligation, and the sole purpose of the data retention provisions is to enable the police to obtain access to retained data for the purpose of investigation and prosecution of criminal offences. The Court is aware that the civil claims cannot be pursued without access to subscriber information, and that it is likely that there has been a substantial copyright infringement. After balancing the opposing interests, the Court finds that this does not outweigh the confidentiality of communication for the subscribers under the e-Privacy Directive. Therefore, the request for disclosure of subscriber information is denied. The decisive factor in the High Court ruling is the Danish data retention law which limits access to the retained data for the purpose of investigation and prosecution of criminal offences.

Read more:

Denmark: Our data retention law is illegal, but we keep it for now, EDRi (08.03.2017)
https://edri.org/denmark-our-data-retention-law-is-illegal-but-we-keep-it-for-now/

ISPs Win Landmark Case to Protect Privacy of Alleged Pirates, TorrentFreak (08.05.2018)
https://torrentfreak.com/isps-win-landmark-case-protect-privacy-alleged-pirates-180508

Copyright Trolls Hit Thousands of Swedish ‘Pirates’ With $550 ‘Fines’, TorrentFreak (23.10.2017)
https://torrentfreak.com/copyright-trolls-hit-thousands-of-swedish-pirates-with-550-fines-171023/

(Contribution by Jesper Lund, IT-Pol, EDRi member, Denmark)

11 Jul 2018

ENDitorial: The Commission’s new filtering adventure

By Joe McNamee

In September 2017, the European Commission adopted a “Communication” on illegal content online, full of demands that somebody – but not them and not the Member States – should do something to fight illegal content online. With this move, the European Commission managed to generate some good publicity for itself.


In January 2018, the European Commission decided to push the limits of vagueness and made a statement calling for “more efforts and faster progress” in fighting illegal content. More than what? There are no benchmarks. Faster than what? There are no benchmarks, no safeguards, no review processes, no statistics beyond how much is removed more quickly – perfect for generating good press coverage.

The ink was hardly dry on the Communication before the European Commission, delighted by the good press coverage it was able to generate from such vague and populist demands, started writing a “Recommendation” on illegal content online, full of demands that somebody – but not them and not the Member States – should do something to fight illegal content online. This was eventually adopted in March 2018. The press coverage was favourable.

Like a drunk staggering from lamppost to lamppost, looking for support rather than illumination, the European Commission is now – again – looking for a new press release. In September, in the complete absence of any data about anything else except “more” being deleted “faster” and the need for even more to be deleted even faster, the European Commission plans to launch a Directive.

What will the Directive contain? Will we see review processes to protect free speech? No, except maybe the right to complain to internet companies in cases where they remove content on the basis of the law (i.e. never). Or measures to avoid counterproductive impacts that make the threat of terrorism greater? No. Or statistics on the actual impact beyond populist slogans of “more and faster”? No.

In a stunning example of the Commission’s current lack of self-awareness, it is planning to adopt the proposal for the legislation in the same week that its controversial legislation on copyright (which includes upload filtering) returns to the European Parliament.

So, in September, the news will be, the Commission is getting serious about illegal content online. After all somebody – but not the Commission and not the Member States – should really do something. Anything. Regardless of the cost.

Read more:

Commission’s position on tackling illegal content online is contradictory and dangerous for free speech (28.09.2017)
https://edri.org/commissions-position-tackling-illegal-content-online-contradictory-dangerous-free-speech/

Q&A on the Recommendation on Measures to “effectively tackle illegal content online” (01.03.2018)
https://edri.org/qa-the-recommendation-on-measures-to-effectively-tackle-illegal-content-online/

05 Jul 2018

Press Release: EU Parliamentarians support an open, democratic debate on Copyright Directive

By EDRi

EU Parliamentarians decided today, 5 July 2018, that the initial text proposed as the Copyright Directive Reform needs to be re-opened for edits. This will allow the 751 MEPs to propose amendments in September and call for the deletion of the notorious Article 13.

This positive vote happened because people stood up and demanded better legislation. Despite a huge campaign to discredit the views of citizens as being simply the product of ‘disinformation’, their voice was heard in the end,

said Diego Naranjo, Senior Policy Advisor at EDRi.

The European Parliament (EP) has taken note of the criticism raised by a large number of individuals, civil society groups, creators, academics and the inventor of the World Wide Web against the Censorship Machine. Despite the huge lobbying from the rightsholder industry, the EP has decided to have open discussions aimed at finding an optimal text that balances all views on the Directive and removes the risk of privatised censorship.

The European Parliament will now be able, in an open debate, to improve the text and defend freedom of expression ahead of the next elections

said Diego Naranjo, Senior Policy Advisor at EDRi.

Citizens have made themselves heard via thousands of emails and tweets, hundreds of calls and a petition signed by nearly 1 million people. Representatives who preserve their citizens’ freedoms will reap the benefits during the 2019 elections.

Read more:

Save Your Internet
https://saveyourinternet.eu

Action plan against the first obligatory EU internet filter (28.06.2018)
https://edri.org/strategy-against-the-first-obligatory-eu-internet-filter/

JURI MEPs ignore expert advice and vote for mass internet censorship (20.06.2018)
https://edri.org/press-release-meps-ignore-expert-advice-and-vote-for-mass-internet-censorship/

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

02 Jul 2018

How the EU copyright proposal will hurt the web and Wikipedia

By Guest author

Wikimedia is an integral part of a large movement of civil society stakeholders, technologists, creators, and human rights defenders, who all recognize the importance of a free and open web for culture, progress, and democracy. Our movement is working to promote freedom online for the benefit of all. Our efforts in this public policy realm are all the more important in an era of increasing restrictions on free speech and free access to knowledge across the globe, which directly threaten the mission and vision of Wikimedia and its projects, such as Wikipedia.

This is why we strongly oppose the proposed EU Copyright Directive and urge the Members of the European Parliament to reconsider proceeding with the version recently adopted by the Legal Affairs Committee. We are concerned because this flawed proposal hurts everyone’s right to freedom of expression and Europe’s ability to improve the welfare of its citizens online.

Next week, we expect the European Parliament to vote in plenary on whether to proceed with the version adopted by the Committee. If the Members of the European Parliament reject it, there will be another opportunity to fix much of the current proposal’s broken requirements. Now may be the last opportunity to improve the directive.

We oppose this EU copyright package because of its detrimental effects on internet freedom, access to knowledge, and collaboration online. We believe that:

  • The requirement for platforms to implement upload filters is a serious threat for freedom of expression and privacy. Our foundational vision depends on the free exchange of knowledge across the entirety of the web, and beyond the boundaries of the Wikimedia projects.
  • A new exclusive right allowing press publishers to restrict the use of news snippets will make it more difficult to access and share information about current events in the world, making it harder for Wikipedia contributors to find citations for articles online.
  • The proposal does not support user rights, is missing strong safeguards for the public domain, and does not create exceptions that would truly empower people to participate in research and culture.

We believe that enactment of this copyright package will significantly decrease the amount of content that is freely accessible to all across the globe. The costs involved in preemptively filtering content that may violate broad conceptions of copyright are also likely to lead to a concentration of content determinations in a small number of large platforms.

The Wikimedia movement has opposed the flawed proposal for copyright reform since the first disappointing draft was presented almost two years ago, after initially encouraging preparatory steps. Early on, we warned about the implications of imposing upload filters and called for urgently needed safeguards for the public domain. More recently, we wrote about the negative effects of mandatory automatic content detection on collaboration and freedom of expression.

Wikimedia communities around the world have been actively opposing the EU copyright package. In May, German and Bulgarian Wikipedia communities ran banners opposing Article 13 of the copyright proposal. Several European Wikimedia user organizations, including Wikimedia chapters in France, Estonia, Germany, Denmark, and Spain, took part in a Day of Action against the copyright package on June 12, writing blog posts and tweeting under the hashtag #SaveYourInternet. The German chapter has even organized an event in the streets of Berlin and more language communities are currently debating how to engage.

The proposal is a serious threat to our mission as it fails to truly modernize copyright and ensure the law keeps up with reality. Wikimedia can only achieve its mission when everyone is able to share information freely and contribute to the collection of knowledge on Wikipedia and its sister projects, and when we support such freedom and openness. The copyright proposal conflicts with that mission.

Although Wikimedia operates non-commercial websites, which may benefit from exemptions in certain parts of the EU copyright package, our overarching mission is heavily dependent on a free and open internet ecosystem.

We strongly urge the European Parliament to reject this bad copyright package in the vote next week and reconsider the opportunity to create a balanced and forward-looking law. It’s not too late: you can help stop this bad proposal. Take action by contacting your Member of European Parliament at changecopyright.org, and spread the word in your circles and on social media!

Eileen B. Hershenov, General Counsel
Wikimedia Foundation

(Contribution by Eileen B. Hershenov, Wikimedia Foundation)

28 Jun 2018

Action plan against the first obligatory EU internet filter

By Andreea Belu

Many feared (and some hoped) that the European Parliament’s JURI Committee vote on the 20th of June would be the end of our campaign, as well as the end of the open internet. Not so fast, the censorship machine is not a done deal!

In the next months, our main focus will be on ensuring that the position of the whole EU Parliament is against censorship machines. For this to happen, we will address all 751 MEPs around key moments. Below you will find an easy, yet detailed explanation of the opportunities we will have to challenge the censorship machine as the law makes its way through the EU decision-making process.

Step 1 – Before the summer break: Don’t allow JURI to negotiate in secret with the censorship-supporting EU Council

On 20 June, not only did JURI approve the Censorship Machine, but it also gave itself permission to represent the whole of the EU Parliament and fast-forward straight to negotiations with the EU Council. This permission to negotiate is called a “negotiation mandate” and “negotiations” mean in this case secretive discussions to produce a “compromise” between the two different versions of the censorship machine: the one from JURI and the equally bad one already agreed by EU Member States in the EU Council. What could possibly go wrong?

Among the 751 MEPs, forces are mobilizing to challenge JURI’s self-granted permission to hold closed-door negotiations with the Council on behalf of the Parliament. As a result, it is very likely that on the 5th of July, the 751 Parliamentarians will vote on whether JURI will keep its negotiation mandate.

ACTION 1: We need to ask the 751 Parliamentarians to support having a proper democratic debate and to oppose the start of opaque negotiations! All EU countries have representatives among the 751 MEPs, which means your representatives are there too. They need to be asked to attend the meeting and vote in support of your freedoms. In order to succeed, we need at least half of the votes to oppose the JURI negotiation mandate.

Step 2 – September: Ask MEPs to #ThinkAgain and reconsider the text of the Censorship Machine proposal

If we succeed with STEP 1, the report JURI voted on the 20th of June will be re-opened for public discussions. MEPs can introduce new amendments (new edits) to the original Copyright Directive proposals, including amendments asking to delete Article 13 (the “Censorship Machine”).

ACTION 2: Contact your representatives and ensure that there are amendments calling for the deletion of Article 13 and that such amendments are supported. If good amendments get adopted, then a better text can be negotiated in the next stage of the process.

Step 3: End of 2018/early 2019: Encourage MEPs to vote against the censorship machine during the final vote

Either after an unsuccessful step 1 (if the negotiation mandate is approved, using the bad text adopted by JURI) or after step 2 (on the basis of a better text, after new amendments are voted on by the full Parliament), JURI will start negotiations with the Council. After the JURI vote, we published an overall predicted timeline of the file’s next steps. Once negotiations on the Copyright Directive proposal end, all MEPs will vote on the final text. The vote will happen during another Plenary session, this time the final one.

ACTION 3: We need to encourage all MEPs to support a good, legally sound, balanced Copyright Directive that defends the rights and freedoms of everybody – and that means a Directive without a censorship machine – voting against the entire Directive if necessary, although we really hope it will not come to that. By that stage, it will only be a few weeks until the 2019 European Parliament election. This context will give you more power than usual to make your voice heard.

 

We have seen over 150,000 emails sent to JURI, together with over 15,000 tweets and 500 calls addressing our representatives. Protests are growing across Europe, and the censorship machine gains prominence day by day. In Brussels, we can see the difference that the engagement of ordinary citizens is making in the Parliament’s discussions. It is true that if JURI had voted against the censorship machine, the fight would have become significantly easier. But JURI is only one committee of the whole EU Parliament. Our chances are still high! Moreover, we are used to challenges. We faced challenges in our fight against ACTA and we won. We faced obstacles when we fought for net neutrality and we won.

We know that a smooth sea never made a skilled sailor.
Will you join us onboard?

 

Read more:

Save Your Internet
https://saveyourinternet.eu

Moving Parliament’s copyright discussions into the public domain (27.06.2018)
https://edri.org/moving-parliaments-copyright-discussions-into-the-public-domain-2-0/

We can still win: Next steps for the Copyright Directive (20.06.2018)
https://edri.org/next-steps-copyright-directive-article-13/

MEPs ignore expert advice and vote for mass internet censorship (20.06.2018)
https://edri.org/press-release-meps-ignore-expert-advice-and-vote-for-mass-internet-censorship/

Member states agree on monitoring and filtering of internet uploads (20.05.2018)
https://edri.org/eu-member-states-agree-on-monitoring-filtering-of-internet-uploads/

Trialogues: The system that undermines transparency and democracy (20.04.2016)
https://edri.org/trilogues-the-system-that-undermines-eu-democracy-and-transparency/

28 Jun 2018

Re-Deconstructing upload filters proposal in the copyright Directive

By Diego Naranjo

This week we have published a new analysis of the proposal for upload filters in the Copyright Directive proposal. The paper is a new paragraph-by-paragraph analysis of relevant parts in the text adopted by the Legal Affairs Committee of the European Parliament (JURI Committee). The work complements our first analysis of the initial proposal by the Commission.

In our new analysis, you will be able to read the relevant texts adopted by JURI on the left, and our explanation of what these paragraphs mean in practice on the right. Our new detailed analysis is necessary because of recent misunderstandings about what the text means and what it does not mean. This text could become the basis of the negotiations between the EU Parliament and the Council. In the Plenary vote of the 5th of July, MEPs will decide whether the Copyright Directive is built on closed-door negotiations or on an open, democratic debate.

Your action is crucial to keep an open internet which is free of upload filters.
Act now: Ask your MEPs to attend and vote in your interest on the 5th of July! #SaveYourInternet!

Read more:

Moving Parliament’s copyright discussions into the public domain (27.06.2018)
https://edri.org/moving-parliaments-copyright-discussions-into-the-public-domain-2-0/

We can still win: Next steps for the Copyright Directive (20.06.2018)
https://edri.org/next-steps-copyright-directive-article-13/

Press Release: MEPs ignore expert advice and vote for mass internet censorship (20.06.2018)
https://edri.org/press-release-meps-ignore-expert-advice-and-vote-for-mass-internet-censorship/

EU Censorship Machine: Legislation as propaganda? (11.06.2018)
https://edri.org/eu-censorship-machine-legislation-as-propaganda/

Copyright Directive: Busting the myths (13.12.2017)
https://edri.org/censorship-machine-busting-myths/

(Contribution by Diego Naranjo, Senior Policy Advisor)

27 Jun 2018

Moving Parliament’s copyright discussions into the public domain

By Joe McNamee

With just eleven months to go before the 2019 European elections, European citizens’ reactions to certain aspects of the Copyright Directive mean that there is more interest than ever in what decisions are being made by the European Parliament, as well as how these decisions are made. This is great news for pro-Europeans and a great opportunity for the Parliament to demonstrate its democratic credentials… or great news for Eurosceptics if the Parliament fails to deliver. It is clear to all stakeholders involved that, in order for the legitimacy of the copyright file to be maintained, JURI’s negotiation mandate needs to be rejected next week.


On the 5th of July there will be a vote on whether to start secretive, undemocratic closed-door “trilogue” meetings with the EU Council, or to have a public discussion of the full Parliament in September.

An open debate and an opportunity for all MEPs to have their say on this clearly very important topic would greatly benefit the democratic process. For this reason, civil society has urged MEPs to vote for a public debate on the Directive and, therefore, against the negotiating mandate.

However, representatives of the copyright lobby, as well as certain Parliamentarians claim that citizens are misinformed. In order to clarify the issues, we have prepared a detailed line by line analysis of the adopted text in Article 13 in the Legal Affairs Committee (JURI).

A public discussion would help clarify some of the misunderstandings that have been circulating:

1. “This is only about Google and Facebook”

The definition describing the companies that are covered is very unclear. Indeed, Axel Voss MEP said on German TV that he was not even sure if Google and Facebook are covered and that the scope of the Directive will be subject to interpretation by the Court of Justice of the European Union (Zapp, NDR TV, 13 June 2018).

2. “This is only about videos and music”

Article 13 covers all kinds of content that can be uploaded – text, images, music, audiovisual content and even choreography.

3. “The JURI text does not include any mention of upload filters”

The text refers to:
– “measures leading to the non-availability of copyright or related-right infringing works or other subject-matter” (Article 13.1) – which means upload filters
– “based on the relevant information provided by rightholders” (Article 13.1a) – which means the lists of files to be filtered out
– “such as implementing effective technologies” (recital 38) – which means upload filters.

4. “The proposal says that the Charter of Fundamental Rights must be respected in the agreements between rightsholders and service providers”

The Charter of Fundamental Rights is binding on Member States and the European Commission. It is not binding on agreements between private companies.

5. “No personal data will be processed by the filters”

The proposal says that there must be a complaints mechanism in place. How can users complain about their work being filtered when it will be impossible to match the complainant with the material that has been filtered?

6. “Memes are not covered”

The EU Copyright Exception for parody has been implemented differently across the EU and not implemented at all in some Member States. Therefore, unquestionably, memes are covered by the proposal and would be filtered by very imperfect algorithms, if the proposal is adopted in its current form.

7. “Agreements have to be “appropriate and proportionate””

Yes, this is true. But for whom do they need to be appropriate and proportionate? Logically, they need to be appropriate and proportionate for the parties to the agreement – and users are not parties to the agreements!

8. “There is an obligatory complaints mechanism”

Article 13 makes it clear that internet companies are free to impose their own terms and conditions. So, internet companies would have a choice – admit that content was being filtered on the basis of the law and implement a complicated and expensive complaints mechanism – or filter on the basis of their terms and conditions and avoid the expense of implementing a complaints mechanism. They won’t implement a meaningful complaints mechanism!

9. “There is no general monitoring obligation”

A general obligation to monitor all uploads searching for millions of text, audio, audiovisual and image files is a general monitoring obligation.

Read more:

Re-Deconstructing upload filters proposal in the copyright Directive (28.06.2018)
https://edri.org/redeconstructing-article13/

We can still win: Next steps for the Copyright Directive (20.06.2018)
https://edri.org/next-steps-copyright-directive-article-13/

Press Release: MEPs ignore expert advice and vote for mass internet censorship (20.06.2018)
https://edri.org/press-release-meps-ignore-expert-advice-and-vote-for-mass-internet-censorship/

EU Censorship Machine: Legislation as propaganda? (11.06.2018)
https://edri.org/eu-censorship-machine-legislation-as-propaganda/

Copyright Directive: Busting the myths (13.12.2017)
https://edri.org/censorship-machine-busting-myths/

(Contribution by Joe McNamee, EDRi Executive Director)

27 Jun 2018

NCC publishes a report on tech companies’ use of “dark patterns”

By Maria Roson

Today, the Norwegian Consumer Council (NCC), a consumer group active in the field of digital rights, published a report on how default settings and “dark patterns” are used by tech companies such as Facebook, Google and Microsoft to nudge users towards privacy-intrusive options.


The term “dark patterns” refers to practices used to deliberately mislead users through exploitative nudging. The NCC describes them as “features of interface design crafted to trick users into doing things that they might not want to do, but which benefit the business in question, or in short, nudges that may be against the user’s own interest”.

The General Data Protection Regulation (GDPR) requires services to be developed according to the principles of data protection by design and data protection by default, and obliges companies to make lawful use of their users’ data. With the entry into application of the GDPR in May 2018, the three companies had to update the conditions of use of their services, which they did by deploying a wide variety of “dark patterns”. The report focuses on five of these patterns, which overlap with each other and together form the big picture of how companies mislead users into “choosing” invasive rather than data protection-friendly options. This is done by putting in place the following mechanisms:

1. Default settings

Facebook and Google hide and obscure the privacy settings, making the most intrusive options much easier and more visible for the user to accept.

2. Leading users by the hand to mislead them

Usually, the services push users to accept unnecessary data collection through a combination of positioning and visual cues. Facebook and Google go a step further by requiring many more steps to limit data collection, in order to discourage users from protecting themselves.

3. Invasive options go first

All three companies presented the settings that maximise data collection as the positive option, creating doubt for the user and even ethical dilemmas. The companies do not explain the full consequences of users’ choices, but frame their messages around the theoretical benefits of allowing wider data collection, such as an improved user experience.

4. Rewards and punishments

A typical nudging strategy is to use incentives to reward the “right” choice and punish choices that the service provider deems undesirable. The reward is often described as “extra functionality” or a “better service” (without making clear what this means in practice), while the punishment might be the loss of functionality or even the deletion of the account if the user declines; this has been the strategy of Facebook and Google.

5. Time pressure

When it came to completing the settings review, all three services pressured the user to complete it at a time determined by the service provider. This was done without a clear option for the user to postpone the review, and without making clear whether the user could continue using the service in the meantime.

The report concludes that these service providers merely give users the “illusion of control” while nudging them towards the options most desirable for the companies.

Read more:

DECEIVED BY DESIGN: How tech companies use dark patterns to discourage us from exercising our rights to privacy (27.06.2018)
https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf

GDPR: noyb.eu filed four complaints over “forced consent” against Google, Instagram, WhatsApp and Facebook (25.05.2018)
https://noyb.eu/wp-content/uploads/2018/05/pa_forcedconsent_en.pdf

GDPR explained
https://gdprexplained.eu/

(Contribution by Maria Roson, EDRi intern)

27 Jun 2018

Key modifications in the Whistleblowers Directive proposal

By Xnet

The fact that the European Commission has drafted a proposal for a Directive on the protection of whistleblowers is welcome news. It is the result of the prolonged efforts of many activist organisations and several EU policy-makers, particularly in the European Parliament. Nevertheless, some changes need to be made to secure the objectives of the draft Directive and the right of access to information. The European Commission has opened a consultation on the Directive, which allows feedback to be provided until 13 July 2018. To help you participate in the consultation, X-net created a model law for the full protection of whistleblowers, and here X-net presents its views on the key changes needed in the Directive proposal.


1. Broadening the definition of a whistle-blower (and the importance of “public interest” justifications)

The first concern is found in the draft’s definition of a whistle-blower. While the definition is very broad, it is nevertheless restricted to persons reporting illegal activities that are in some way connected to their work environment. The vast majority of whistle-blowing cases do involve employees. However, there are also numerous examples where the wrongdoing is detected by persons who have no working relationship with the body or persons committing the wrongdoing in question.

What is more, in X-net’s extensive experience working with whistle-blowers, at least 15% of the incidents do not involve any employment relationship. The whistle-blower may be someone who is personally affected by a crime, or a researcher, journalist or activist who uncovers evidence, as was the case with Ramsay Orta or the Flexispy whistle-blowers. In other cases, the whistle-blower may be in a personal relationship with those involved in the plot (e.g. the Pujol case in Spain).

It is X-net’s belief that it is absolutely necessary to ensure that all citizens are afforded the protection they deserve when reporting wrongdoing. This is particularly important when there is inadequate protection of journalists and other persons who ensure that information in the public interest reaches the public (see point 4 on “intermediaries and facilitators”).

If it is true that “persons who report information about threats of harm to the public interest (…) make use of their right to freedom of expression… [which] encompasses media freedom and pluralism” (par. 21), then every citizen is entitled to equal whistleblower protections. Union citizenship offers substantive equal treatment rights, including the constitutionally protected liberty “to participate in the democratic life of the Union” (TEU, Title II, Article 10). If it exempts non-workplace whistleblowers from special protections, the Directive would fall short of respecting the rights and freedoms guaranteed by the EU treaties (Article 11 of the Charter of Fundamental Rights of the European Union and Article 10 of the European Convention on Human Rights).

Another position that X-net considers inappropriate is the attempt to link the effectiveness of the evidence obtained in reporting illicit acts to issues of morality. We believe that the aim of this Directive must be to facilitate the discovery of grave injustices, and that for the purpose of this objective, it is irrelevant whether the person who uncovers them does so with good or bad intentions, as long as their reports correspond to the facts. For this reason, we believe that requiring protection for the whistle-blower to be offered, “provided that the respondent acted for the purpose of protecting the general public interest” hinders and runs counter to the Directive’s objective.

Finally, and more generally, Article 14 (g) of the proposed Directive refers to “coercion, intimidation, harassment or ostracism at the workplace” when, in practice, such reprisals are not confined to the workplace environment. They can be exacted on workers and non-workers alike and, more often than not, occur outside this environment – in the private realm of the whistleblower. Thus, X-net strongly suggests that the “workplace” restriction be eliminated.

X-net understands the intention of the European Commission to limit the scope of the draft Directive so as not to encroach on Member State competencies or areas of law covered by existing legislation. However, X-net suggests that a new provision state explicitly that the Directive covers wrongdoing affecting the public interest; otherwise, a considerable number of potential whistle-blowers are left unprotected.

2. Ensuring anonymity of the source

The confidentiality provisions in the draft Directive are insufficient. The ability to lodge a formal complaint anonymously must be ensured, as the European Parliament recommended in its Resolution of 24 October 2017 on legitimate measures to protect whistle-blowers, arguing that “…the option to report anonymously could encourage whistle-blowers to share information which they would not share otherwise; (…) stresses that the identity of the whistle-blower and any information allowing his or her identification should not be revealed without his or her consent; considers that any breach of anonymity should be subject to sanctions” (paragraph 49).

As X-net states in its model law, there is “a situation of asymmetry of forces between the public and institutions or corporations, making it impossible in practice for people to fulfil their duty as citizens to report any wrongdoing of which they may be aware, as well as to report improper behaviour, irregularities or illegal activities.”

The use of technological tools allows us to be more efficient in protecting the confidentiality and anonymity of those who provide relevant information. This makes it possible for us to correct this asymmetry. We must preserve the anonymity of private persons because they are vulnerable when they expose themselves to serve the common good.

The difference between anonymity and confidentiality resides in the fact that anonymity is the only way a source of information can wholly manage her or his own protection and the use that is made of the information. The weaknesses and porosity of reporting systems based solely on confidentiality have been amply demonstrated. Besides, there are additional and evident dangers in centralising all the power (information) in just a few hands, namely those of company directors and senior office holders in the public administration, leading to serious, massive abuses, as has already happened at other times in history.

3. Freedom to determine the most appropriate channel for disclosure

The third problem encountered is that the proposed Directive does not encourage the whistleblower to choose the most appropriate reporting channel. If left unamended, this will undermine much of the Directive’s usefulness.

In cases where whistle-blowers have used the internal channels of the entity whose abuses they wished to report, X-net has observed that this usually resulted in destruction of evidence and personal suffering.

The extensive obligation included in the draft Directive requiring that complaints be lodged internally first, forcing the whistle-blower to prove that she or he had good reasons for not doing so, would prevent many of the worthy objectives of this Directive from being realised. These “good reasons” are not defined, which would lead in some cases to arbitrary decisions by the state or the courts, discouraging action. In fact, in the vast majority of cases, the whistle-blower would not be protected under such circumstances (see the cases of Snowden or LuxLeaks, among countless others).

It is entirely legitimate to discourage the infliction of needless harm to an entity’s reputation. However, the use of internal complaint mechanisms is not necessarily appropriate, and whistle-blowers need to be able to choose the most effective course of action. In the cases of Snowden or LuxLeaks, for example, such a mechanism would not have led to any effective reforms.

Any obligation to first make use of internal channels should be both circumscribed and linked to evidence of their demonstrated effectiveness. Along these lines, X-net suggests the inclusion of provisions that would help guarantee the effectiveness of internal channels (e.g. an independent reviewer, a mechanism allowing anonymity). This would encourage entities to establish more effective internal mechanisms.

4. The protection of intermediaries and facilitators must also be assured

In X-net’s model law on the protection of whistle-blowers, the facilitator is defined as “a person or legal entity that contributes, facilitates or aids the whistle-blower in revealing or making public information constituting reason to blow the whistle/disclose wrongdoing”.

In the vast majority of cases, citizen platforms, NGOs, journalists and trade unionists are indispensable in helping the whistleblower, and they also suffer serious retaliation. The LuxLeaks case, in which a journalist was prosecuted alongside the whistle-blowers, is just one example.

While the role of intermediaries and facilitators is valued in the introduction to the Directive, this should be reflected in explicit protections, in the text of the Directive itself, for the entities taking on such roles. It is essential that they receive the same protection consistently throughout the provisions of the Directive.

Specifically, and by way of example, Article 15.7 of the draft Directive covers only the “worker” and not the person who publishes the information. Moreover, the definitions of “report” and “reporting person” (Article 3, “Definitions”) should include whoever facilitates or publishes the information, if we really wish to protect the freedom of the press and information.

5. Addressing the misuse of data protection (and other rights and freedoms)

One of the purposes of protecting whistle-blowers is to redress the asymmetrical power dynamic between powerful entities and citizens. We have long observed that powerful interests initiate lawsuits for slander or for violation of “intellectual property” rights or trade secrets (the cause of the long battle during the adoption of the 2016 Trade Secrets Directive). A clear provision is needed in the Directive so that these elements cannot be used as an excuse to undermine and inhibit public interest reporting and freedom of information.

In recent years, we have witnessed a surge in the misuse of data protection rights to challenge whistle-blower protections. X-net works to actively promote and protect the fundamental rights to privacy and data protection. It equally promotes the importance of transparency in public institutions and large corporations, and believes that society benefits when the power asymmetry between citizens and powerful entities is reduced.

Data protection cannot and should not be used to dissuade people from reporting illegal activity (this is clear in Articles 85-86 of the GDPR). X-net does not believe that such protections should be applied equally to members of the public and to public servants or heads of companies whose activities can have an impact on the majority of the population. Whistle-blowers are neither saints nor devils. Their personal reasons are their own. The romantic aura surrounding whistle-blowers must be corrected, so that the practice of denouncing abuses becomes the norm in a democratic society, and not a heroic act. This must be the ultimate goal of the Directive.

This is a shorter version of the original Xnet article, which you can read here.

(Contribution by X-net, EDRi member, Spain)

Read more:

The European Parliament calls for protection of whistleblowers (31.10.2017)
https://edri.org/european-parliament-calls-protection-whistleblowers/

The EU must take action to protect whistleblowers (31.05.2017)
https://edri.org/eu-must-take-action-protect-whistleblowers/

Protecting whistleblowers – protecting democracy (31.01.2017)
https://edri.org/protecting-whistleblowers-protecting-democracy/

27 Jun 2018

Restoring freedom of expression in Spain: end the “gag law”

By Maria Roson

Spain has been one of the European Union countries that has most shamefully stood out for its government’s attitude against freedom of expression and information. During the government of former Prime Minister Mariano Rajoy, the Spanish parliament passed the controversial “gag law” - as it was popularly known - which entered into force on 1 July 2015. This law amended the Spanish penal code by, among other things, reinforcing the penalties for “glorification of terrorism” and “humiliation of the victims of terrorism”, introducing limitations on protests, and imposing administrative sanctions against demonstrators.


One of the most obvious consequences of this law for freedom of expression and information online has been the criminal cases brought against many political activists, artists and politicians because of their tweets. In its latest report, “Tweet…if you dare: How counter-terrorism laws restrict Freedom of Expression in Spain”, Amnesty International denounces the law’s lack of legitimate purpose, considering it too broad and too vague, with an evident purpose of targeting those expressing dissident opinions about the Spanish political system.

Among the limitations that this law has imposed on online activity are:

Arbitrarily restricting access to websites that promote or advocate “terrorism”

The text is worded so ambiguously that it condemns not only the dissemination of criminal content but also mere access to it. This implies that accessing these websites is, in itself, a crime, regardless of whether the person simply wanted to be informed or is actually involved in terrorist activity.

“Seriously disturbing the public order”

The law provides no definition of what it considers to be “seriously disturbing the public order”. This ambiguity has led to arbitrary fines against journalists covering public events.

Organizing online protests

The gag law punishes “unauthorised protest”, which can be fined between 30,000 and 600,000 euros if the protest takes place near institutions such as the Spanish parliament - as happened with the protest organised by “7N against gender violence”.

Posting pictures of police officers which imply a “danger for their personal or family security”

The question is, of course, what “danger” means. How exactly will the law measure “danger”? Again, it is not defined. The result is that freedom of expression is curtailed, with fines ranging from 600 to 30,000 euros, and with such extreme consequences as the fining of a woman for posting a picture of a police car parked illegally in a spot reserved for people with disabilities.

Penalising content sharing platforms

This targets platforms such as the sports streaming website “Rojadirecta”. Despite the legitimate intent to limit copyright infringement, the consequence of this measure is legal uncertainty for hundreds of small businesses that have nothing to do with infringements.

Restriction of online protests

The “gag law” punishes with criminal penalties the dissemination of messages on the internet which may be considered “glorification or justification” of terrorism, or “the dissemination of slogans” which may incite others to commit offences. This has undoubtedly been the most controversial part of the law and the most arbitrarily applied: under the pretext of “glorification of terrorism”, an extremely abusive interpretation of this offence has been used. As a consequence, rappers, professional puppeteers and visual artists have been charged or prosecuted by the Spanish courts because of the political content of their lyrics, plays, or even the meaning of their artistic pieces.

The other battlefield has been Twitter where, since 2014, four coordinated police operations - called the “Spider Operations” - have led to a large number of arrests for posting messages and jokes on social media referring, among other topics, to ETA’s terrorist attacks against members of the Franco dictatorship. One of the most famous cases was the conviction of the rapper “Strawberry” for tweeting about ETA’s terrorist attacks. Although most of the people accused were released without charges or were not imprisoned, there are particularly worrying cases, such as the recent convictions of the rappers Pablo Hasél and Valtonyc, the latter currently on the run.

Almost three years after this law was approved, one of the first tasks of the new Spanish government should be to repeal the “gag law”. The idea of fixing the law through amendments, as the Socialist party has suggested, is not enough. As organisations such as the Platform for the Defence of Freedom of Information (Plataforma en Defensa de la Libertad de Información), Amnesty International, Rights International Spain and Spanish EDRi member X-Net have expressed, the only solution is to call for the repeal of the law.

Read more:

Amnesty International Report: “Tweet…if you dare. How counter-terrorism laws restrict Freedom of Expression in Spain” (13.03.2018)
https://www.amnesty.org/download/Documents/EUR4179242018ENGLISH.PDF

UN Rapporteur demands respect for freedom of expression online (14.06.2017)
https://edri.org/un-rapporteur-demands-respect-for-freedom-of-expression-online/

Xnet: Legislation that restricts freedom of expression of action and organization in the Spanish State (available only in Spanish) (01.12.2015)
https://xnet-x.net/leyes-coartan-libertad-expresion-accion-organizacion/

Spanish Citizens’ Security law: There is still some hope (21.06.2015)
https://edri.org/spanish-citizens-security-law-hope-not-lost/

Spanish Citizens’ Security Bill: Many restrictions, few freedoms (28.01.2015)
https://edri.org/spanish-citizens-security-bill-many-restrictions-few-freedoms/

(Contribution by Maria Roson, EDRi intern)
