Freedom of expression

Freedom of expression is one of the key benefits of the digital era. The global information society permits interaction on a scale that was previously unheard of – promoting intercultural exchange and democracy. Consequently, the protection of this freedom is central to much of EDRi’s work.

23 May 2017

EU action needed: German NetzDG draft threatens freedom of expression

By Maryant Fernández Pérez

On 22 May 2017, six civil society and industry associations sent an open letter to eight EU Commissioners asking them to take action against the German bill on “Enforcement on Social Networks”, the “NetzDG”.

This bill asks social media companies to take down content, including perfectly legal material, that companies like Facebook can arbitrarily label as “hate speech”, “fake news” or “pornographic content”, among other categories. In addition, the draft law de facto imposes the filtering of content, even though such technology cannot understand context and will therefore inevitably lead to even more legal content being deleted. The basic aim of the bill is, of course, well-intentioned. However, as drafted, the bill appoints social media companies as arbiters of legality and “the truth”. Furthermore, it breaches EU law, which establishes that all restrictions on fundamental rights, including freedom of expression, must be provided for by law, necessary and proportionate (Article 52 of the Charter of Fundamental Rights of the European Union). EU law also prohibits imposing general monitoring obligations on companies. If adopted, this unprecedented law would serve as a bad example for other states, including countries with serious democratic deficits.
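The context-blindness the article describes is easy to demonstrate. The sketch below (entirely hypothetical, using an invented blocklist phrase about the fictional country Ruritania) shows the kind of naive keyword matching that automated filters rely on: the original offending post and a counter-speech post quoting it in order to rebut it are both flagged, because the filter only sees the words, not the intent.

```python
# Hypothetical sketch, not a real filter: a naive keyword-based
# "hate speech" filter of the kind the article warns about.
BLOCKLIST = {"ruritanians are criminals"}  # invented, illustrative phrase

def naive_filter(post: str) -> bool:
    """Return True if the post would be taken down."""
    text = post.lower()
    return any(phrase in text for phrase in BLOCKLIST)

original = "Ruritanians are criminals and should leave."
counter_speech = 'Saying "Ruritanians are criminals" is wrong - here is why.'

# Both posts match the blocklist, even though the second is
# perfectly legal counter-speech quoting the first to rebut it.
print(naive_filter(original))        # True
print(naive_filter(counter_speech))  # True
```

Real filtering systems are more sophisticated than a substring match, but the underlying problem is the same: without understanding context, quotation, parody and reporting are indistinguishable from the content they discuss.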

In its role as the “Guardian of the Treaties”, the European Commission has the duty to ensure that the draft law is compatible with EU law, including the EU Charter of Fundamental Rights. The letter explains that this “would translate into at least responding to Germany with a detailed opinion expressing incompatibility with EU law under the ongoing notification process”. EU Commissioners took a “solemn oath” not to be influenced by Member States in exercising their duties (“neither to seek nor to take instructions from any Government or from any other institution”).

This letter adds to previous criticism expressed by EDRi and leading experts, such as Professor Wolfgang Schulz. The letter remains open for signatures in the coming days. You can read the letter below or here (PDF).

Dear President Juncker,
dear First Vice-President Timmermans,
dear High Representative Mogherini,
dear Vice-President Ansip,
dear Vice-President Katainen,
dear Commissioner Bieńkowska,
dear Commissioner Jourová,
dear Commissioner Oettinger,

The signatories to this letter represent civil and human rights organisations as well as industry bodies representing the Internet technology sector. We are writing to call on the Commission to ensure compliance of Germany’s draft Network Enforcement Law (as notified on 27 March 2017) with EU law, including the EU Charter of Fundamental Rights.

While no one would object to the aim of curbing illegal hate speech and other unlawful content online, the draft law would unquestionably undermine freedom of expression and information. In practice, the distinction between content that is ‘manifestly’ unlawful and content that is not is very difficult to make. The legality of individual statements must always be assessed in their specific context. This, coupled with very tight time limits (24 hours or 7 days) for takedowns and draconian sanctions, will strongly incentivise online companies to simply take reported content down, thereby chilling freedom of speech online.

Beyond that, the draft law also requires social networks to immediately remove or block any copies of the unlawful content located on the platform. In practice, this obligation would necessitate content filters that search the whole platform and automatically take down content in a fully undifferentiated manner. Automatically identified content that is used in a totally different context, for example in a parody, would be taken down because filters are ‘blind’ to contextual circumstances. We would like to stress that this kind of content filter would be unprecedented in a free democracy; so far, only a handful of countries with serious democratic deficits require similar systems.

With respect to the above, we also see grave conflicts with established EU law. In various cases the European Courts stressed that measures put in place to protect a public interest, including the protection of a fundamental right, must strike an appropriate balance with other fundamental rights. We do not see how a proposal that profoundly undermines freedom of expression would pass that test.

Furthermore, the law’s content filtering requirement runs counter to EU law that protects fundamental rights by prohibiting general monitoring obligations.

In addition, the draft law will also have negative economic implications for the EU. The German draft law is a national measure which will lead to far greater regulatory fragmentation and runs against the Commission’s policy agenda as well as the spirit of a Digital Single Market. That is particularly obvious with respect to the obligation to store removed content within the Federal Republic.

On the basis of our concerns, we call on the Commission to live up to its role of guardian of the Treaties and make sure national rules are compliant with EU law and case law. In concrete terms, this would translate into at least responding to Germany with a detailed opinion expressing incompatibility with EU law under the ongoing notification process.

We would like to thank you for your time and attention.

With kind regards,

James Waterworth, Vice-President, CCIA Europe
Siada El Ramly, Director General, EDiMA
Joe McNamee, Executive Director, EDRi
Fanny Hidvégi, European Policy Manager, Access Now
TJ McIntyre, Chair, Digital Rights Ireland
Jens-Henrik Jeppesen, Director, Center for Democracy and Technology


19 May 2017

Looking back on our 2016 victories


Technological advancements in the digital world create new opportunities but also new challenges for human rights. Especially in the past year, the fear of extremism on one side and extreme measures on the other resulted in a desire for swift political action and made defending citizens’ rights and freedoms online a difficult task. In 2016, our European network faced demands for increased state surveillance, restrictions on freedom of expression by private companies, and decreased protection of personal data and privacy. Our annual report 2016 (pdf) gives you an overview of EDRi’s campaigns across European countries and our key actions at EU level.

Despite our struggles, our members, observers, national and international partners, supported by many individuals who contributed to our work, successfully protected digital rights in a number of areas.

We successfully advocated for a reform of privacy rules in electronic communications (ePrivacy) and played a key role in the civil society efforts that led to the adoption of the EU’s General Data Protection Regulation (GDPR) in April 2016.

We scored a big success in our top priority issue and secured net neutrality in Europe. This victory was the outcome of more than five years of hard work and the input of over half a million citizens who responded to the net neutrality consultation in 2016.

We released influential analysis that contains implementation guidelines for the General Data Protection Regulation. We published two documents highlighting the numerous, unpredictable flexibilities in the legislation and how they should be implemented.

We published the “Digital Defenders”, a comic booklet to help kids make safer and more informed choices about what to share and how to share it online. It turned out to be a huge success – the original English version of the booklet has been downloaded from our website over 25 000 times and published in Serbian, Turkish, German, Greek, Spanish and Italian, with more translations in the pipeline.

While we regret the adoption of an ambiguous Directive, we successfully requested the deletion of many harmful parts that were proposed in the course of the legislative discussions and the clarification of some of the ambiguous language.

Our criticism of the new so-called Privacy Shield was echoed by many experts in the European institutions and bodies (the European Parliament, the European Data Protection Supervisor, and the European Ombudsman) and led to mainly negative press coverage for the Commission and continued pressure for a more credible solution.

Read more in our Annual Report 2016!

Our finances can be found on pages 43-44.


17 May 2017

ENDitorial: Commissioners’ oath – a broken promise on fundamental rights

By Joe McNamee

On 3 May, 2010, the entire European Commission travelled to the Court of Justice of the European Union (CJEU) in Strasbourg to, for the first time in the history of the Union, take an oath that included a solemn declaration to “respect the Treaties and the Charter of Fundamental Rights of the European Union in the fulfilment of all [its] duties”. On that day, the President of the Commission stated that “the oath of independence [from Member States] and respect for the EU Treaties is more than a symbolic act”.


The Charter of Fundamental Rights, in line with all major human rights instruments, requires restrictions on fundamental rights to be necessary and prescribed by law. This is the most basic of obligations – without law there is no democratic process, without law there is no accountability, without law there is no predictability about what is permitted and what is prohibited.

The Charter is binding on Member States (when implementing EU law) and the Commission itself. It is not directly binding on the companies we rely on to host our websites, to host our social media posts, and to provide us with access to the internet. This offers a huge loophole for pushing “voluntary” restrictions that would not be permitted by law.

In October 2013, the Commission proposed its Telecoms Single Market (TSM) Regulation, where it tried and failed to include a loophole that would have allowed internet access providers “manage” traffic on their networks to prevent or impede (undefined) “serious crimes”. Restriction? Yes. Provided for by law? No. Predictable? No.

In June 2016, the Commission pushed internet companies into signing a “code of conduct” on “hate speech”. This code publicly demotes the rule of law to a secondary status behind the terms of service of internet companies. Complaints would have to be reviewed on the basis of the companies’ “rules and community guidelines” and only “where necessary”, national laws. Restriction? Yes. Provided for by law? No. Predictable? No.

In September 2016, the Commission published its proposal for a new Copyright Directive. It includes measures to undermine legal security of internet companies, making it easier to coerce them into “voluntarily” restricting content. The Directive places non-specific obligations on companies to “prevent” the upload of material (videos, text, audio, pictures… whatever) to their servers. Restriction? Yes. Provided for by law? Not clearly. Predictable? No.

Then the Commission launched its Communication on the implementation of the Digital Single Market (DSM) Strategy for Europe. What’s new? The abandonment of any measures to bring about a predictable legal framework in this area and new ambitions for lawless restrictions. These include demands for internet companies to engage in “proactive” searching for “illegal” content to remove, without defining who should assess whether the content is “illegal”, and to propose “technical solutions” for this. As a little joke, the text adds that this would be “in full respect of fundamental rights” – not “the Charter of Fundamental Rights”, but unspecified, lower case “fundamental rights”.

On 10 May, 2017, 24 Members of the European Parliament (MEPs), recognising the Commission’s abandonment of its obligations, signed a letter asking for a predictable legal framework.

The President of the European Commission said in 2010: “The oath of independence and respect for the EU Treaties is more than a symbolic act”. As it turned out, it was not even a symbolic act.

MEPs want notice and action directive: Open letter sent to Commissioner Ansip (10.05.2017)

Killing parody, killing memes, killing the internet? (08.05.2017) 

ENDitorial: Transparency and law-making on EU copyright – mutually exclusive? (05.04.2017)

The copyright reform: A guide for the perplexed (02.11.2016)

FAQ: EU Code of Conduct on illegal hate speech (02.11.2016) 

New documents reveal the truth behind the Hate Speech Code (07.09.2016)

(Contribution by Joe McNamee, EDRi)



17 May 2017

Reclaim the net! Copyright and online freedoms at re:publica17

By Guest author

It is hard to count how many times we have been saying that the current European copyright regime is outdated. Sometimes the focus is on the negatives: what it should not be like. The ongoing copyright reform reinforces that tendency with proposals such as the content filter. However, at re:publica17, an annual gathering of media experts, activists and techies, we talked about the positive vision regarding access to culture and freedom of expression.


Our rights need better law-making

The joint session of European Digital Rights (EDRi) and Centrum Cyfrowe, “Reclaim the net! Copyright and online freedoms”, had two goals. First, we presented how culture could be put back in the hands of the people. We painted a positive image of a world where users’ rights are respected. The right to access quality education encompasses using all sorts of materials, from text and pictures to music and multimedia, to make learning interesting and relevant. The right to create and share content enables individual artistic expression when a new work is created based on somebody else’s creation: be it a cat meme, a song cover, or fan fiction. Text and data mining provides for a better understanding of the world, which is key to a brighter future. Accessing information of any kind should be possible without anyone limiting it.

This vision should be the reality of the digital age. We should be able to use technology to the benefit of an active, educated, creative society. We should be able to make informed choices based on information that is freely accessible. The current copyright reform does not do justice to the opportunities that are technologically possible to achieve. We are served licences instead of exceptions, a censorship machine, and a demonstrably failed proposal to “tax” news aggregators.

Our everyday copysins

During the second part of the workshop, we explained how the current outdated norms for accessing culture (namely copyright laws) lead users to violate copyright, often without knowing they are doing something illegal. They are, in a way, “copysinners”. We discussed how streaming and filesharing, or using somebody else’s picture as a social media avatar, are common practices and sometimes unavoidable. Under the current law, these actions are at best legally doubtful, if not evidently illegal. Yet they happen all the time.

The participants of the workshop agreed that a new, modern copyright adapted to the digital age is needed. There should not be absurd new rights for publishers. Legal use of works such as subtitling should be allowed, and geoblocking forbidden. It should be made clear that the use of user-generated content is legal, and the right to private copy should be guaranteed.

Now, with the copyright reform being discussed in the European Parliament, the time is ripe to check whether policy-making can be based on common sense. These are some things any of us can do:


Copyright guide for the perplexed

Document pool for the copyright directive proposal

EDRi’s position on copyright

COMMUNIA policy recommendations

Reclaim the net! Copyright and online freedoms at re:publica17 (15.05.2017)

(Contribution by Anna Mazgal, Centrum Cyfrowe Foundation, Poland, and Diego Naranjo, EDRi)



17 May 2017

AVMSD: European Parliament set to vote on whether it’s allowed to vote

By Maryant Fernández Pérez

On 18 May 2017, Members of the European Parliament (MEPs) will vote on whether they want to work on the Audiovisual Media Services Directive reform now, or let a handful of MEPs represent over 500 million EU citizens in the so-called “trilogue” negotiations between the European Parliament, the European Commission and the Council of the EU, representing the Member States’ governments.

On 25 April 2017, the majority of MEPs (17 against 9, with 4 abstentions) represented in the European Parliament Committee on Culture and Education (CULT) adopted a report on the AVMSD reform. Thanks to the work of some parliamentarians, the report opposes upload filtering. Despite this, the text does not fully safeguard users’ freedom of expression and opinion online. Therefore, EDRi sent a letter to the MEPs asking them not to give their green light to the report and vote against the trilogue mandate for the CULT Committee.

Why should the European Parliament adopt a position before the trilogue negotiations start?

  • Key definitions, such as “video-sharing platforms” and “user-generated content” are not clear;
  • Video-sharing platforms, including social media companies, are asked to regulate content that is not necessarily illegal, such as incitement to hatred based on a list of criteria, including “political opinions or any other opinions”. This lack of precision and clarity can only lead to arbitrary and unaccountable decisions by companies. In a system that respects the rule of law, restrictions on freedom of expression must be provided for by law, and be necessary and proportionate;
  • Companies are asked to have a “self-regulatory” role in the moral development of children, without any assessment of how appropriate this is, and what implications it can have;
  • The Parliament needs a strong mandate to counter some of the worrying positions of the Council;
  • The Parliament should take the time to reflect on the text, adopt amendments to defend the fundamental rights of the citizens it represents, and bring legal certainty and clarity. This would contribute to making the process more democratic;
  • Once the trilogue negotiations are concluded, the other MEPs will not have real power to change the text. While amendments to the political agreement reached in trilogues would be possible, political majorities are very difficult to reach because it would reopen the negotiations.


Letter to MEPs asking to reject trilogue mandate and adopt a first reading position (17.05.2017)

Audiovisual Media Services Directive reform: Document pool (15.05.2017)

AVMSD reform: Document pool infographic (15.05.2017)

CULT report on the AVMSD reform (10.05.2017)

AVMS Directive: It isn’t censorship if the content is mostly legal, right? (27.04.2017)

Trilogues: the system that undermines EU democracy and transparency (20.04.2016)

(Contribution by Maryant Fernández Pérez, EDRi)



17 May 2017

ALTwitter – profiling with metadata


When we are sharing links, events or ideas through social media, we leave behind a trace of metadata: when and how often, which days of the week, in which language, using which hashtag, linking to which users or websites, and so on. Those details might not say much when we look at each piece of information separately, but when combined, they can show some interesting things – and no-one can avoid being profiled on the basis of these metadata, which are publicly available to anyone who bothers looking for them.

EDRi’s Ford-Mozilla Open Web Fellow Sid Rao created a platform called ALTwitter, which combines the metadata collected from the public Twitter accounts of the Members of the European Parliament (MEPs) and presents it graphically. Without going through all their tweets, one can learn a lot about their work areas, the devices they use, the types of websites they refer to, when they are most active, and so on. What can be learned about a person from this metadata alone is far more than one would expect!
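The kind of aggregation ALTwitter performs can be sketched in a few lines. The example below is hypothetical (the sample tweets and field names are invented for illustration, loosely modelled on Twitter’s public metadata): each tweet contributes only its timestamp, client application and hashtags, yet counting these already reveals a person’s active hours, preferred device and main topics.

```python
from collections import Counter
from datetime import datetime

# Invented sample data: only metadata, no tweet text at all.
tweets = [
    {"created_at": "2017-05-10T08:15:00", "source": "Twitter for iPhone",
     "hashtags": ["ePrivacy"]},
    {"created_at": "2017-05-10T09:40:00", "source": "Twitter for iPhone",
     "hashtags": ["NetNeutrality", "ePrivacy"]},
    {"created_at": "2017-05-11T22:05:00", "source": "Twitter Web Client",
     "hashtags": []},
]

def profile(tweets):
    """Reduce a list of tweet metadata records to a behavioural profile."""
    hours = Counter(datetime.fromisoformat(t["created_at"]).hour for t in tweets)
    devices = Counter(t["source"] for t in tweets)
    topics = Counter(h for t in tweets for h in t["hashtags"])
    return {"active_hours": hours, "devices": devices, "topics": topics}

p = profile(tweets)
print(p["topics"].most_common(1))   # dominant topic: ePrivacy
print(p["devices"].most_common(1))  # dominant device: Twitter for iPhone
```

Scaled up from three tweets to thousands, these simple counters are exactly what turns “harmless” public metadata into a detailed behavioural picture.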

Why is this important? The aim of the Hakuna Metadata project is to show that the metadata that we generate by our daily activities online contains an immense amount of information. A big part of metadata is collected for marketing purposes, to target advertising to us according to our habits and preferences, to influence our decisions on what to buy. Our activity patterns and information on the devices we use can be used for making assumptions about, for example, how wealthy we are.

Most companies are collecting more information about us than they actually need for offering us their services. They can then sell this information to marketing companies, data brokers, political campaigns and others, who will be able to connect the metadata collected from different sources. Based on that, they will draw conclusions about us. This could affect our own behaviour, but also what services we are offered and at what price, which can also lead to discrimination.

What about those politicians who have been profiled with the help of Sid’s ALTwitter metadata platform? Who is tweeting way too much? Who has strange tweeting habits? What are the hot topics? Are MEPs tweeting from their own devices? Check out their profiles here and find out how much detailed and sometimes surprising information can be drawn from the publicly available metadata we give out, often without intending to do so.


ALTwitter #hakunametadata: Twitter metadata profiles of the Members of European Parliament

Hakuna Metadata – Let’s have some fun with Sid’s browsing history! (03.05.2017)

Hakuna Metadata – Exploring the browsing history (22.03.2017)



17 May 2017

UK Digital Economy Act: Millions of websites could be blocked

By Guest author

The Digital Economy Act has become law in the United Kingdom. This wide-ranging law has several areas of concern for digital rights, and could seriously affect privacy and freedom of expression of internet users.


One of the main concerns is that it will compel legal pornographic websites to verify the age of their users. The British Board of Film Classification (BBFC) has been given the power to fine or instruct ISPs to block websites that fail to provide age verification, which could mean that thousands of websites containing legal content could be censored.

On 10 May 2017, EDRi member Open Rights Group (ORG) received a response to their Freedom of Information (FOI) request on the correspondence between BBFC and MindGeek, the company developing age verification technology. The response revealed that for the Digital Economy Bill to be effective in preventing children from accessing pornography, the government would need to block over four million websites.

The law will also extend the maximum prison sentence for online copyright infringement to ten years. ORG has raised concerns that the wording of this offence is too broad and could in theory be used against file sharers. It could also be exploited by “copyright trolls”, that is, law firms that send letters threatening users suspected of unauthorised downloading of copyrighted works with legal proceedings – even though there may be no evidence to support the accusation.

The Digital Economy Act also gives the police the power to disable mobile phones that they believe might be used for crimes. ORG has criticised this power, as it allows the police to act before any crime has been committed.

Finally, the Act includes new powers for sharing data across government departments. Even if the definitions of these new powers were improved during the parliamentary process, they are still too broad, and leave room for practices that dramatically threaten citizens’ fundamental rights to privacy.

The UK Digital Economy Bill: Threat to free speech and privacy

FOI response reveals porn company’s proposals for UK to block millions of porn sites

Digital Economy Act: UK police could soon disable phones, even if users don’t commit a crime

(Contribution by Pam Cowburn, EDRi member Open Rights Group, the United Kingdom)



17 May 2017

BBA Germany 2017: Espionage, threats, tracking, provoking cyber wars

By Guest author

The annual German Big Brother Awards were bestowed by EDRi member Digitalcourage on 5 May 2017 in Bielefeld, Germany. The event drew much media attention, as one of the awardees threatened the organiser with legal action.

The awardees are informed of their awards a few days in advance and invited to respond or appear at the gala. In response to their notification, the awardee in the “Politics” category threatened to sue Digitalcourage for libel. When a local paper heard of this threat and managed to obtain off-the-record confirmation of the awardee’s identity, it decided to break the press embargo requested by Digitalcourage and reported on the issue. This caused other media outlets to follow suit. Digitalcourage issued a statement that they would not be perturbed by such intimidation. At the time of writing this article, they have not heard of any actual legal action.

The awardee in the “Politics” category was the Turkish-Islamic Union for Religious Affairs (DİTİB). Imams at DİTİB – with ties to the Turkish government and its secret service MİT – are said to have conducted political espionage on DİTİB members and visitors, exposing them to persecution by the Turkish state. According to a December 2016 newspaper report, a call had been issued by the Presidency of Religious Affairs (Diyanet) to Turkish embassies and consulates to compile dossiers on people suspected of being followers of the influential Islamic cleric Fethullah Gülen. Charges were filed immediately, but the investigation did not start until Chancellor Angela Merkel had returned from a state visit to Turkey in February 2017. By this time, ten of the sixteen imams had been recalled to Turkey under directions from Diyanet, which employs the imams and sends them to Germany. The award speech criticised DİTİB for trying to treat the issue as merely an internal affair and avoid public scrutiny, and German society for failing to ensure the free exercise of the Islamic faith.


The “Workplace” award went to the company Planung für Logistik & Transport GmbH (PLT, “Planning for Logistics and Transport”) for promoting a “staff tracker”. This device provides employers with real-time information on the movements of their employees. This is exacerbated by the fact that PLT’s promotional material misrepresents the legal situation, claiming that Germany’s minimum wage law requires such extreme tracking, precise to the minute and the metre. There is even an apparent suggestion to deploy the tracker covertly, which would make its use clearly illegal. The examples offered on the company’s website do not provide evidence that customers are using the devices under company-internal agreements, and the jury doubted the substance of these claims.

In the “Economy” category, the German IT business association Bitkom was called out for its uncritical promotion of big data and for obtrusively lobbying against data protection. Its lobbying has recently targeted the adaptation of German data protection law following the enactment of the EU’s General Data Protection Regulation (GDPR). Bitkom has taken to calling basic principles of data protection “outdated” and “over-regulated”, trying to introduce new and vaguer notions such as “data wealth” and “data sovereignty” as alternatives. Worryingly, its language has been closely echoed in recent speeches by the three relevant ministers and also by Chancellor Merkel. The objective of Bitkom’s lobbying is obviously to reduce state regulation, replace it with “voluntary self-regulation” and give companies free rein to pursue big data business models. The award called on the government to stop fulfilling the IT industry’s every wish, and on companies to capitalise on German firms’ expertise in data protection and privacy.

The Technical University of Munich (TUM) and Ludwig-Maximilians-Universität München (LMU Munich) received the “Education” award for cooperating with the online learning provider Coursera. The speech presenting the award called the data about students’ learning results a “treasure trove”, pointing out that Coursera as a company reserves the right to exploit this data commercially. Universities were warned not to make participation in online courses a condition for earning mandatory credits. If an appropriate European platform does not exist, it should be the universities’ duty to create such a platform.

In the “Authorities and Administration” category, the award went to the German Federal Military (“Bundeswehr”) and the Federal Minister of Defence, Ursula von der Leyen. They have recently decided to join the global “cyber” arms race and set up a “Commando Cyberspace and Information Space”. The award speech pointed out that these activities lack parliamentary control, democratic oversight and even a legal basis. Empowering units to mount their own cyber attacks on other countries and their infrastructure could create new risks of military action that might affect civilians and open pathways for a quick escalation of cyberspace activities into a regular war.

The company Prudsys received the award in the “Consumer Protection” category for offering software that facilitates price discrimination. This software sets a price according to what it can find out about the individual customer, not according to a product’s value. As a consequence, two different people may have to pay different prices for the same product. Prudsys is promoting its technology to online shops, but also to physical retail stores through the use of electronic price tags. The award criticised this as “a world driven by greed” where there are “no humans, no individual products, no satisfaction, no service […] – only numbers” and customers lose out due to a grave imbalance of knowledge between them and the traders.
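The mechanics of such individualised pricing are straightforward. The sketch below is purely illustrative (the attributes and weightings are invented, not taken from any real product): the quoted price is derived from what the shop infers about the customer, not from the product itself, so two people see different prices for identical goods.

```python
# Hypothetical sketch of individualised pricing; all signals and
# multipliers are invented for illustration.
BASE_PRICE = 100.0  # what the product would cost without profiling

def personalised_price(customer: dict) -> float:
    """Quote a price based on inferred customer attributes."""
    price = BASE_PRICE
    if customer.get("device") == "high-end phone":
        price *= 1.10   # inferred higher willingness to pay
    if customer.get("compared_prices_elsewhere"):
        price *= 0.95   # discount to stop the customer switching shops
    return round(price, 2)

alice = {"device": "high-end phone", "compared_prices_elsewhere": False}
bob = {"device": "old laptop", "compared_prices_elsewhere": True}

# Same product, two different prices.
print(personalised_price(alice))  # 110.0
print(personalised_price(bob))    # 95.0
```

The knowledge imbalance the award criticises is visible here: the shop knows which signals move the price, while neither customer ever learns why their quote differs from anyone else’s.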

The audience at the gala were asked to name the one award they found particularly impressive, astonishing, shocking or revolting. This year the clear winner, with about a third of the votes, was the “Authorities and Administration” award to the German military and the Minister of Defence.

All the award speeches have been translated into English, and the gala was also recorded and simultaneously interpreted into English. The original recording has already been published, with the English version to follow.

----------------------------------------------------------------- Support our work with a one-off-donation! -----------------------------------------------------------------

The German Big Brother Awards 2017, English translations

Videos of the German Big Brother Awards

(Contribution by Sebastian Lisken, EDRi member Digitalcourage)



17 May 2017

Dutch ban on zero-rating struck down – major blow to net neutrality

By Guest author

20 April 2017 was a bad day for net neutrality in the Netherlands, and possibly also in the rest of Europe. The court of Rotterdam struck down the general ban on price discrimination, including zero-rating, as enacted in the Dutch Telecommunications Act. The court held that the categorical ban on price discrimination is “evidently” in violation of the European net neutrality Regulation.

How did we get here?

In its net neutrality guidelines, the Body of European Regulators for Electronic Communications (BEREC) left some room for interpretation as regards the issue of zero-rating. However, in May 2016, the Dutch legislator made it crystal clear that under the net neutrality Regulation price discrimination is prohibited. This was in accordance with the country’s history of upholding a strong net neutrality law, including the ban on zero-rating.

Shortly after the Dutch legislator passed this law, mobile operator T-Mobile launched a new service that enabled subscribers to use certain music streaming services without having to tap into their monthly data package. This zero-rating policy violates an important principle of net neutrality. Providers shouldn’t act as gatekeepers: all traffic should be treated equally, without discrimination, and every service should be accessible online under the same terms. When T-Mobile allows its customers free access to some services but not to others, it creates a strong incentive to use one service over another, even if the alternative might be “better”. For instance, T-Mobile users currently don’t pay for the use of music streaming services Spotify and Tidal, but do pay for the use of similar services offered by Apple and Google. If you have a hard time feeling sorry for Apple and Google, consider this: in the future it is highly likely that large companies that have enough leverage and are able to meet telecoms providers’ terms and conditions will end up on the list of services offered free of cost under zero-rating, while users will still be obliged to pay for using new, smaller, perhaps more innovative or foreign services.

EDRi member Bits of Freedom urged the Dutch telecoms regulator ACM to protect the freedom of end users and enforce the net neutrality rules against T-Mobile. Fortunately, ACM did just that, and decided that T-Mobile’s offering was indeed in violation of net neutrality rules. T-Mobile subsequently appealed this decision and went to the court of Rotterdam.


What did the court say?

The court ruled that the net neutrality Regulation does not categorically prohibit price discrimination, but requires a case-by-case analysis. The court found that the prohibition on discrimination of Article 3(3) applies only to the technical treatment of traffic, but not to the pricing of the internet access service. The categorical ban on price discrimination enacted in the Dutch Telecommunications Act is therefore “evidently” in violation of the Regulation. Although this is a matter of interpretation of European law, the court did not see any reason to refer the case to the Court of Justice of the European Union (CJEU) for a preliminary ruling. The court further remarked that the Dutch legislator did not have the competence to enact any rules determining the scope of the non-discrimination rule of the Regulation, given the direct effect of the Regulation.

Because ACM’s decision against T-Mobile relied solely on its finding that T-Mobile violated the categorical ban on price discrimination, the court limited its ruling to whether such a ban exists under the Regulation or not; the court did not assess whether T-Mobile’s offering violates the net neutrality Regulation irrespective of whether such a ban exists or not.

Next step: The regulator must regulate

If this ruling is not appealed, the strong net neutrality laws we have grown accustomed to in the Netherlands since 2012 will be a thing of the past. This would also set a bad precedent for net neutrality in the rest of Europe. On 11 May, Bits of Freedom therefore sent a letter to ACM urging it to appeal the court ruling, and requesting it to take enforcement measures against T-Mobile based on the fact that their offering still violates the applicable net neutrality rules.


Press release: Dutch government prohibits price discrimination for internet access (17.05.2016)

Translation of Dutch net neutrality provisions (25.05.2016)

Net neutrality wins in Europe! (29.08.2016)

ACM should enforce net neutrality vigorously (only in Dutch, 16.05.2017)

(Contribution by David Korteweg, EDRi member Bits of Freedom, The Netherlands; translation by Tom Rijndorp)



17 May 2017

Big Data for Big Impact – but not only a positive one

By Guest author

Technology has changed, and keeps dramatically changing, our everyday life, transforming human communities into advanced networked societies. To celebrate this digital revolution, 17 May is dedicated to the “World Telecommunication and Information Society Day” (WTISD-17).

The theme for this year’s celebration is “Big Data for Big Impact”. Not so surprisingly, the buzzword “big data” echoes throughout our daily lives online. The chosen theme focuses on harnessing the power of big data to turn complex and imperfect pieces of data into a meaningful and actionable source of information for the social good.

Big data has the potential to improve society – much like electricity or antibiotics. From health care and education to urban planning and protecting the environment, the applications of big data are remarkable. However, big data also comes with big negative impacts. It can be used – by both advertisers and government agencies – to violate privacy. The power of big data can be exploited to monitor every single detail of people’s activities globally.

With 29 million streaming customers, Netflix is one of the largest providers of commercial media in the world. It has also become a treasure trove for advertisers, as it collects data on users’ activities – what, when and where they are watching, what device they are using, and when they fast-forward, pause or stop. Just imagine a representative of Netflix sitting behind your couch, looking over your shoulder and taking notes whenever you turn on the service. The same applies to many other online services, such as Google, Amazon, Facebook or YouTube.

Mass surveillance initiatives by intelligence agencies such as the US National Security Agency (NSA) and the UK Government Communications Headquarters (GCHQ) take this power to the next level, eroding every last bit of personal space. Without big data, profiling at today’s scale would not be possible.

It is very tempting to use the benefits of big data for all sorts of purposes: hiring new employees based on their social media activities, granting insurance based on fitness tracker data, or conducting airport security checks and predicting future crimes based on mobile phone call logs, to name a few. But there are some fundamental problems with applying big data to such services.

The first problem is that, knowingly or unknowingly, we all have biases when making decisions. If the decisions made by millions of employers, police officers or judges over a long period are collected together, all those biases come with them, on a bigger scale. Big data may just be a large mass of unstructured data, but the insights deduced from it rely on machine learning – which absorbs all the biases present in the data, such as those related to gender and race. Algorithmic decision-making could thus turn out to be more biased than ever before, with terrible effects on society.
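
A toy illustration of the point: a model that learns from biased historical decisions simply reproduces the bias, automatically and at scale. The “data” below is entirely invented:

```python
# Invented historical hiring decisions, each (group, hired?).
# The record is biased against group B: 70% vs 40% hire rates.
history = ([("A", True)] * 70 + [("A", False)] * 30 +
           [("B", True)] * 40 + [("B", False)] * 60)

def hire_rate(group):
    """Fraction of past applicants from this group who were hired."""
    decisions = [hired for g, hired in history if g == group]
    return sum(decisions) / len(decisions)

# A naive model that scores applicants by their group's historical
# hire rate inherits the gap unchanged - it has "learned" the bias.
print(hire_rate("A"))  # 0.7
print(hire_rate("B"))  # 0.4
```

Nothing in the code mentions gender or race, yet the output discriminates – which is why testing data sets and models for fairness matters.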

The second problem is error rates: a study on automatic face recognition software found that error rates can vary between 3% and 20%. This means that the next time you go to the airport, your face could match one in a database of potential terrorists, and you could be pulled aside for questioning or get into even more trouble. This happens daily in international airport transit. It is not possible to create models that are 100% accurate, and whenever assumptions are made about missing data, errors are inevitable.
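
A short back-of-the-envelope sketch shows why even a low error rate causes trouble at airport scale. The passenger and watch-list numbers below are illustrative assumptions; only the 3% figure comes from the range cited above:

```python
# Base-rate illustration: when almost everyone screened is innocent,
# even a "good" system produces mostly false alarms.
# All figures are illustrative assumptions, not real airport data.

def expected_matches(passengers, watchlisted, false_positive_rate, true_positive_rate):
    """Return (false_matches, true_matches) for one day of screening."""
    innocents = passengers - watchlisted
    false_matches = innocents * false_positive_rate
    true_matches = watchlisted * true_positive_rate
    return false_matches, true_matches

# Assume 100,000 passengers per day, 10 of them actually on a watch
# list, and the 3% error rate at the optimistic end of the study.
fp, tp = expected_matches(100_000, 10, 0.03, 0.97)
print(f"False matches per day: {fp:.0f}")  # ~3,000 innocent travellers flagged
print(f"True matches per day:  {tp:.1f}")  # fewer than 10 genuine hits
```

Under these assumptions, roughly 300 innocent travellers are flagged for every genuine match – the practical cost of a “mere” 3% error rate.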

Therefore, when dealing with big data, it is crucial to be extremely cautious about the quality and sources of the data, as well as about who can access it, and to what extent. If a data set stemming from diverse sources is handled with special care and anonymised thoroughly to protect privacy rights, big data can be used to solve complex societal problems. But if it is left unregulated or not properly regulated, and not tested for its fairness and biases, it can pose a serious threat to our human rights and fundamental freedoms.

EDRi has fought for the EU General Data Protection Regulation (GDPR) to regulate this practice. Now EU Member States are implementing the GDPR, and it is up to them not to abuse the weak points of the Regulation to undermine the protection of European citizens’ data.

Video by EDRi member Privacy International: Big Data

Creating a Big Impact with Big Data

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)