13 Mar 2019

What will happen to our memes?

By Bits of Freedom

In Europe, new rules on copyright are being created that could fundamentally change the internet. The upload filters included in the proposed EU Copyright Directive raise concerns about the consequences for our creativity online. Will everything we want to post to the internet have to pass through “censorship machines”? If the proposed Directive is adopted and implemented, what will happen to your memes, for example?

The proposal that will shortly be voted on by the European Parliament contains new rules on copyright enforcement. Websites would have to check every upload made by their users for possible breaches of copyright, and block this content when in doubt. Even though memes are often extracted from a movie, a well-known photo or a video clip, advocates of the legislation repeat time and again that this doesn’t mean memes will disappear − they reason that exceptions will be made for them. In practice, however, such an exception does not seem workable and impairs the speed, and thus the essence, of memes. It will be impossible for an automated filter to capture a meme’s context.

Step 1: You upload a meme

Imagine that you’re watching a series and you see an image that you would like to share with your friends − something funny or recognisable to a large group of people. Or imagine that you use an existing meme to illustrate a post on social media. Maybe you adjust the meme with the names of your friends or the topic that concerns you at that moment. Then you upload it to YouTube, Twitter or another online platform.

Step 2: Your upload is filtered

If the new Directive – as currently proposed – is implemented, platforms will be obliged to prevent any copyrighted material from appearing online. To comply with the legislation, they will install automated filters that compare all material uploaded to the platform with all known copyrighted material. If there is a match, the upload will be blocked. This will also happen to the meme you intended to share online, because it originates from a television series, video clip or movie. You get the message: “Sorry, we are not allowed to publish this.”
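In simplified terms, such a filter reduces every upload to a fingerprint and compares it against a database of fingerprints supplied by rightsholders. The sketch below is purely illustrative − the function names and the fingerprint database are hypothetical, and real systems (such as YouTube’s Content ID) use far more sophisticated perceptual matching rather than exact hashes:

```python
import hashlib

# Hypothetical database of fingerprints supplied by rightsholders.
COPYRIGHT_FINGERPRINTS = {
    hashlib.sha256(b"frame-from-a-famous-movie").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Reduce an upload to a comparable fingerprint (here: a plain hash)."""
    return hashlib.sha256(data).hexdigest()

def filter_upload(data: bytes) -> str:
    """Block anything that matches known copyrighted material."""
    if fingerprint(data) in COPYRIGHT_FINGERPRINTS:
        return "Sorry, we are not allowed to publish this."
    return "Published."
```

Note what is missing from this sketch: any notion of context. A meme cut from that movie frame matches the fingerprint just as a full pirated copy would, and nothing in the matching step can tell parody, quotation or commentary apart from infringement − which is exactly the problem described in this article.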

Step 3: It’s your turn

What!? What about the exception that was supposed to be there for memes? Of course the exception is still there, but in practice it’s impossible to train filters to recognise the context of every image. How does a filter know what is a meme and what isn’t? How do these filters keep learning about the new memes that appear every day? There are already many examples of filters that fail. Hence, you’ll need to get to work. Just as you can appeal an online platform’s decision when it has wrongfully blocked a picture for depicting “nudity” or “violence”, you will be able to appeal when your meme doesn’t pass the filter. That probably means filling in a form in which you explain that it’s just a meme and why you think it should be allowed to be uploaded.

Step 4: Patience, please

After the form is filled in and you click “send”, all you can do is wait. Just as is already the case with YouTube’s and Facebook’s filters, incorrectly filtered posts need to be checked by real human beings − people who can assess the context and hopefully conclude that your image really is a meme. But that process can take a while… It’s a pity, because your meme responded perfectly to current events. Swiftness, creativity and familiarity are three key elements of a meme. With upload filters, to keep the familiarity, you lose the swiftness.

Step 5: Your meme will still be posted online − or not?

At a certain moment, you receive a message. Either your upload has finally been accepted, or there might still be enough reasons to refuse it. And then what? Will you try again on another platform? That might take some days as well. The fun and power of memes often lie in the speed with which someone responds to a politician’s proposal or an answer in a game show. So don’t let Article 13 destroy your creativity!

#SaveYourInternet as we know it! Call a Member of the European Parliament (for free) through pledge2019.eu!

Bits of Freedom

What will happen to our memes? (11.03.2019) https://www.bitsoffreedom.nl/2019/03/11/what-will-happen-to-our-memes/

What will happen to our memes? (only in Dutch, 04.03.2019) https://www.bitsoffreedom.nl/2019/03/04/wat-gebeurt-er-straks-met-onze-memes/


Save Your Internet

(Contribution by Esther Crabbendam, EDRi member Bits of Freedom, the Netherlands; translation by Winnie van Nunen)

27 Feb 2019

You cannot post “a bag of bones” on Facebook

By Bits of Freedom

However shocking our reality may be, sometimes you have to face it. By censoring a news article about the horrific war in Yemen, Facebook completely disqualifies itself as a platform for public debate.

This story should be heard

“Chest heaving and eyes fluttering, the 3-year-old boy lay silently on a hospital bed in the highland town of Hajjah, a bag of bones fighting for breath.” This is the first sentence of an article by the New York Times about the war in Yemen. But the article actually starts with a photo. Below the headline and above this first paragraph a picture of the seven-year-old Amal Hussain fills the screen. The picture is harrowing.

The article tells of the horrors of the unimaginable humanitarian disaster that is taking place in Yemen. For the third time in 20 years the United Nations is about to officially speak of famine. This story must be told and heard, no matter how painful it may be.

Censorship, censorship, censorship

That was also the opinion of freelance journalist Shady Grove Oliver, who shared the New York Times article with her followers on Facebook. Soon the post was removed because it was supposedly in violation of Facebook’s Community Standards. Why? The photo accompanying the newspaper article contained “nudity or sexual activity”, according to Facebook.

The journalist pointed out this shameful mistake to Facebook, but the platform stuck to its decision. Persevering, Grove Oliver asked for a real review by an actual human being. In a message to Facebook, she referred to an article by the editors of the New York Times in which the newspaper accounts for its decision to confront readers with the shocking images. Facebook still refused to reconsider its decision and instead blocked Grove Oliver’s entire account. Only hours later were the account and the posts restored.

Fauxpologies from Facebook

On the same day as the article, the New York Times published an extensive piece explaining why it made the difficult decision to publish these photographs. “This is our job as journalists: to bear witness, to give voice to those who are otherwise abandoned, victimized and forgotten.” This stands in stark contrast to the way Facebook dealt with this important story. Firstly, Facebook’s content moderation policy is apparently so blunt that it confuses photos of emaciated children with “nudity or sexual activity”.

Secondly, once Facebook realised its mistake, the journalist received the usual clumsy fauxpologies from the company. The apologies were shown in a screen entitled “Warning”, followed by a text indicating that Grove Oliver had to confirm that she “understood”. No explanation of how this could have happened, how bad Facebook thinks it is, or what it learned from it. And, yes, what happened next is unfortunately no surprise: a few hours later another of Grove Oliver’s posts was censored.

Facebook disqualifies itself (again)

It isn’t the photos of children that are shocking, but what is happening to these children in Yemen. And we must be confronted with that story. However painful it may be, we should not look away from it en masse. Reality is often harsh. It isn’t bad at all that we are confronted with it from time to time. It isn’t bad at all that it sometimes makes us feel a little queasy. That confrontation, that queasy feeling, are sometimes the driving force behind change. That’s why the New York Times writes: “we are asking you to look.”

Facebook’s mission is to bring “the world closer together”. But how do we get closer to each other if we are not allowed to see the suffering of others? When images of children who are victims of a horrible war are simply brushed away? Don’t these children belong to “the world” Facebook has in mind? And why do we still have faith in a company that cannot distinguish famine from sex? Or indeed: one that might not even want to?

Once again Facebook has completely disqualified itself as a place for public debate. With its dominant position, the company stands in the way of a critical view of the atrocities of our time. We urgently need to review how we want to communicate with each other.

You cannot post “a bag of bones” on Facebook (only in Dutch, 19.12.2018)

The New York Times: The Tragedy of Saudi Arabia’s War (26.10.2018)

Tweets by Shady Grove Oliver (16.12.2018) https://twitter.com/ShadyGroveO/status/1074426791736107019

Why We Are Publishing Haunting Photos of Emaciated Yemeni Children (26.10.2018)

(Contribution by Evelyn Austin and Rejo Zenger, EDRi member Bits of Freedom, the Netherlands; translation from Dutch to English by Martin van Veen)



13 Feb 2019

Time for better net neutrality rules

By Bits of Freedom

A Dutch court struck a blow against strong net neutrality protections. According to the court, the mobile operator T-Mobile may continue to provide certain music services with preferential treatment to its customers in the Netherlands − a disappointing judgment showing the need for better rules.


T-Mobile has thrown the principle of net neutrality overboard with its “Data-Free Music” service. The service gives certain music streaming services preferential treatment over other services, as long as they fulfil the conditions set by T-Mobile. This practice is called “zero rating”. To get preferential treatment, the provider of a service must not only fit the mold of a “music streaming service” as defined by T-Mobile, it must also meet legal and technical requirements, again set by T-Mobile. This means that music streaming services that do not make it onto this list are at a disadvantage compared to their listed competitors.

In 2018, Dutch EDRi member Bits of Freedom appealed the decision of the national regulatory authority (ACM) not to act against T-Mobile’s “Data-free Music” service. The administrative court of first instance ruled in favour of T-Mobile: the service does not violate the net neutrality rules and the ACM does not have to act.

Unfortunately, for procedural reasons, the court did not reach a substantive judgment on the first part of the appeal. Bits of Freedom argued that the European net neutrality rules prohibit the preferential treatment of traffic from certain services by not charging this traffic to users. The court deferred to its previous judgment about this service in a case between T-Mobile and the ACM. In that judgment, it ruled that the prohibition on unequal treatment of traffic is limited to the technical treatment of traffic; the economic treatment of traffic is not covered. The court considered that ruling to also bind it in the case Bits of Freedom appealed.

The non-discrimination principle contained in the European net neutrality rules should apply to the treatment of traffic, regardless of whether that treatment involves delaying or blocking traffic or applying a different price to it. It is true that certain forms of differential technical treatment of traffic (so-called “traffic management”) are admissible under the net neutrality rules, but this does not automatically mean that the general standard to treat traffic equally is limited to the technical treatment of traffic.

In the second part of the appeal, Bits of Freedom explains why the zero rating service “Data-Free Music” limits the rights of end users and is therefore in violation of the net neutrality rules. An essential part of net neutrality is that an internet user is free to determine which information, services or applications they use, without interference by an internet access provider. T-Mobile is doing exactly that: it influences how certain services are treated for economic reasons. The ACM did not agree with this argument, and the court of first instance unfortunately upheld the ACM’s decision.

This judgment makes it clear that the current interpretation of the European net neutrality rules by the ACM and the Dutch court is a step backwards compared to the net neutrality rules that applied in the Netherlands from 2012 until the European law came into effect. Under those rules, internet access providers could not treat services unequally by charging different prices for data traffic. Dutch internet users are therefore currently less protected against practices that undermine the open and innovative nature of the internet.

In 2019, the Body of European Regulators for Electronic Communications (BEREC) will review the guidelines that protect net neutrality. The current judgment shows that it is essential that the application of the rules on zero rating, and ultimately the rules themselves, is improved. Only this will ensure strong net neutrality in Europe.

Bits of Freedom

Time for better net neutrality rules (06.02.2019)

Bits of Freedom’s court case about zero rating (06.08.2018)

Judgment of the administrative court of first instance (only in Dutch, 24.01.2019)

T-Mobile treats everyone equally unequally (21.02.2018)

(Contribution by David Korteweg, EDRi member Bits of Freedom, the Netherlands)



28 Jan 2019

Period tracker apps – where does your data end up?

By Bits of Freedom

More and more women use a period tracker: an app that keeps track of your menstrual cycle. However, these apps do not always treat the intimate data that you share with them carefully.


An app that notifies you when to expect your period or when you are fertile can be useful, for example to predict when to expect the side effects that, for many women, come with being on your period. In itself, keeping track of your cycle is nothing new: putting marks in your diary or on your calendar has always been an easy way to take your cycle into account. But sharing data on the workings of your body with an app is riskier.

There seems to be quite a large market for period tracker apps. From “Ladytimer Maandstonden Cyclus Kalender” to “Magic Teen Girl Period Tracker”, from “Vrouwenkalender” to “Flo” – all neatly lined up in different shades of pink in the app store. “Femtech” is seen as a growing market, in which startups have raised billions in investment over the last couple of years. Are these apps made to give women more insight into the workings of their bodies, or to monetise that need?

It’s interesting to look at the kind of data these apps collect. The app usually opens with a calendar overview. In the overview you can input the date of your last period. In addition, you can keep a daily record of how you feel (happy, unhappy, annoyed) and whether you experience blood loss. But for most of these apps it doesn’t end there. Have you had sex? And if so, with or without protection? With yourself or with another person? How would you grade the orgasm? Did you have a stomach ache? Were your bowel movements normal? Did you feel like having sex? Sensitive breasts? An acne problem? Did you drink alcohol? Exercise? Did you eat healthy?

For a number of these questions it is understandable why answering them might be useful, if the app is to learn to predict what stage of your cycle you are in. But a lot of these questions are quite intimate. And all this sensitive data often seems to end up in the possession of the company behind the app. The logical question then is: what exactly does a company do with all the data you hand over? Do you have any say in that? Is it treated carefully? Is it shared with other parties?

After digging through a number of privacy statements, it appears that one of the most used apps in the Netherlands, “Menstruatie Kalender”, gives Facebook permission to show in-app advertisements. It’s not clear what information Facebook gathers about you from the app in order to show you advertisements. Does Facebook, for example, get information on when you are having your period?

Another frequently used app in the Netherlands is “Clue”. It’s the only one we found with a comprehensive and easily readable privacy statement. You can use the app without creating an account, in which case your data is stored solely locally on your phone. If you do choose to create an account, you give explicit consent to share your data with the company. In that case it is stored on secure servers and, with your consent, also used for academic research into women’s health.

The same cannot be said of many other apps. Their privacy statements are often long and difficult to read, and require good reading-between-the-lines skills to understand that data is being shared with “partners”. The sensitivity of your breasts may not in itself be very interesting to an advertiser, but by keeping track of your cycle the apps automatically acquire information on the possible start of one of the periods of your life most interesting to marketers: motherhood.

The most extreme example is Glow, the company behind the period tracker app “Eve”. Their app is focused on the potential desire to have children. The company’s tagline is as straightforward as they come: “Women are 40% more likely to conceive when using Glow as a fertility tracker”. Besides Eve, Glow has three other apps: an ovulation and fertility tracker, a baby tracker and a pregnancy tracker. The apps link to the Glow-community, a network of forums where hundreds of women share their experiences and give each other tips.

But that’s not the only thing Glow offers. You can’t use a Glow webpage or app without being shown the “Fertility Program”. For 1,200 to 7,000 euros, you can enroll in one of several fertility programs. Too expensive? You can take out a cheap loan through a partnership with a bank. And in the end, freezing your eggs in your early thirties is the most economically viable option, according to the website.

It turns out that Glow is a company selling fertility products, which has built a number of apps to subtly (and sometimes not so subtly) attract more female customers. As a consumer you think you are using an app to keep track of your cycle, but in the meantime you are constantly notified of all the possibilities of freezing your eggs, the costs of pregnancy at a higher age, and your limited fertile years. Before you know it, you are lying awake at age 30, wondering whether it would be more “economical” to freeze your eggs.

These apps shed light on what seems to be a contract to which we are forced to consent more and more often. In exchange for the use of an app that makes our lives a little bit easier, we have to give away a lot of personal information, without knowing exactly what happens with it. The fact that these apps deal with intimate information doesn’t mean that the creators treat it more carefully. To the contrary: it increases the market value of that data.

So before you download one of these apps, or advise your daughter to download one, think again. Take the time to read an app’s privacy statement, so you know exactly what the company does with your data. But there is also a responsibility for regulatory bodies, such as the Autoriteit Persoonsgegevens in the Netherlands, to ensure companies don’t abuse your intimate data.

Are you using one of these apps, and do you want to know which data the company has gathered about you, or have that data erased? You can easily draw up a request to send by post or email using My Data Done Right.

Bits of Freedom

Who profits from period trackers? (25.01.2019)

Who benefits from cycle trackers? (only in Dutch, 03.12.2018)

(Contribution by EDRi member Bits of Freedom; translated from Dutch by volunteer Axel Leering)



16 Jan 2019

We can no longer talk about sex on Facebook in Europe

By Bits of Freedom

Sometime in late 2018, Facebook quietly added “Sexual Solicitation” to its list of “Objectionable Content”, without notifying its users. This is quite remarkable, to put it mildly, as for many people sex is far from a negligible part of life.

The company writes that it draws a line “when content facilitates, encourages or coordinates sexual contact between adults”. A selection of what isn’t allowed (translated from the Dutch-language Community Standards):

“Content that includes an implicit invitation for sexual intercourse, which can be described as naming a sexual act and other suggestive elements including (but not limited to):
– vague suggestive statements such as: ‘looking forward to an enjoyable evening’
– sexual use of language […]
– content (self-made, digital or existing) that possibly portrays explicit sexual acts or a suggestively positioned person/suggestively positioned persons.

Content in which other acts committed by adults are requested or offered, such as:
– commercial pornography
– partners that share fetishes or sexual interests”

It is unclear what caused this change. The most obvious explanation is new legislation that entered into force at the beginning of last year in the United States. The “Fight Online Sex Trafficking Act” and the “Stop Enabling Sex Traffickers Act” (FOSTA/SESTA) hold companies accountable for sex work ads on their platforms. Craigslist, among others, took its “Personals” section offline, and Reddit blocked a couple of sex work-related subreddits. Facebook’s new policy can likewise be seen as a response to this legislation. The broad formulation of the criteria for what isn’t allowed is a precaution: Facebook chooses to err on the side of caution and over-censor, rather than risk the consequences of hosting illegal content.

Facebook boasts about connecting people, but in reality, the company increasingly frustrates our communication. There’s no question that such vaguely formulated rules combined with automated content filters will lead to more arbitrary censoring. But what this incident illustrates, more than anything, is that Facebook is thwarted by the scale at which it operates, and chooses to offload the cost of scale, namely arbitrary censorship and diminished freedom of expression, onto European users. It’s inconceivable that new legislation passed in the US means that in many European countries, if not all, one consenting adult can no longer ask another consenting adult if they want to have sex. Or, for that matter, get in touch with other people over shared fetishes or fantasies, or exchange information about safe sex.


This impacts all European citizens, and is particularly problematic in the case of people who don’t identify with the traditional, heteronormative perspective of sex and turn to the internet for alternatives. In addition, sex workers are affected disproportionately. Sex workers often use online platforms for contacting clients and in order to exchange tips and information. Proud, an interest group for Dutch sex workers, spoke out against the new legislation in 2018 because it would (further) marginalise sex work. Facebook’s new policy demonstrates that these fears weren’t unfounded.

European countries, like all others, work hard in order to uphold their values. Many of these countries find it important that one can speak openly about sex and sexuality. In the Netherlands, significant efforts are made in order to protect and improve sex workers’ rights. Facebook’s policy thwarts these endeavours. It is unacceptable that we find ourselves in a situation in which legislation from another country has such a big impact on our societies. Is Facebook’s bottom line so important to Europe that we are willing to part with the rights and freedoms we’ve fought so hard to achieve?

In Europe we can no longer talk about sex on Facebook (only in Dutch, 13.12.2018)

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands; translation by Winnie van Nunen)



09 Jan 2019

Bits of Freedom announces winner of privacy award

By Bits of Freedom

The Dutch Big Brother Awards will take place on 22 January 2019 in Amsterdam, the Netherlands.

This year’s distinguished winner of the Felipe Rodriguez Award is Kirsten Fiedler, Managing Director of European Digital Rights. With this award, Dutch digital rights organisation and EDRi member Bits of Freedom recognises people and organisations who have made a remarkable contribution to our right to privacy in the digital age. Previous winners include Kashmir Hill, Open Whisper Systems, Max Schrems and Edward Snowden. The award ceremony will take place on 22 January 2019.

Photo: Jason Krüger

Kirsten Fiedler is Managing Director* of European Digital Rights (EDRi), an umbrella organisation of digital rights groups that advocates at the EU level for the protection of privacy, security and freedom of expression online. Thanks to Fiedler’s contribution, over the past eight years EDRi has grown into a highly regarded organisation with nine team members and 39 member organisations.

Increasingly, the rights and restrictions of European internet users are negotiated and decided at the EU level. It is therefore essential that there is a strong organisation in Brussels advocating for our human rights. Thanks to Fiedler, EDRi has become that organisation. Residents of all member states benefit from its work every day.

– Hans de Zwart, Executive Director of Bits of Freedom.

Kirsten Fiedler will accept the award on Tuesday 22 January 2019 during a ceremony in Amsterdam. Besides the Felipe Rodriguez Award, Bits of Freedom will award the Audience Award and the Expert Award to the biggest privacy violators of 2018. Tickets can be obtained through www.bigbrotherawards.nl.

What others say about Fiedler’s nomination

“We have lived in Internet long enough to stop calling it ‘new technology’. Yet, we are facing new problems that cannot be solved with old narratives and corrupted political compromises. EDRi, co-led by Kirsten in recent years, is at the front line of all important political battles in Brussels, pushing new narratives and proposing solutions that can actually work.”

– Katarzyna Szymielewicz, Panoptykon Foundation

“EDRi is the first line of defense for digital rights in Europe and beyond. The decisions made by European policymakers have repercussions for individuals the world over, and so it is vital that we have a strong organization like EDRi working to protect our rights online. Since 2011, Kirsten’s contribution has been essential to that effort.”

– Jillian York, Electronic Frontier Foundation (EFF)

“Kirsten is the only person who ever worked for EDRi that didn’t apply for the job. Her passion and drive to fight for our human rights were so clear that I asked her – with zero job security and mediocre pay – to leave her secure job and come to work for EDRi. From that day to this, she was directly or indirectly key to all of EDRi’s successes.”

– Joe McNamee, former Executive Director of European Digital Rights (EDRi)


* From the beginning of 2019, Kirsten has taken over a new area of responsibilities, and now works as Senior Policy and Campaigns Manager.

20 Dec 2018

EDRi Awards 2018


With great solemnity, EDRi presents the first ever 5th edition of our annual awards.

The “Humpty Dumpty Award” for the most silly “statistics”

This Award goes to IAB Europe, for conflating the Google-Facebook duopoly with publishers in order to lobby against ePrivacy. Of course, both companies usually do everything they can to avoid being categorised as publishers, but for the IAB it was nevertheless convenient to boast about the “economic value of behavioural advertising” (in other words, unsolicited stalking) by including the revenues of its biggest clients in the larger statistic. Next time, IAB, maybe let us know to whom the money goes?

The Springer Award for WTF

This prestigious Award goes to the EU Parliamentarians who are trying to introduce laws at EU level that have already dramatically failed at the national level and are doomed to be disapplied.

Another BREAKING NEWS from the CJEU: AG Hogan advises Court to rule that German press publishers’ right should be disapplied due to lack of preventive notification to EU Commission "Advocate General Hogan: the Court should rule that the new German rule prohibiting search engines from providing excerpts of press products without prior authorisation by the publisher must not be applied."

The cranial fracture facepalm Award

This year’s cranial fracture facepalm Award goes to…surprise… Facebook! Well deserved, because the company:

  • lost the data of hundreds of millions of people
  • ruined a few elections
  • had a data breach which let anyone into millions of people’s accounts
  • and still wants to put an online webcam in your kitchen

Need we say more?

"Today we're excited to introduce @PortalFacebook to everyone. Come say hi and check out http://portal.facebook.com to learn more."

The “rules of engagement” Award for outstanding courtesy in political discussions

Being blocked by the UN Special Rapporteur on freedom of expression is surely an achievement to behold. This year’s first ever “rules of engagement” Award goes to David Lowery, writer for the copyright advocacy website “The Trichordist” and a highly vocal commentator on this year’s discussions of the Copyright Directive proposal. We value a good exchange of arguments. We are, however, sometimes more than a bit surprised by the tone and language that the rightsholder lobby deems appropriate.

"Academia? Experts? Yeah right. Don’t know jack shit about how copyright holders are exploited by internet firms. Don’t wanna know. I offered that UN Rapporteur to come and spend some time with me combatting illegal uploads with me. He blocked my twitter account. Fuck you all!"

Positive EDRi Awards

On a more serious note, we should also spare a thought for the wonderful people that are doing wonderful work at a difficult time.

The new “Old Hero” Award

This year, instead of granting the traditional New Hero Award, we’d like to introduce a new Old Hero Award, to show our thankfulness and respect to our old (young) Executive Director Joe McNamee for his tireless efforts and achievements protecting digital rights in Europe.

We know you are reading this (and quietly correcting our grammar in your mind), so we just want to tell you that, as such, we’ll do our very best to make sure that your spirit carries on!

The heroes who keep us energised Award

We cannot name everybody, including last year’s awardees, but here are some of the stars that are worth highlighting:

  • Female digital rights heroes in the Parliament: MEPs Sippel, in’t Veld, Schaake, Reda, Ernst, Sargentini
  • Wolfie Christl for his research
  • EDRi member Bits of Freedom for their work and contributions to the defence of digital rights in Europe
  • The thousands of people who voiced their concerns against online censorship and contacted their Parliamentarians during the copyright votes – and were then insulted as being bots

Finally, we want to recognise the amazing work that all of our members and other digital rights activists are doing in Europe and around the world.

The Shortlist

The following Awards were shortlisted this year but did not quite make it to the top:

The Humpty Dumpty Award for unsuccessful filters

Tumblr, for the launch of its “adult content” filter – it’s so bad at its job that it flagged the company’s own examples of acceptable nudity… Algorithmic filtering just.never.works.

The WTF pop culture surveillance award

Taylor Swift fans who went to her Rose Bowl show on 18 May were unaware of one crucial detail: A facial-recognition camera inside the display was taking everyone’s photos. The images were being transferred to a server, where they were cross-referenced with a database of hundreds of the pop star’s known stalkers.

The cranial fracture facepalm Award

The cranial fracture facepalm Award 2018 almost went to Axel Voss MEP again – for the third time – for his comment on the proposal to grant extra rights for the filming of sports events: “This was a kind of a mistake by the JURI Committee, I think, someone amended these, nobody has been aware of these, and then all of a sudden…”

More shockingly, he does not even seem to know which companies will be covered by his proposals.

Notable Publications

Did you like them? Please check previous EDRi Awards:

EDRi awards 2017
EDRi awards 2016
EDRi awards 2015
EDRi awards 2014


21 Nov 2018

ENDitorial: Facebook can never get it right

By Bits of Freedom

In 2017, a man posted live footage on Facebook of a murder he was committing. The platform decides whether you get to see this shocking footage or not – an incredibly tricky decision to make. And not really the kind of decision we want Facebook to be in charge of at all.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

I didn’t actually see the much-discussed footage of the murder – and I really don’t feel the need to see it. The footage will undoubtedly be shocking, and seeing it would without doubt leave me feeling very uncomfortable. When I close my eyes, I unfortunately have no trouble conjuring up the picture of Michael Brown after he had just been shot. Or the footage of the beheading of journalist James Foley. The thought of it is enough to make me sick.

Should these kinds of images be readily available? I certainly can’t think of a straightforward answer to this question. I would even argue that those who claim to know the answer are overlooking a lot of the finer nuances involved. The images will, of course, leave the bereaved family and friends reeling in pain. Every time one pops up somewhere, they will have to go through it all over again. You wouldn’t want people to accidentally come across such images either: not everyone will be affected equally, but the images are distressing nonetheless. No one remains indifferent.

That said, I still have to admit that visuals are sometimes essential in getting a serious problem across. A while back I offered a journalist some information that we both agreed was newsworthy, and we also agreed it was important to bring it to people’s attention. Even so, his words were: “You’ve got a smoking gun, but where is the dead body?”. I didn’t realise then that this sometimes needs to be taken very literally. Sometimes, the photographs or footage of an awful event can act as a catalyst for change.

Without iconic images such as the one of Michael Brown’s body, discrimination by police in the United States might not have been given much attention. And we would probably have never seen reports on the countless mass demonstrations that ensued. The fact we’re not forgetting about the Vietnam War has something to do with a single seminal photograph. Had we never seen these images, they could never have made such a lasting impact, and the terrible events that caused them would not be as fresh in our collective memory today.

I have no doubt that these images sometimes need to be accessible – the question is when. When is it okay to post something? Should images be shared straight away or not for a while? With or without context, blurred or in high definition? And perhaps most importantly: who gets to decide? Right now, an incredible amount of power lies with Facebook. The company controls the availability of news items for a huge group of users. That power comes with an immense responsibility. I wouldn’t like to be in Facebook’s shoes, as Facebook can never get it right. There is always going to be someone among those two billion users who will take offence, and for legitimate reasons.

But there, for me at least, lies part of the problem – and maybe also part of the solution. Facebook decides for its users what they get to see or not. Many of the questions floating around about Facebook’s policy would be less on people’s minds if Facebook wasn’t making important decisions on behalf of its users, and if instead users themselves were in control. The problem would be less worrisome if users actually were given a choice.

One way to make that possible is to go back to a system where you can choose between a large variety of providers of similar services. Not one Facebook, but dozens of Facebooks, each with its own profile. Compare it with the newspapers of the past. Some people were satisfied with a subscription to The New York Times while others felt more at home with The Sun. And where the editors of one newspaper would include certain images on its front page, the editors of another newspaper would make a different choice. As a reader, you could choose what you subscribed to.

But even without such fundamental changes to the way the internet is set up, users might be able to get more of a say – for instance if they can do more to manage their flood of incoming messages. Get rid of certain topics, or favour messages with a different kind of tone. Or prioritise messages from a specific source if they are the only ones writing about something. Users may not even have to make all those decisions by themselves if instead they can rely on trusted curators to make a selection for them. And even though that sounds quite straightforward, it really isn’t. That one interface has to accommodate those same two billion users, and shouldn’t create any new problems – like introducing a filter bubble.
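As a thought experiment, the kind of user-side control described above could be as simple as a preference object the user owns, applied to the incoming stream. The sketch below is purely illustrative – the `Post` and `FeedPreferences` structures and the filtering rules are invented for this example, not any platform’s actual mechanism:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    source: str
    topic: str
    text: str

@dataclass
class FeedPreferences:
    """User-owned settings: the user, not the platform, decides what surfaces."""
    muted_topics: set = field(default_factory=set)
    prioritised_sources: set = field(default_factory=set)

def apply_preferences(posts, prefs):
    """Drop muted topics, then float prioritised sources to the top."""
    visible = [p for p in posts if p.topic not in prefs.muted_topics]
    # Stable sort: prioritised sources (key False) come before the rest (key True).
    return sorted(visible, key=lambda p: p.source not in prefs.prioritised_sources)

posts = [
    Post("tabloid", "gossip", "..."),
    Post("local-paper", "council", "..."),
    Post("friend", "gossip", "..."),
]
prefs = FeedPreferences(muted_topics={"gossip"}, prioritised_sources={"local-paper"})
filtered = apply_preferences(posts, prefs)
```

The point of the design is that the preferences live with the user (or a trusted curator acting on their behalf), not in an opaque ranking system controlled by the platform.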

So what are we supposed to do about that shocking murder footage? I really wouldn’t know. There is no straightforward, definitive answer to that question. But one thing is very clear: it is not a good idea to leave these kinds of decisions to a major tech company that holds too much power and does not necessarily share our interests. One way out would be to give users more choice, and consequently more control, over what they see online.

Facebook can never get it right (20.11.2018)

Bits of Freedom

(Contribution by Rejo Zenger, EDRi member Bits of Freedom, the Netherlands; translation from Dutch by Marleen Masselink)



21 Nov 2018

Whom do we trust with the collective good?

By Bits of Freedom

Wittingly and unwittingly, we increasingly leave the care of society to tech companies. This trend will prove detrimental to us.

In search of gentrification

Gentrification is a process in which a neighbourhood attracts more and more well-to-do residents, gradually driving out the less affluent. The Dutch designer Sjoerd ter Borg, collaborating with Radboud University Nijmegen among others, is researching whether we can use technology to recognise this urban process from large quantities of visual information.

One of the first projects originating from this research makes use of Google Street View archives. While looking for indications of gentrification in Seoul, South Korea, Ter Borg came across the beach umbrella. Street vendors use these umbrellas to stand out while staying in the shade. Beach umbrellas have long dominated the streets of Seoul, but they are disappearing from gentrified neighbourhoods. Because Google photographs streets at regular intervals and Street View has the option to go “back in time”, it makes for a perfect visualisation of this phenomenon. This enabled Ter Borg to create the intriguing film Beach Umbrella that touches on urgent questions concerning the access to and use of data.
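The core of such an analysis is conceptually simple: run an object detector over archived street imagery grouped by year and watch the counts fall. The sketch below is a stand-in illustration, not Ter Borg’s actual pipeline – the detector and the data are hypothetical placeholders:

```python
def count_detections(images_by_year, detect):
    """Tally, per snapshot year, how many images a detector flags.

    detect(image) -> True if the object of interest (here, a beach
    umbrella) is found in the image.
    """
    return {year: sum(1 for img in imgs if detect(img))
            for year, imgs in sorted(images_by_year.items())}

# Stand-in archive: labels instead of real imagery, for illustration only.
archive = {2012: ["umbrella", "umbrella", "shopfront"],
           2015: ["umbrella", "cafe"],
           2018: ["cafe", "boutique"]}

trend = count_detections(archive, lambda img: img == "umbrella")
# A count declining across snapshots would hint at gentrification.
```

In a real project the detector would be a trained image-recognition model and the archive would be time-stamped Street View panoramas – which is precisely why access to that archive matters.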

Data on the present means control over the future

Whether you want to virtually walk the streets of Seoul or check out the gardens of a Yorkshire village: for the best results, you’ll have to go to a US company. Of course there are other initiatives, such as the data portal Amsterdam City Data made by the City of Amsterdam, that collect and connect data and make it available. The scale on which this is done, however, is incomparable. The Google Street View archives are unequalled. Moreover, Google enhances the data collected by its Street View cars with, for instance, satellite images. Local authorities provide the company with information on the design of public space; public transport companies provide timetables and real-time information on disruptions. The mobile phones we all carry with us serve as useful sensors for Google.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

And the result? If Google were to go into real-estate development, it would have a head start. If it were to make a bid for Amsterdam’s public-transport service, without a doubt the company could do it more efficiently than the city’s own transport company. Thanks to Google’s access to immense quantities of data combined with the unlimited capacity for making complex analyses, it can make better-founded assumptions regarding future developments. And that’s the most painful bit: as a logical consequence of our blind faith in big data, the future will be shaped by the parties with the most data and the greatest computing power at their disposal.

Increasingly, public data is in private hands

More and more, those powerful parties are private companies. It’s a disturbing thought that Google has more data on Amsterdam’s urban development than the city itself. Not only do other companies find it increasingly difficult to compete with Google, but the public sector is being left behind as well. We are moving toward a situation in which more and more of our public data is privately owned – and leased back to us under commercial terms.

An untenable situation, once you realise that Google will be the one to decide which data should be collected and which part of it should be disclosed to whom and under what conditions. Should Google be unhappy with the direction a certain research project is taking, or with the products and services being constructed on the basis of “its” data, it can simply shut down access to those data. If it turns out that Google’s interests conflict with citizens’ interests, we have no democratic means to hold it accountable or influence plans as we would with local government. If we are not careful, collective means to cultivate the collective good will become increasingly scarce.

The future is nearer than you think

Do you think these are problems of the distant future? Think again. In Canada right now there is a vehement discussion about the development of Toronto, where Google’s sister company Sidewalk Labs is developing a “smart neighbourhood”.

Everything that has been going wrong for years wherever Google steps in is also going wrong in Toronto. One after another, privacy experts are leaving the project disillusioned; inhabitants don’t get a say in what is happening, nor will they be able to opt out once it’s done; all the data generated for the project seems to become the property of Sidewalk Labs, enhancing the Google family’s hegemony even further.

The promise of data blinds us to its shortcomings

A little digression. During his stay in Seoul, Ter Borg picked the beach umbrella as the symbol of a non-gentrified streetscape. An easily recognisable object that appeals to the imagination of people outside the South Korean capital as well. It does, of course, strongly simplify a complex and layered process. Collected data, and the data deduced from it, may approach reality, but are not reality. In the context of Ter Borg’s project that’s not a problem; in the context of decisions on urban problems and developments, it is. Do we really trust the Googles of the world to build our future on the basis of an illusion?

What are the consequences for our society?

Having more data, or “as much data as Google”, is not the solution. What is needed is a broader vision of data collection and use – for what purpose, under which conditions, by whom, what data – and of the special nature of public data. If we believe that there are public matters where the interests of society outweigh commercial interests, then we must also protect the data related to those public matters.

Whom do we trust with the collective good? (only in Dutch, 15.11.2018)

Google’s “Smart City of Surveillance” Faces New Resistance in Toronto (13.11.2018)

City of Amsterdam dataportal (only in Dutch)

Bits of Freedom

(Contribution by Evelyn Austin, EDRi member Bits of Freedom)



07 Nov 2018

My Data Done Right launched: check your data!

By Bits of Freedom

On 25 October 2018 EDRi member Bits of Freedom launched My Data Done Right – a website that gives you more control over your data. From now on you can easily ask organisations what data they have about you, and ask them to correct, delete or transfer your data.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Use your rights

On 25 May 2018, new privacy rules entered into force in Europe. Based on these rules you have several rights that help you to get more control over your data. However, these rights can only have effect if people can easily exercise them. That is why Bits of Freedom developed My Data Done Right.

Generate and keep track of your requests

With My Data Done Right, you can easily create an access, correction or removal request, or a request to transfer your data. You no longer have to search for contact details in privacy statements: this information on more than 1000 organisations has already been collected on the website. Nor do you have to draft the request yourself; it is automatically generated based on your input. You only have to send it.
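Generating such a request essentially means filling a letter template with the user’s details and the organisation’s contact address from a pre-collected directory. The sketch below illustrates the idea only – the template wording, organisation names and directory structure are invented for this example and are not My Data Done Right’s actual code:

```python
from string import Template

# Hypothetical subject-access-request template (GDPR Article 15 is the
# legal basis for access requests; the wording here is illustrative).
REQUEST_TEMPLATE = Template("""\
To: $org_email

Dear $org_name,

Under Article 15 of the GDPR, I request access to all personal data
you hold about me, registered under the email address $user_email.
Please respond within one month, as the Regulation requires.

Kind regards,
$user_name
""")

# Stand-in for the site's database of organisations' contact details.
ORG_DIRECTORY = {"ExampleCorp": "privacy@example.com"}

def generate_access_request(org_name, user_name, user_email):
    """Fill the template with user input and the looked-up contact address."""
    return REQUEST_TEMPLATE.substitute(
        org_name=org_name,
        org_email=ORG_DIRECTORY[org_name],
        user_name=user_name,
        user_email=user_email,
    )

letter = generate_access_request("ExampleCorp", "Jane Doe", "jane@example.net")
```

The value of the service lies less in the template itself than in maintaining the directory of over 1000 organisations, so users never have to hunt through privacy statements for the right address.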

My Data Done Right also contains a few other useful options. You can receive a reminder about your request by email or in your calendar, so that you don’t forget the request you’ve sent. At the moment, you can generate requests in English and Dutch. Soon there will also be an option to share your experiences with us through a short questionnaire.

Cooperation across Europe

The launch is a starting point for the further development of My Data Done Right. We plan to continue expanding the database with organisations, but also to make My Data Done Right available for all people in the European Union.

Together with other digital rights organisations and volunteers, Bits of Freedom will work on versions of My Data Done Right for other EU countries and grow our database to include many more organisations to which you can address your requests. Do you want to help? Please contact Bits of Freedom!

My Data Done Right

GDPR Today: Stats, news and tools to make data protection a reality (25.10.2018)

Press Release: GDPR: A new philosophy of respect (24.05.2018)

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)

(Contribution by David Korteweg, EDRi member Bits of Freedom, the Netherlands)