06 Sep 2017

Netherlands: Sharing of travel data violated students’ privacy

By Bits of Freedom

It was all over the news on 22 August 2017: Translink, the company responsible for the Dutch public transport card “OV-chipkaart”, had been passing student travel data to the Education Executive Agency (DUO), which is responsible for student finance in the Netherlands. DUO uses this data to figure out whether students who claim to live on their own – and therefore receive a supplementary grant – actually still live with their parents. A court ruled that this violated students’ privacy. The same day, Dutch EDRi member Bits of Freedom called on students to issue a right of access request to DUO and Translink. The students were encouraged to ask the following questions:

  1. Which data does DUO have on me and if I didn’t supply this data myself, how did DUO obtain it?
  2. Which data does Translink have on me and with whom has this data been shared?

Where and when we travel, whom we call, what we buy: sometimes it seems records are kept of every single thing we do. We are becoming more and more transparent to companies and governments, and easier for them to influence. Based on the data that is gathered about us, conclusions are drawn with tangible, sometimes far-reaching consequences. It is therefore important that we gain insight into who knows what about us – and, of course, into what is being done with that information.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

Imagine: you live in a dorm room when one of your parents becomes seriously ill. You are at your parents’ home for weeks or even months on end. You don’t actually live there, but you do sleep over. Is it really possible for a DUO employee to make that distinction based on your public transport data? We don’t think so. You can interpret data in multiple ways and often it does not tell the whole story. Conclusions that someone else reaches by looking at your data are not always correct. But still, you are the one who has to deal with the consequences.

It is indeed important that fraud is addressed. However, it is also important that the tools used to do so are proportionate to the offence. In this case, the Dutch court ruled that DUO cannot simply request this kind of privacy-sensitive information. Translink itself knows better: in its terms and conditions, it states that it will only hand over data as part of a criminal investigation, and therefore only to the police and judiciary. By deviating from its own commitment, the company undermines trust in its service.

The Dutch constitution states that everyone is entitled to respect for their personal environment. The Dutch Data Protection Act (Wbp) is the most important law regarding the collection and sharing of personal data. This law also gives citizens the right to gain insight into their own data and the right to correct it. By exercising these rights, you can verify whether the processing of your personal data is correct, complete, relevant and lawful. Bits of Freedom’s Privacy Review Machine can help you with this.

DUO and the OV-chipkaart: Ask for clarification about your data! (only in Dutch, 22.08.2017)

Privacy Review Machine (only in Dutch)

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands; Translation: Philip Westbroek)



23 Aug 2017

The privacy movement and dissent: Whistleblowing

By Guest author

This is the second blogpost of a series, originally published by EDRi member Bits of Freedom, that explains how the activists of a Berlin-based privacy movement operate, organise, and express dissent. The series is inspired by a thesis by Loes Derks van de Ven, which describes the privacy movement as she encountered it from 2013 to 2015.*


Whistleblowing as a way of expressing dissent is closely tied to the privacy movement. To fully understand the act of whistleblowing, it is important to realise that the whistleblowing encountered in the privacy movement is not only a form of dissent, but also shows qualities of civil disobedience and protest.

Two elements characterise whistleblowing as an expression of dissent: disagreement and complaint. Whistleblowing has a clear aim to enforce change within an organisation and is often done out of ethical considerations, but never under threat or under oath.

In addition to dissent, whistleblowing can also be seen as civil disobedience. Edward Snowden, for example, said he did what he believed to be right and began a campaign to correct what he saw as wrongdoing. The aims Snowden tried to achieve by disclosing the NSA documents were politically motivated: he wanted to inform the public about government surveillance activities so that policies could be adjusted as the public wished. By turning to the press he addressed the issue openly, forcing the entire discussion into the public arena. What he wanted to achieve with his disclosures and the subsequent public debate was clear, and the way in which he did this was deliberate and conscientious.

Contrary to whistleblowing, protesting is something that is done by a group and hardly ever by one single individual. Mobilisation is the most powerful element of protesting, because it is usually the mobilisation that brings organisations’ wrongdoings to light. Furthermore, whistleblowing and protest also differ in the sense that whistleblowers, in comparison to protesters, are more vulnerable to reprisals, operate solo, have an intra-organisational focus, have few strategic options, and only approach the media as a last resort. The boundary between whistleblowing and protest, however, can become vague as they are both a “morally propelled action”, involve “personal risk-taking”, are “change-focused”, are “vulnerable to name calling”, and involve “strategic planning”.

When looking at the way in which Edward Snowden blew the whistle, the differences between whistleblowing and protest become even smaller. Snowden’s actions stopped being those of an individual the moment he contacted Glenn Greenwald and Laura Poitras, months before he handed them the full set of documents and before the actual publication. It is also worth noting that the media were certainly not Snowden’s last resort, but rather one of his first choices. Furthermore, Snowden did not focus solely on change within the organisation: he aimed at major social and political change, not just of the NSA but of a larger group of intelligence agencies and governments.

For a number of reasons, whistleblowers occupy an exceptional place within the privacy movement. First, much of what the movement is concerned with relates to actions of intelligence services whose exact conduct is not made public. Activists are therefore heavily reliant on the information whistleblowers disclose to know what is really happening in the field of surveillance.

Second, once whistleblowers have decided to blow the whistle and make certain classified information public, their position often changes. By blowing the whistle they exclude themselves from the organisation they previously worked for, both physically and mentally. They often find a new home within the privacy movement. We can, again, turn to Edward Snowden to see how such a development unfolds.

In the first year after his revelations, Snowden kept a relatively low profile. Slowly, he started to accept awards and give public speeches, for example at the 2014 Dutch Big Brother Awards; took his first steps in writing articles, for instance in The New York Times; and became a member of the Board of Directors of the Freedom of the Press Foundation.

Last, because whistleblowing can have such drastic consequences, whistleblowers often receive respect and protection from the privacy movement. There is an enormous awareness among privacy advocates of the sacrifices whistleblowers make. A striking example is Glenn Greenwald’s keynote lecture at the 30th Chaos Communication Congress, six months after the first publications of the Snowden documents.

Greenwald stated that Snowden “has been utterly indispensable and deserves every last accolade and to share in every last award”, which was followed by loud applause from the audience. This respect for whistleblowers also shows in the organisations that support them. When whistleblowers leak classified information, there is much at stake for them and they largely depend on others for help. They are at risk of losing their freedom, either because they are given a prison sentence or because they are forced to live in exile. This is a high price to pay, and activists and organisations within the movement dedicate themselves to helping them.

Whistleblowers have an exceptional position within the privacy movement; both as valuable sources of information and as respected members. And although whistleblowing should not be seen as protest, in practice we see that for the privacy movement the two are intricately linked. In the next article, we will further explore how the privacy movement uses art to express dissent.

The series was originally published by EDRi member Bits of Freedom at https://www.bof.nl/tag/meeting-the-privacy-movement/.

Dissent in the privacy movement: whistleblowing, art and protest (12.07.2017)

(Contribution by Loes Derks van de Ven)

* This research was finalised in 2015 and does not take into account the changes within the movement that have occurred since then.



Jubb, Peter B. “Whistleblowing: A Restrictive Definition and Interpretation.” Journal of Business Ethics 21 (1999): 77-94.
Scheuerman, William E. “Whistleblowing as Civil Disobedience: The Case of Edward Snowden.” Philosophy and Social Criticism 40.7 (2014): 609-628.
De Maria, William. “Whistleblowers and Organizational Protesters: Crossing Imaginary Borders.” Current Sociology 56.6 (2008): 865-883.

26 Jul 2017

Stalking is easy with Facebook, and now even easier with Snapchat

By Guest author

We seem to be getting more and more accustomed to using apps that can easily track our movements. It is convenient to simply share your location with friends, instead of sending messages or calling to arrange where to meet. But are you aware of when and how you are giving companies an insight into your whereabouts, and with that, your life? Even though it is practically impossible to completely protect yourself from location tracking if you use a smartphone, there are ways to avoid the most obvious and intrusive forms of it.

The most popular location-sharing tools are provided by Facebook, Google and now Snapchat. They all provide imperfect but efficient and widely used features for sharing your location, which raise the privacy concerns associated with location tracking.

There are two main options for location sharing: the first is to drop a pin on a map to share your current location, and the second is to let others follow your location in real time as you move around. Apple, Facebook, Google and Snapchat all offer both.

Apple’s location-sharing features are integrated into the Apple Maps, Messages and Find My Friends apps. Google’s location-sharing tool is built into Google Maps, and Facebook’s is embedded in its Messenger app. They all offer options to limit how long your location is shared – it should come as no surprise that broadcasting a live update of your location indefinitely might not be the best thing to do if you are even vaguely concerned about your privacy. Turning off the feature when you no longer need to share your location is a basic precaution.

The latest app to join the location-sharing crowd is Snapchat. It might also be the most controversial one, to the point where even parents and law enforcement officials have raised concerns about strangers tracking children’s locations. Snap Map shares your location by placing your avatar – a cartoon figure called a Bitmoji – on a map like a pin. Others can zoom in on it to get your specific location. Even if only your friends can access your location, it is fairly common on Snapchat to add people you do not actually know as friends. This raises concerns especially because the platform is popular among teenagers, who might not be fully aware of the privacy implications of technology that broadcasts their location.

----------------------------------------------------------------- Support our work – make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Snap Map is technically an opt-in feature, which only takes effect after you update the app and follow the tutorial on how to use it. The app asks who you want to see your location – if you choose the option “only me”, it activates the so-called Ghost Mode, which makes your avatar disappear from the map while you can still see others. This feature has been described as plain creepy.

As with many other apps, even if you opt out of announcing your location to the world, Snapchat can of course still track you. It might be a good idea to turn off location data altogether on your phone and simply take a moment to tell your friends where you are when necessary. That way, the number of people, private companies and government agencies that are given a shortcut to monitor your movements and activities is at least somewhat limited. It is a simple choice between incurring the entirely unnecessary privacy and security risk of being in numerous databases, any of which might suffer a data breach at any time, and choosing not to run that risk.

Parents can make sure that children are not sharing their location with specific tools and with advice. For everyone else, not broadcasting your location publicly is always a wise choice when it comes to privacy.

(Contribution by Zarja Protner, EDRi intern)



12 Jul 2017

Dissent in the privacy movement: whistleblowing, art and protest

By Guest author

This is the first blogpost of a series, originally published by EDRi member Bits of Freedom, that explains how the activists of a Berlin-based privacy movement operate, organise, and express dissent. The series is inspired by a thesis by Loes Derks van de Ven, which describes the privacy movement as she encountered it from 2013 to 2015.*

On 29 December 2013, digital activist, technologist, and researcher Jacob Appelbaum closes the year with a talk titled “To Protect and Infect, Part 2” at the 30th edition of the Chaos Communication Congress in Hamburg, Germany. He elaborates on the kind of surveillance activities the United States National Security Agency (NSA) deploys, and reveals, among other things, the existence of a dragnet surveillance system called TURMOIL. The information he shares originates from the set of classified documents that whistleblower Edward Snowden collected while working as an NSA system administrator. In June 2013, Snowden decided to share these documents with the press, explaining that he does not want to live in a world where we have no privacy and no freedom and that the public has the right to know what their government is doing to them and doing on their behalf. Later, at the 2014 Dutch Big Brother Awards, he adds that he considered the NSA’s surveillance programs such a severe violation of human rights that he felt it was his obligation to make the documents public. Snowden’s statements are related to a larger, ongoing public debate about surveillance: how much knowledge about citizens is just and necessary for governments to possess and what actions are legitimate to obtain that information?


Four activists surfaced in the wake of the Snowden leaks and quickly took on leading roles in the debate: Jacob Appelbaum, Glenn Greenwald, Sarah Harrison, and Laura Poitras. Although these four individuals share beliefs, they do not share a common background. At the time of the first publications Glenn Greenwald worked as a journalist, Laura Poitras as a documentary filmmaker, Jacob Appelbaum as a technologist, and Sarah Harrison as a journalist and legal researcher for WikiLeaks. Although they are certainly not the only individuals relevant to the larger group of activists who work on privacy and surveillance issues, their diversity is a reflection of the diversity of the group concerned with these issues.

The privacy movement is incredibly diverse, decentralised, and therefore complicated to define. In spite of this, expressing dissent is one of the key characteristics of the movement. It is where activists find each other and share their ideas with the rest of the world. So what does dissent look like in the privacy movement? There are three different ways in which the privacy movement seems to express dissent, namely through whistleblowing, through art, and through protest. Each contributes to the understanding of the privacy movement as a whole.

First, whistleblowing is interesting because its role is threefold. Besides being a means for the privacy movement to express dissent, whistleblowers are also a vital source of information to the movement, and they often become activists within the movement themselves. Second, activist art is a way for the privacy movement to communicate its ideas and goals to members of the movement as well as to the wider public. Although only a small group of activists is involved in the process of creating the art, it affects the movement in its entirety. Last, the privacy movement also expresses dissent through protest. This is done both through traditional types of protest, such as street demonstrations, and through protest forms that can only exist online, for example the development, promotion, and use of tools that provide more anonymity for internet users.

Although dissent is an element that characterises the privacy movement, it is certainly not the only one. The untraditional role of leadership within the movement and the physical meeting place in Berlin also contribute to the unique character of the movement.

In the upcoming articles in this series, we will explore whistleblowing, art, and protest as expressions of dissent in more depth.

The series was originally published by EDRi member Bits of Freedom at https://www.bof.nl/tag/meeting-the-privacy-movement/

(Contribution by Loes Derks van de Ven)



* This research was finalised in 2015 and does not take into account the changes within the movement that have occurred since then.

14 Jun 2017

#ALTwitter privacy revelation: European parliamentarian goes bananas

By Guest author

Recently, Mr Dunston (of “Dunston Checks In” fame) came to the EDRi Brussels office looking for help. He complained that somebody from the European Parliament had been messing with the “holy banana collection” that he has been preserving for decades, ever since he inherited it from his forefathers. Other than that, we had no information.

Being defenders of human rights in the digital environment, we decided to help Mr Dunston. Coincidentally, we were working on a project called ALTwitter, in which we had created Twitter-like profiles of the Members of the European Parliament (MEPs) based on their metadata. We thought: for once, let’s use metadata for social good.

Here is what we did:

Step 1: Data collection
We collected approximately 10 000 publicly available tweets from the Twitter accounts of 617 MEPs.

Step 2: Metadata extraction
We extracted the metadata associated with these tweets – such as the source of the tweet, i.e. the device or service from which the tweet originated – for further analysis.

Step 3: Metadata analysis
We counted the number of times each of those devices or services was used by MEPs, and then ranked them by how frequently they had been used.

Step 4: Finding the anomaly or unique artifact
We then selected the least commonly used devices or services, in order to find the sources used by only a few MEPs.

Step 5: Finding the culprit
We were surprised to see “Banana Kong” as one of the rarely used sources of tweets from MEPs. Apparently, it was used by only one MEP on her Apple (iOS) phone. That was none other than Angelika Niebler.
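Steps 2 to 5 above boil down to grouping tweets by their “source” client and keeping the sources used by very few accounts. The following Python snippet is a toy sketch of that idea only – the function name, tweet fields and sample data are hypothetical illustrations, not the actual ALTwitter code:

```python
def rare_tweet_sources(tweets, max_users=1):
    """Group tweets by their 'source' client and return the sources
    used by at most `max_users` distinct accounts (the anomalies)."""
    users_per_source = {}
    for tweet in tweets:
        # Record which accounts tweeted from which client.
        users_per_source.setdefault(tweet["source"], set()).add(tweet["user"])
    return {src: users for src, users in users_per_source.items()
            if len(users) <= max_users}

# Toy data standing in for the ~10 000 collected tweets.
tweets = [
    {"user": "mep_a", "source": "Twitter for iPhone"},
    {"user": "mep_b", "source": "Twitter for iPhone"},
    {"user": "mep_c", "source": "Banana Kong"},
]
print(rare_tweet_sources(tweets))  # {'Banana Kong': {'mep_c'}}
```

A source that appears for only one account, as in this toy run, is exactly the kind of anomaly that step 5 turned up.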

Step 6: Helping Dunston with the proof
Because we had seen this information in Ms Niebler’s metadata, we searched her timeline to see if she had ever mentioned her surprising pastime. Sure enough, evidence of Ms Niebler’s banana enthusiasm came to light.

(I’ve just reached 90 meters in “Banana Kong”. Download it from the App Store and try to beat me!)

Angelika has been stealing Mr Dunston’s bananas since August 2013, and she owes him compensation – big-time. We would never have found this proof if metadata hadn’t pointed us in the right direction. This is the same metadata that advertisers use to target her with more personalised ads, track her online activities, and undermine her privacy online, and possibly offline. And our privacy, too.

When signing up to use the app, Ms Niebler agreed, as a prominent Member of the European Parliament, to share a variety of personal information, including her device identifier, geo-location information and IP address data with the game supplier and fourteen other companies, mainly based in the United States.

We suggested to Mr Dunston that he should take legal action against Ms Niebler for banana theft. But, he says:
“Listen! I am a nice orangutan. I don’t need any monetary compensation, but I want her and every other MEP to understand the importance of privacy. Today it is my banana, tomorrow it could be yours. If not Ms Niebler, someone else will steal it. In fact, the advertisers have been already stripping our online privacy, with or without our knowledge. It’s time to put an end to this! Let’s try to understand why privacy matters and let’s defend it! Let’s help the parliamentarians to do the same! That is the best compensation I would expect.”

We believe that his demands are fair. If you agree, join us on our mission to defend everyone’s digital rights! We want to convince Ms Niebler and other MEPs to vote right on the e-Privacy Regulation, to make sure it guarantees privacy by design and by default for our online communications. We want to make sure that no one can be refused access to information because they oppose being tracked (no “tracking walls”), that groups can act on behalf of citizens when an infringement has occurred, and that tracking can never be the default. Find out more about e-Privacy here!


ALTwitter #hakunametadata: Twitter metadata profiles of the Members of European Parliament

ALTwitter: The treasure trove behind 140 characters (31.05.2017)

Hakuna Metadata – Let’s have some fun with Sid’s browsing history! (03.05.2017)

Hakuna Metadata – Exploring the browsing history (22.03.2017)

New e-Privacy rules need improvements to help build trust (09.03.2017)

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)



14 Jun 2017

Access to e-evidence: Inevitable sacrifice of our right to privacy?

By Guest author

What do you do when human rights “get in the way” of tackling crime and terrorism? You smash those pillars of your democratic values – the same ones you are supposedly protecting. Give up your right to privacy – it is a fair price to pay for the guarantee of your security! This is the mantra that, during the past decades, we have heard populist politicians repeat over and over again – never mind that gambling with our rights actually helps very little in that fight.

One of the bargaining chips in the debate on privacy versus security is access to e-evidence.

E-evidence refers to digital or electronic evidence, such as the contents of social media, emails, messaging services, or data held in the “cloud”. Access to such data is often required in criminal investigations. Since geographical borders are often blurred in the digital environment, investigations require cross-border cooperation between public authorities and the private sector.

Thorough police investigations are indeed of utmost importance. However, access to people’s personal data must be proportionate and necessary for the aim of the investigation, and provided for by law.

Just as the police cannot enter your home without a court warrant, they are not supposed to look into your private communications without permission, right? Not really.

The EU is working towards easing law enforcement authorities’ access to e-evidence. The European Commission plans to propose new rules on sharing evidence, including the possibility for authorities to request e-evidence directly from technology companies. One of the proposed options is that police would be able to access data directly from cloud-based services.


This means that Facebook, Google, Microsoft, providers of messaging services, and other companies that collect and store the data of millions of EU citizens would be obliged to provide this data to the authorities, even when it is stored in the cloud in another EU Member State. The types of data that might fall within the scope of the law range from metadata (such as location, time, sender and recipient of the message, and other non-content data) to the content of our personal communications.

But surely there must be safeguards to protect people’s right to privacy? Not necessarily, especially when “voluntary” cooperation between companies and law enforcement is being pushed. Such arrangements often lack accountability and predictability. This is why any new measures on e-evidence must comply with international human rights and data protection standards. Member States must remain able to regulate access to data in their jurisdiction and on their citizens and residents, in particular by foreign law enforcement and national security agencies. Individuals must also be able to seek protection and redress in their own country.

Access to e-evidence is also being discussed beyond EU borders. The Council of Europe (CoE) is preparing to adopt a new protocol to the so-called Budapest Convention – the CoE Convention on Cybercrime. The Convention covers not only CoE Member States, but all 53 countries that have ratified it, which means that not all of them are bound by data protection or human rights conventions. EDRi is following this process attentively and has submitted input on several occasions.

The European Commission initiative establishes the framework for a new legislative proposal, which is scheduled to be presented at the beginning of 2018. On 8 June 2017, the Commission presented the options for practical and legislative measures to the EU ministers. EDRi is participating in expert discussions on the suggested way forward.

It is crucial that safeguards ensuring data protection and the rule of law are built into the new legislation. Otherwise, it will come at the cost of citizens’ human rights.


RightsCon session on cross-border access to e-evidence – key interventions (10.05.2017) https://edri.org/rightscon-session-on-cross-border-access-to-e-evidence-key-interventions/

EDRi’s position paper on cross-border access to electronic evidence in the Cybercrime Convention (17.01.2017)

EDRi’s letter to the Council of Europe on the T-CY Cloud Evidence Group Report on criminal justice access to evidence in the cloud (10.11.2016)

Professor Douwe Korff’s analysis on the T-CY Cloud Evidence Group Report on criminal justice access to evidence in the cloud (10.11.2016)

European Commission: e-evidence

(Contribution by Zarja Protner, EDRi intern)



19 May 2017

Looking back on our 2016 victories


Technological advancements in the digital world create new opportunities, but also new challenges for human rights. Especially in the past year, the fear of extremism on the one side and extreme measures on the other resulted in a desire for swift political action and made defending citizens’ rights and freedoms online a difficult task. In 2016, our European network faced demands for increased state surveillance, restrictions on freedom of expression by private companies, and decreased protection of personal data and privacy. Our annual report 2016 (pdf) gives you an overview of EDRi’s campaigns across European countries and our key actions at EU level.

Despite our struggles, our members, observers, national and international partners, supported by many individuals who contributed to our work, successfully protected digital rights in a number of areas.

We successfully advocated for a reform of privacy rules in electronic communications (ePrivacy) and played a key role in the civil society efforts that led to the adoption of the EU’s General Data Protection Regulation (GDPR) in April 2016.

We scored a big success on our top priority issue and secured net neutrality in Europe. This victory was the outcome of more than five years of hard work and the input of the over half a million citizens who responded to the net neutrality consultation in 2016.

We released influential analyses containing implementation guidelines for the General Data Protection Regulation: two documents highlighting the numerous, unpredictable flexibilities in the legislation and how they should be implemented.

We published “Digital Defenders”, a comic booklet that helps kids make safer and more informed choices about what to share online and how to share it. It turned out to be a huge success – the original English version of the booklet has been downloaded from our website over 25 000 times, and it has been published in Serbian, Turkish, German, Greek, Spanish and Italian, with other translations in the pipeline.

While we regret the adoption of an ambiguous Directive, we successfully requested the deletion of many harmful parts that were proposed in the course of the legislative discussions and the clarification of some of the ambiguous language.

Our criticism of the new so-called Privacy Shield was echoed by many experts in the European institutions and bodies (the European Parliament, the European Data Protection Supervisor, and the European Ombudsman) and led to mainly negative press coverage for the Commission and continued pressure for a more credible solution.

Read more in our Annual Report 2016!

Our finances can be found on pages 43-44.


17 May 2017

UK Digital Economy Act: Millions of websites could be blocked

By Guest author

The Digital Economy Act has become law in the United Kingdom. This wide-ranging law has several areas of concern for digital rights, and could seriously affect privacy and freedom of expression of internet users.


One of the main concerns is that it will compel legal pornographic websites to verify the age of their users. The British Board of Film Classification (BBFC) has been given the power to fine websites that fail to provide age verification, or to instruct internet service providers (ISPs) to block them, which means that thousands of websites containing legal content could be censored.

On 10 May 2017, EDRi member Open Rights Group (ORG) received a response to their Freedom of Information (FOI) request on the correspondence between BBFC and MindGeek, the company developing age verification technology. The response revealed that for the Digital Economy Bill to be effective in preventing children from accessing pornography, the government would need to block over four million websites.

The law will also extend the maximum prison sentence for online copyright infringement to ten years. ORG has raised concerns that the wording of this offence is too broad and could in theory be used against file sharers. It could also be exploited by “copyright trolls” – law firms that send threatening letters to users suspected of unauthorised downloading of copyrighted works, raising the prospect of legal proceedings even when there is no evidence to support the claim.

The Digital Economy Act also gives the police the power to disable mobile phones that they believe might be used for crimes. ORG has criticised this power, as it allows intervention before any crime has actually been committed.

Finally, the Act includes new powers for sharing data across government departments. Although the definitions of these powers were improved during the parliamentary process, they remain too broad and leave room for practices that seriously threaten citizens’ fundamental right to privacy.

The UK Digital Economy Bill: Threat to free speech and privacy

FOI response reveals porn company’s proposals for UK to block millions of porn sites

Digital Economy Act: UK police could soon disable phones, even if users don’t commit a crime

(Contribution by Pam Cowburn, EDRi member Open Rights Group, the United Kingdom)



17 May 2017

Big Data for Big Impact – but not only a positive one

By Guest author

Technology has changed, and keeps dramatically changing, our everyday life, transforming human societies into advanced networked ones. To celebrate this digital revolution, 17 May is dedicated to the “World Telecommunication and Information Society Day” (WTISD-17).

The theme for this year’s celebration is “Big Data for Big Impact”. Not surprisingly, the buzzword “big data” echoes throughout our daily lives online. The chosen theme focuses on harnessing the power of big data to turn complex and imperfect pieces of data into a meaningful and actionable source of information for the social good.

Big data has the potential to improve society – much like electricity or antibiotics. From health care and education to urban planning and protecting the environment, the applications of big data are remarkable. However, big data also comes with big negative impacts. It can be used – by both advertisers and government agencies – to violate privacy. The power of big data can be exploited to monitor every single detail of people’s activities globally.

With 29 million streaming customers, Netflix is one of the largest providers of commercial media in the world. It has also become a trove of data for advertisers as it collects data on users’ activities – what, when and where they are watching, what device they are using, when they fast-forward, pause or stop. Just imagine a representative of Netflix sitting behind your couch, looking over your shoulder and making notes whenever you turn on the service. This applies to many online services, such as Google, Amazon, Facebook or YouTube.

Mass surveillance initiatives by intelligence agencies such as the US National Security Agency (NSA) and the UK Government Communications Headquarters (GCHQ) take this power to the next level, eroding every last bit of personal space. Without big data, profiling at the scale practised today would not be possible.

It is very tempting to use the benefits of big data for all sorts of purposes: hiring new employees based on their social media activities, granting insurance based on fitness tracker data, or conducting airport security checks and future crime predictions based on mobile phone call logs, to mention just a few. But there are some fundamental problems with applying big data to these services.

The first problem is that, knowingly or unknowingly, we all have biases when making decisions. If the decisions made by millions of employers, police officers or judges over a long period are collected together, all those biases come with them, at a much bigger scale. Big data may just refer to a large mass of unstructured data, but the insights deduced from it rely on machine learning – which accumulates every possible bias, such as gender and race. Algorithmic decision-making could turn out to be more biased than ever before, which would have a terrible effect on society.

The second problem is error rates: a study on automatic face recognition software found that error rates can vary between 3% and 20%. This means that the next time you go to the airport, your face could match one in a database of potential terrorists, and you could be pulled aside for questioning or get into even more trouble. This happens daily at international airports. It is not possible to create 100% accurate models, and whenever assumptions are made to fill in missing data, errors are inevitable.
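The scale of this problem is easy to see with some back-of-the-envelope arithmetic. The passenger figure below is purely illustrative (it is not from the study), but even the best-case 3% error rate would flag thousands of innocent travellers per day at a busy airport:

```python
# Illustrative base-rate arithmetic. The false-positive rates come from the
# study cited above; the daily passenger count is an assumed round number
# for a busy international airport, chosen purely for illustration.
best_case_rate = 0.03       # best case reported by the study
worst_case_rate = 0.20      # worst case reported by the study
passengers_per_day = 100_000  # hypothetical daily traffic

best_case_flags = int(best_case_rate * passengers_per_day)
worst_case_flags = int(worst_case_rate * passengers_per_day)

print(best_case_flags)   # 3000 innocent travellers wrongly flagged per day
print(worst_case_flags)  # 20000 in the worst case
```

Even a model that is “97% accurate” produces false matches at a rate that dwarfs the number of actual suspects passing through.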

Therefore, when dealing with big data, it is crucial to be extremely cautious about the quality and sources of the data, as well as about who can access it, and to what extent. If a data set stemming from diverse sources is handled with special care and anonymised thoroughly to protect privacy rights, big data can be used to solve complex societal problems. But if it is left unregulated or improperly regulated, and not tested for fairness and bias, it can pose a serious threat to our human rights and fundamental freedoms.

EDRi has fought for the EU General Data Protection Regulation (GDPR) to regulate this practice. Now EU Member States are implementing the GDPR, and it is up to them not to abuse the weak points of the Regulation to undermine the protection of European citizens’ data.

Video by EDRi member Privacy International: Big Data

Creating a Big Impact with Big Data

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)


03 May 2017

Encryption – debunking the myths

By Guest author

How do you send a sensitive message while protecting it from prying eyes? Encrypt it. You think your message is not sensitive or that no one is spying on you? Encrypt it anyway.

When you send your message encrypted, no-one else but the intended recipient can read it. Even if someone manages to catch the message when it’s on its way to the recipient, they will not be able to read its contents – they can only see something that looks like a random set of characters.
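As a toy illustration of this principle – not a real-world protocol; modern tools use vetted algorithms such as AES – a one-time pad combines the message with a random key of the same length. Without the key, the ciphertext is indistinguishable from random bytes; with it, the original message comes back exactly:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of the data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"Meet me at the station at noon"
# A fresh random key as long as the message (the essence of a one-time pad).
key = secrets.token_bytes(len(message))

ciphertext = xor_bytes(message, key)   # looks like random noise to an eavesdropper
recovered = xor_bytes(ciphertext, key) # only the key holder can reverse it

assert recovered == message
assert ciphertext != message
```

The security rests entirely on the key: anyone intercepting `ciphertext` alone learns nothing, which is why real systems focus so much effort on generating and exchanging keys safely.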

Encryption is essential for the protection of our digital infrastructure and communications, but it is still burdened by some myths that keep on surfacing in discussions.

1. For spies and geeks only

It is not only spies, criminals and privacy geeks who use encryption. In fact, everyone benefits from it on a daily basis, even if not everyone is aware of it. Encryption not only guarantees the confidentiality of our communications; it also makes our lives easier and enables the digitalisation of society.

Electronic banking? Encryption is what makes our transactions safe and secure. The same goes for the online activities of businesses protecting themselves against fraud. Citizens submit digital tax returns, the intelligence community encrypts state secrets, the army sends orders securely to avoid compromising military operations, and civil servants negotiate trade deals by sending messages that only the addressee can read (or they should!). Journalists rely on it to protect their sources and information when investigating crime, corruption or other highly sensitive and potentially dangerous topics, performing their role as democratic watchdogs. Without encryption ensuring the authenticity, integrity and confidentiality of information, all of this could be compromised.

2. Who cares?

Encryption enables us to collect information and communicate with others without outside interference. It ensures the confidentiality of our communications, for example with our doctors, lawyers or partners. It is an increasingly important building block for freedom of expression and respect for privacy. When you achieve privacy through confidentiality of your communication, you are able to express yourself more freely. People prefer messaging apps like Signal and WhatsApp, which protect the privacy of their communications through end-to-end encryption. In a survey requested by the European Commission, nine out of ten respondents agreed that they should be able to encrypt their messages and calls so that only the intended recipient can read them. No matter whether you are making dinner plans, sharing an intimate message or dealing with state secrets, and whether you are a president, a pop star or an ordinary citizen, the right to control your private communication and protect it from hackers and government surveillance matters.


3. Criminals, terrorists, and the old “privacy versus security”

How do you make sure encryption is not used with bad intentions? It’s simple – you cannot. But this does not mean it makes sense for governments to weaken encryption in order to fight terrorism and cybercrime. It only opens Pandora’s box – when supposedly making sure that terrorists have no place to hide, we are exposing ourselves at the same time.

From a technical point of view, encryption cannot be weakened “just a little” without introducing additional vulnerabilities, even unintentionally. Once a vulnerability exists, anyone can take advantage of it – not only the police investigators or intelligence services of a specific country, and not only when necessary. Sooner or later, a secret vulnerability will be found and exploited by a malicious actor, perhaps the very one it was meant to safeguard us against.

Therefore, weakening or banning encryption in order to monitor people’s communications and activities is a bad idea. Criminals have countless ways to evade government-ordered restrictions on encryption: the knowledge of how to encrypt already exists, and its further development and use cannot be prevented. As a result, only innocent individuals, companies and governments will suffer from weak encryption standards.


EDRi: Position paper on encryption (25.01.2016)

EDRi paper: How the internet works?, page 6: Encryption

Surveillance Self-Defense: What Is Encryption?

Winning the debate on encryption — a 101 guide for politicians (21.04.2017)

(Contribution by Zarja Protner, EDRi intern)