22 Mar 2017

Hakuna Metadata – Exploring the browsing history

By Guest author

Metadata is data about data. In an e-mail, the data is the content of the message, and the metadata is the information about it: who it is from, who it was sent to, the date and time, the subject, network information, and so on. When we browse the internet, the data is the content of the websites we visit, while the metadata includes the website addresses (so-called “URLs”), the time and number of visits, network information, and so on.
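The split is easy to see in code. In this minimal Python sketch, using the standard library's email module (the message itself is invented for illustration), the headers are the metadata and the body is the data:

```python
from email.message import EmailMessage

# Build an invented example message
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"
msg["Subject"] = "Lunch on Friday?"
msg["Date"] = "Wed, 22 Mar 2017 10:15:00 +0100"
msg.set_content("Shall we meet at noon?")

# The body is the *data* (can be encrypted end to end) ...
body = msg.get_content()

# ... while the headers are the *metadata*: who, to whom, when, what about.
# Mail servers need them in the clear to route the message at all.
metadata = {name: value for name, value in msg.items()}
print(metadata)
```

Even if the body were encrypted, every hop on the way would still see the header dictionary printed above.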

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

Data is often considered to be sensitive, and it is possible to protect it using encryption technologies. However, metadata is generally not treated as sensitive, and it is also very difficult to encrypt. For example, if we encrypted the recipient information of an e-mail, our e-mail client would not know where to deliver it.

Metadata was not invented to help privacy invaders. It was intended to speed up the classification and indexing of bulk data of any kind, without looking at the data itself. By design, metadata even supports data protection, by letting someone process the data without looking at the content inside – much like an envelope in traditional postal services. However, metadata is also the fastest way to profile internet users: by analysing the number and nature of a person's communications with other people and with particular websites, their locations, and keywords. Although profiling based on metadata can serve a number of purposes, its exploitation for advertising and surveillance is its most common and controversial use.

Browsers store the browsing history to provide a more user-friendly browsing experience. By default, browsers keep a history of all previously visited websites, cached copies of those websites, form-filling history, cookie information, and bookmarks. Depending on the operating system and the browser, this information is stored in a specific location on your computer's hard disk, in a lightweight database. Browser history has its advantages in terms of usability, such as automatic completion of previously visited URLs and locally cached copies of previously visited websites to speed up browsing.
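Firefox, for example, keeps this lightweight database as an SQLite file called places.sqlite inside the profile folder. The sketch below queries a simplified in-memory stand-in for it (the reduced schema and the sample rows are assumptions for illustration; the real moz_places table has many more columns):

```python
import sqlite3

# A simplified stand-in for Firefox's places.sqlite history table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE moz_places (url TEXT, visit_count INTEGER)")
conn.executemany(
    "INSERT INTO moz_places VALUES (?, ?)",
    [("https://edri.org/", 42),
     ("https://example.com/news", 17),
     ("https://example.org/mail", 5)],
)

# The kind of query a browser runs to auto-complete previously visited URLs
top = conn.execute(
    "SELECT url, visit_count FROM moz_places ORDER BY visit_count DESC LIMIT 2"
).fetchall()
print(top)
```

Anyone (or any program) with read access to that file on disk can run the same query, which is exactly why the history is such an attractive target.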

Who can access our browsing metadata? Our browsing history is accessible to our browsers, which is why it is highly recommended to use trustworthy open-source browsers, such as Mozilla Firefox, that protect and respect your privacy. If you use browsers made by companies that are themselves data brokers and advertisers, you end up giving away your browsing history and being tracked. Even when we can trust our browser, there are other actors with access to our browsing history. Full access can be gained through a Wi-Fi hotspot, especially a public one, or through malware on the computer. Almost full access is available to Internet Service Providers (ISPs), even when the traffic is encrypted. Partial access is available to Domain Name System (DNS) providers, to companies that track, advertise and profile through cookies, browser fingerprinting and similar techniques, and to the websites you visit.

In spite of the clear privacy implications, there is no clarity under the law about whether browsing history is to be protected as content or non-content metadata.

Hakuna Metadata, a project by EDRi's Ford-Mozilla Open Web Fellow Sid Rao, shows how metadata can reveal a surprising amount about our daily interactions online. Just from browsing metadata, it is possible to learn a person's working hours, sleep time, work-related travel and holiday schedules, interests and other keyword-related information, who their friends are, and much more. You can read more about the project and the results of the analysis here, and download the open-source browsing history visualisation tool here.

Hakuna Metadata – Exploring the browsing history (28.03.2017)

Hakuna Metadata – Browsing history visualization for Linux + Firefox combo

Metadata Investigation: Inside Hacking Team (29.10.2015)

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)

06 Mar 2017

Are net neutrality and privacy Europe’s brilliant way of trumping destructionism?

By Joe McNamee

For the online economy to work, trust and competition are needed. Trust to drive take-up of services and competition to drive down prices and drive up innovation.


The 2016 Eurobarometer (pdf) survey found that nearly 60% of individuals in the EU had avoided certain websites for privacy reasons, while 82% were in favour of restrictions on cookies. This shows how important clear privacy rules are for individuals and for trust in the online economy. The European Union has addressed this problem head-on, by proposing and adopting the General Data Protection Regulation (GDPR) and, more recently, proposing the e-Privacy Regulation.

Clear rules, with effective enforcement, generate trust and provide a harmonised market for companies serving individuals in Europe.

The US national telecoms regulator, the Federal Communications Commission (FCC), also saw the danger posed by the “wild west” of personal data exploitation online. The danger was illustrated when the National Telecommunications and Information Administration carried out a survey in 2016. It found that – in the previous 12 months – 19% of internet-using households had suffered an online security breach, while 45% had refrained from an online activity due to privacy and security fears. Faced with this compelling evidence of the damage caused by a lack of trust and security, the FCC acted in October 2016, passing ground-breaking privacy rules (by 3 votes to 2) to protect broadband users and improve trust. However, it was not possible to enshrine the rules in law, meaning that they are contingent on the whims of the Commissioners. The appointment of a new FCC Chairman by the incoming president makes it almost certain that US citizens – and the US online economy – will be robbed of this essential protection… unless they use European services, of course.

Far from GDPR and e-Privacy being European protectionism, the US laissez-faire approach appears to be self-inflicted US destructionism.

Net neutrality

In 2013, the EU was faced with increasing evidence of internet access companies seeking to undermine innovation and competition online. It faced calls to legislate to protect discriminatory “specialised services”, which would have allowed big telecoms operators to sell big online companies a “fast lane” to their customer base. Not only did the European Union not give in to this huge lobbying effort, it legislated in favour of rules that prevent big telecoms operators from becoming gatekeepers that stop the full internet from being accessible to their customers. It legislated for openness and innovation with a binding EU-wide regulation.

The Federal Communications Commission saw the same danger as the European Union. However, it was not possible to enshrine net neutrality in law. All the FCC could do was adapt its implementation of its own rules and powers to defend the online environment from big telecoms operators, in a market that was already less competitive than Europe's. As a result, those rules are contingent on the whims of the Commissioners. The appointment of a new FCC Chairman by the incoming president makes it almost certain that US citizens and online businesses will be robbed of this essential protection.

Europe has legislated for open, innovative, better value online services. If the US abandons net neutrality and privacy, it will be opting for self-inflicted destructionism.

Only the EU could have adopted positive, exemplary legislation on this scale to protect individuals and businesses. And it did.


02 Mar 2017

Privacy Camp 2017 in video


On 24 January, the fifth annual Privacy Camp, co-organised by EDRi, Privacy Salon, Université Saint-Louis (USL-B) and the interdisciplinary Research Group on Law Science Technology & Society of the Vrije Universiteit Brussel (VUB-LSTS) took place in Brussels.

Did you miss our #PrivacyCamp17: Controlling data, controlling machines? Now you can watch all the sessions or relive some of the precious moments of insightful debates.

Community building workshop: Societal impacts of big data and the role of civil society
Rocco Bellanova, University of Amsterdam and USL-B
Hans Lammerant, VUB and BYTE
Diego Naranjo, EDRi
Estelle Massé, AccessNow
Christian D’Cunha, EDPS

Link: https://youtu.be/QpXaW5Rcbgc

Owning the web together: Peer production and sharing
Seda Gürses, KULeuven
Ela Kagel, Supermarkt
Shermin Voshmgir, BlockchainHub
Tim Jordan, University of Sussex

Link: https://www.youtube.com/watch?v=Z9Z9ewyhI0A&t=19s

Instant big data targeting: Programmatic ad tech & beyond
Anna Fielder, Privacy International
Jeff Chester, Center for Digital Democracy
Wolfie Christl, Cracked Labs
Frederik Borgesius, University of Amsterdam

Link: https://www.youtube.com/watch?v=ge0Q1hlhUpI

The Internet of Things, security, and privacy 
Sid Rao, Mozilla Advocacy Open Web Fellow at EDRi
Finn Myrstad, Norwegian Consumer Council
Katitza Rodriguez, Electronic Frontier Foundation
Andreas Krisch, EDRi and Forum Datenschutz
Fieke Jansen, Tactical Tech

Link: https://www.youtube.com/watch?v=f4VKJJUz2Yw

Surveillance tech export and human rights law
Lucie Krahulcova, AccessNow
Joshua Franco, Amnesty International and CAUSE
Renata Avila, World Wide Web Foundation and Courage Foundation
Walter van Holst, Vrijschrift
Ellen Desmet, UGent and HRI Network

Link: https://www.youtube.com/watch?v=hdDSoNYkOV4

Lightning talks:

Alexander Czadilek and Christof Tschohl, epicenter.works, Austria: Presentation of HEAT – Handbook for the Evaluation of Anti-Terrorism legislation

Link: https://www.youtube.com/watch?v=Xh_hG1iLBiQ&t=9s

Eva Lievens, Ghent University: Youth in the data deluge: How can the General Data Protection Regulation protect their privacy while fostering their autonomy

Link: https://www.youtube.com/watch?v=vJWbZFNKUZ0

Katarzyna Szymielewicz, Panoptykon, Poland: How to ensure a strong General Data Protection Regulation implementation

Link: https://www.youtube.com/watch?v=RnXVaK3cCvM

Kirsten Fiedler, EDRi: Presentation of Digital Defenders: privacy for kids comic booklet

Link: https://www.youtube.com/watch?v=DK9_mT51JJ4&t=3s

Arne Hintz, Cardiff University: Presentation of Data Justice Lab

Link: https://www.youtube.com/watch?v=BP0Rs-2m6vo

Theresia Reinhold: Presentation of documentary Information. What are they looking at?

Link: https://www.youtube.com/watch?v=7j3tBG60GPI&t=48s

Ali Lange, Center for Democracy & Technology, USA: The right to explainability

Link: https://www.youtube.com/watch?v=8r-ftqFuoJc&t=1s


22 Feb 2017

The UK Digital Economy Bill: Threat to free speech and privacy

By Guest author

The Digital Economy Bill is being debated by the House of Lords in the United Kingdom. This is a far-reaching bill that covers a range of digital issues, including better broadband coverage across the UK. However, from the digital rights point of view, there are three main areas of concern.

Age verification:
The bill includes proposals to force porn sites to verify the age of their users, with no requirements to protect users' privacy. During the debate on 6 February 2017, the UK government said no privacy safeguards were necessary. To force foreign websites to comply with the proposals, the government has proposed that a regulator could instruct Internet Service Providers (ISPs) to block websites that fail to provide age verification. This could mean that thousands of websites containing legal content would be censored. These proposals have implications for privacy and free speech rights in the UK, and EDRi member Open Rights Group (ORG) is campaigning to amend the bill.

Data sharing:
There are worrying proposals to make it easier to share data not only across government departments, but also with private companies. ORG has been involved in government discussions about these measures, but the concerns raised have not been addressed in the bill. The main concerns are that the bill lacks sufficient privacy safeguards, that ministers have too much power without scrutiny, that data on births, deaths, and marriages can be shared without any restrictions other than those found in other pieces of legislation, and that the codes of practice are not legally binding.

Copyright:
There are proposals to increase the maximum prison sentence for online copyright infringement to ten years, to bring it in line with offline infringement. ORG is concerned that the definition of infringement is too broad and will catch large numbers of internet users. ORG is trying to amend the bill to ensure that such severe sentences are reserved for those guilty of serious commercial infringement.

ORG has made a submission explaining the huge threat to free speech and why these proposals should be dropped. It also launched a spoof recruitment campaign for “Internet Censors” to help classify the web for age verification. Over 23 000 people have signed a petition calling for the proposals to be rejected.


ORG’s submission

Spoof recruitment campaign

Petition about the proposals

(Contribution by Pam Cowburn, EDRi member Open Rights Group, the United Kingdom)



22 Feb 2017

What does your browsing history say about you?

By Guest author

An average internet user visits dozens of websites and hundreds of web pages every day, most of which are kept in the history of our internet browsers. But what if someone took this massive database of visited web pages and cross-referenced it? A joint project by Tactical Tech and SHARE Lab researchers focused on discovering the intentions, desires, needs, and preferences of a person based on their browsing history.


A Swiss journalist, called Mr J for the purposes of the research, visited the Tactical Tech office in Berlin in June 2015 and provided a sample of his web history, on which this research was based. By analysing large sets of web addresses (Uniform Resource Locators, URLs), especially from popular services such as Google Maps, Google Search or YouTube, the researchers were able to build a picture of Mr J's everyday routine, including his interests and intentions, and even the apartments he rented via Airbnb while travelling abroad. Also, since Facebook has a “real-name policy”, it is quite easy to link a person's web history to their profile, and to create a social graph of their Facebook friends and connections based on the Facebook URLs they visited.
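It takes surprisingly little code to start this kind of analysis. The sketch below (the history records are invented for illustration, not Mr J's actual data) groups visits by service and by hour of day to expose a daily rhythm from raw URLs alone:

```python
from urllib.parse import urlparse
from collections import Counter

# Invented sample history records: (time of visit "HH:MM", URL)
history = [
    ("08:12", "https://maps.google.com/?q=Hauptbahnhof"),
    ("08:45", "https://www.facebook.com/some.friend"),
    ("09:10", "https://www.facebook.com/events/987"),
    ("13:02", "https://www.youtube.com/watch?v=abc"),
    ("13:30", "https://www.airbnb.com/rooms/12345"),
    ("23:55", "https://news.example.org/article"),
]

# Which services dominate, and at which hours is this person active?
domains = Counter(urlparse(url).netloc for _, url in history)
active_hours = sorted({int(t.split(":")[0]) for t, _ in history})

print(domains.most_common(1))  # the most-used service
print(active_hours)            # hours of the day with activity
```

With weeks of real history instead of six invented rows, the same two counters already sketch out commutes, working hours and sleep patterns.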

As the websites Mr J visited contain a lot of trackers – small pieces of code used for collecting behavioural information about users – the experiment also showed which companies extract the most data about Mr J. Google, Facebook and Twitter were, unsurprisingly, among the companies with the largest number of trackers. It was also interesting to “read” a sample of the web pages Mr J visited the way a machine would. This is possible with Google's Cloud Natural Language tool, which sits on top of its deep learning platform and can be used to extract information about people, places, events and much more from text documents, news articles or blog posts. It recognised important events, names and places based on keywords it picked up from the web pages.
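The tracker-reach part of the experiment can be sketched similarly; the visited pages and the third-party domains they embed are invented here for illustration:

```python
from collections import Counter

# Invented example: each visited page with the third-party domains it embeds
pages = {
    "news-site.example": {"google-analytics.com", "facebook.net", "twitter.com"},
    "shop.example":      {"google-analytics.com", "doubleclick.net"},
    "blog.example":      {"google-analytics.com", "facebook.net"},
}

# How many of the visited pages does each tracker see?
reach = Counter()
for trackers in pages.values():
    reach.update(trackers)

print(reach.most_common(2))
```

A tracker embedded on all three pages sees the whole browsing session, which is the measurement behind the “largest number of trackers” finding above.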

All these findings lead to the conclusion that if someone – private companies, the state, or law enforcement – were to apply these techniques to the web histories of a large segment of the population, it would be a frightening step towards a “thought police” that arrests individuals suspected of committing crimes in the future.

SHARE Lab: Browsing Histories – Metadata Explorations

(Contribution by Bojan Perkov, EDRi observer SHARE Foundation, Serbia)



15 Feb 2017

Citizens’ rights undermined by flawed CETA deal


On 15 February 2017, the European Parliament voted in favour of the Comprehensive Economic and Trade Agreement (CETA). This concludes the process at the EU level. The EU Member States will now have to ratify the agreement, without having the right to make changes to the text. CETA creates significant risks for citizens' fundamental rights, especially with regard to privacy and data protection.

CETA raises serious questions about the protection of our online rights and freedoms. These concerns have sadly been ignored. We now turn to the EU Member States to stand up for the interests of their citizens by rejecting CETA.

said Maryant Fernández Pérez, Senior Policy Advisor at European Digital Rights (EDRi).

The Parliament approved the agreement despite calls from EDRi and other civil society organisations to improve the text. We raised concerns about the lack of transparency in the negotiation process, the weakened protection of the personal data and privacy of European citizens, the possibility for corporations to challenge government decisions under the so-called Investment Court System, and the inclusion of intellectual property rights (IPR) obligations without a focus on promoting access to knowledge.

Despite not yet being ratified by the EU Member States, CETA is expected to be provisionally applied as of spring 2017, with some exceptions, meaning that parts of it will start having a practical impact, for example on data protection. If Member States do not stand up for citizens' rights by rejecting the agreement, CETA could become a blueprint for other trade agreements and deepen the growing public mistrust in trade policy. It is time to design trade agreements better, in order to maintain a high level of protection for EU citizens. This is possible only with greater transparency and the inclusion of public interest organisations.

It is crucial for national and local NGOs to make their arguments heard in the ratification process of CETA in each of the EU Member States.

Read more:

Civil Society Letter asking MEPs to vote against CETA (13.02.2017)

Despite large opposition, CETA limps forward in the European Parliament (24.01.2017)

European and Canadian civil society groups call for rejection of CETA (28.11.2016)

CETA signature ignores Agreement’s flaws (30.10.2016)

CETA puts the protection of our privacy and personal data at risk (05.10.2016)

CETA’s cross-border data flows will be provisionally applied (07.10.2016)

CETA will undermine EU Charter of Fundamental Rights (04.05.2016)


15 Feb 2017

The time has come to complain about the Terrorism Directive

By Maryant Fernández Pérez

Nearly a year has passed since we told you that you would now be complaining about the Terrorism Directive. On 16 February, Members of the European Parliament (MEPs) will vote on the draft Terrorism Directive. EU policy-makers have meaningfully addressed only very few of the concerns that EDRi and other NGOs have raised since the beginning of the EU legislative process.

We worked hard during the elaboration of the Terrorism Directive at the EU level: we defended digital rights from the very beginning, providing policy-makers with expert input; we joined forces with other digital rights organisations; and we raised our voice against key proposals together with NGOs like Amnesty International, Human Rights Watch (HRW), the International Commission of Jurists (ICJ), the Open Society Foundations (OSF), the European Network Against Racism (ENAR) and the Fundamental Rights European Experts (FREE) Group (see our joint statements here and here). As a result of this hard work and numerous exchanges with policy-makers, not everything in the Directive is bad for digital rights.

What’s good?

Unfortunately, not as much as we would like. However, there are still some positives. Several provisions that we advocated for are part of the final text, for example an assurance, in principle, that radical, polemic or controversial views can be expressed. We managed to eliminate mandatory internet “blocking”, and some safeguards were introduced with regard to removing and blocking online content, and to limiting when the absurdly vague concept of “unduly compelling a government” can constitute a terrorist offence. We also killed some bad proposals that, for instance, tried to undermine encryption and the use of Tor.

What’s wrong?

From a digital rights perspective, there is a long list of bad elements that the European Commission, EU Member States* and the majority of the MEPs of the European Parliament’s Committee on Civil Liberties (LIBE) have introduced and/or kept in the draft Terrorism Directive.

To sum up, it took a year and two months to conclude a legislative instrument that endangers the protection of our rights and freedoms. This compares badly with the time that it took the EU to conclude an instrument to protect fundamental rights, such as the General Data Protection Regulation (five years, and two more years until it enters into force). Obvious, depressing, conclusions can be drawn about the priorities that drove different parts of the EU decision-making process in both cases.

Therefore, we urge the European Parliament to vote against this Directive or at least vote in favour of some of the amendments proposed to improve some of the elements listed above.

What can you do?

You can raise awareness and contact your MEPs before the debate on 15 February (starting around 3pm CET) and the vote on the Directive on 16 February (around 12pm CET). After the vote, it will be the turn of your Member State to implement the Directive and give meaning to its ambiguous provisions. If the Terrorism Directive is adopted, civil society should look closely at how national parliaments implement it, so that it does not lead to abusive provisions. Ultimately, yet again, we will have to rely on the courts to be the guardians of our civil liberties.

If you have any questions, don’t hesitate to contact us!


* The United Kingdom, Denmark and Ireland decided not to be bound by this Directive.

08 Feb 2017

Proposed surveillance package in Austria sparks resistance

By Guest author

The Austrian coalition parties renegotiated their government programme in January 2017. The new programme contains a so-called “security package” that encompasses several new surveillance measures and additional powers for the Austrian security agencies. These changes in the law are to be implemented by June 2017.

However, so far no evaluation of the already existing surveillance measures and investigatory powers has been carried out. Furthermore, it is doubtful that the new measures will bring about an increase in security, whereas they will severely limit the fundamental right to privacy and dial back existing data protection measures.

The following measures are outlined in the newly agreed government programme:

Networked CCTV monitoring: The Austrian Minister of the Interior, Wolfgang Sobotka, has repeatedly demanded “all-encompassing surveillance” of public spaces by linking already deployed CCTV cameras operated by both private and public entities, and even transmitting the footage to investigative authorities in real time. The implementation of this kind of surveillance apparatus would effectively create a true panopticon affecting every citizen. However, in light of the terrorist attack in Nice in mid-July 2016, on a promenade monitored by several surveillance cameras, any preventive effect of the surveillance of public spaces is highly doubtful, even with respect to conventional crimes: the Police Directorate of Vienna has removed 15 of its 17 CCTV installations in recent years due to high operating costs and no discernible benefit in combating crime.

Automatic license plate recognition: The government wants to implement a system which would recognise all licence plate numbers and retain details of the movements of all vehicles on Austrian highways. In 2007, the Austrian constitutional court decided in a similar case (Section Control) that surveillance of car drivers is only permitted for a few determined routes and that number plate information can only be retained if the vehicle was driving too fast or is on an official wanted list. The new government programme facilitates an unjustified storage of movements for all vehicles, which is very alarming.


Government spyware: In 2016 there was a legislative proposal to legalise the use of government spyware on the electronic devices of Austrian citizens. Due to massive criticism from a legal and technical perspective, the Austrian Minister of Justice Wolfgang Brandstetter withdrew the proposed law. In 2008, a commission of constitutional experts under Professor Bernd-Christian Funk came to the conclusion that government spyware is not in line with Austrian constitutional law. Nonetheless, the Austrian government has started a third attempt to create a legal basis for this unconstitutional measure.

Data Retention Directive 2.0: The Austrian data retention law was annulled by the Austrian constitutional court in 2014 due to its unconstitutionality and violation of fundamental rights. The Court of Justice of the European Union (CJEU) confirmed this decision in December 2016 with an even further-reaching ruling against this type of unfounded mass surveillance. Nevertheless, the new government agreement contains plans for a “quick freeze” based retention of telecommunications data. The final legislative text will have to be scrutinised carefully to determine whether it is in line with recent CJEU rulings.

Registration of prepaid SIM cards: The Austrian government plans to forbid unregistered prepaid SIM cards and thus to eliminate a way of communicating freely and anonymously with family members, help lines, and persons of trust (such as lawyers). Criminals can easily circumvent this by using foreign SIM cards or online messaging services, making the measure ineffective and disproportionate.

“Subversive movements”: It must be possible to criticise the state and its institutions. The government wants to establish a criminal offence for the expression of opinions that undermine the authority of the state. This is a critical development that conflicts with the fundamental principle of freedom of expression.

Electronic tags for non-convicted “endangerers”: Another critical demand in the security package is the introduction of electronic tags – a surveillance device locked to an individual’s body – for “endangerers”. But the term “endangerer” (“Gefährder”) is not legally defined; the federal government describes such a person as a potential disturber and refers to an “abstract endangering situation” (“abstrakte Gefährdungslage”). So far, electronic tags have been used only for convicted perpetrators or in cases of strong suspicion. This extended use of electronic tags is highly problematic, as it violates the presumption of innocence. Similar discussions are ongoing in Germany.

Resistance has been mounting against the proposed extension of surveillance measures in Austria. EDRi observer epicenter.works and other fundamental rights NGOs in Austria are working to mobilise the population to stop the unprecedented and unfounded surveillance measures in the new government programme from being enacted.

Surveillance package – government plans complete surveillance (only in German)

Surveillance: Cameras are being removed again (only in German, 28.01.2017)

Opportunity and risk of state spyware (only in German, 26.04.2016)

German “Bundestrojaner” – The dismantling of state spy software (only in German)


(Contribution by EDRi observer epicenter.works, Austria)



25 Jan 2017

#PrivacyCamp17: Controlling data, controlling machines

By Heini Järvinen

Accountability, transparency and profiling were the buzzwords of the fifth annual Privacy Camp, which took place on 24 January in Brussels. The camp, this year entitled “Controlling data, controlling machines: dangers and solutions”, brought together civil society, policy-makers and academia to discuss the problems for human rights in the digital environment. The event is organised every year before the Computers, Privacy & Data Protection (CPDP) conference, and it’s co-organised by EDRi, Privacy Salon, Université Saint-Louis (USL-B) and the interdisciplinary Research Group on Law Science Technology & Society of the Vrije Universiteit Brussel (VUB-LSTS).

Who controls your data? Who controls the machines? Who is to be held responsible for the security of our data, and how can civil society make sure the message gets through? These questions were at the very centre of the debates surrounding the pending adoption of important EU-wide legislation, such as the review of the e-Privacy Directive, the smart borders package, the draft Regulation on dual-use goods, and the latest filtering proposals in the draft copyright Directive.

The first panels of the morning were “Community Building Workshop: Societal Impacts of Big Data and the Role of Civil Society”, which discussed how civil society can engage in policy debates on big data, and “Owning the Web Together: Peer Production and Sharing”, which pondered whether it is possible to create online platforms based on genuine practices of sharing, with different ownership models and fair working conditions, or whether commons-based decentralised digital platforms are a utopian dream.

The next panel, “Instant Big Data Targeting: Programmatic Ad Tech & Beyond”, explained to the participants the structure of programmatic advertising, and discussed how personal data is processed and for what purposes. Simultaneously, the panel “The Internet of Things, Security and Privacy” discussed the possible effects of the Internet of Things (IoT) on the future of surveillance, as well as legislative approaches and security education as possible solutions.

The afternoon panel on “Surveillance Tech Export and Human Rights Law” shed light on the proposed overhaul of the EU’s export controls for so-called “dual-use items” that can be used to violate fundamental human rights such as the right to privacy and the protection of personal data. The panellists discussed how human rights law could be used to hold the companies and states accountable.

The Lightning Talks presented a number of interesting projects and points of view related to online privacy. For example, Alexander Czadilek and Christof Tschohl from EDRi observer epicenter.works introduced their new Handbook for the Evaluation of Anti-Terrorism legislation (HEAT), and Katarzyna Szymielewicz from EDRi member Panoptykon presented ideas on how a strong implementation of the General Data Protection Regulation (GDPR) could be ensured in the EU Member States. EDRi presented Digital Defenders, a privacy booklet for kids that was published in October 2016 and has proved a huge success.


The day of fruitful debates was wrapped up with a bit of fun: an interactive quiz tested the participants’ level of knowledge about surveillance. We came away reassured that our community excels at mapping surveillance, definitely watches too much TV (but only educational content, of course!), can separate fact from fiction, reads the news, and knows the classics of surveillance literature.

Privacy Camp

Privacy Camp: Programme

Video: How long does it take to read the terms and conditions for a fitness tracker, and how far could you run while reading them? (Finn Myrstad in the panel “The Internet of Things, Security and Privacy”)

Video: Teaser: Information. What are they looking at? (Theresia Reinhold during the “Lightning Talks”)



11 Jan 2017

ENDitorial: Happiness – owning nothing and having no privacy?

By Joe McNamee

In November 2016, Danish social-liberal parliamentarian Ida Auken wrote a chilling, dystopian article that was published on the website of the World Economic Forum. It imagined a hypothetical society in the year 2030 where nobody owned anything: not their own personal space, not their own secrets, not their own life. In an addendum to the piece, Ms Auken explained that some had portrayed this as a “utopia or dream of the future”, which, she clarified, was not her stance.


An unseen hand would own everything and everything would be communal. The unseen hand would be benevolent. Those that had absolute power and absolute control in a society where individuals had no privacy and no assets (and, consequently, no ability to challenge power, to control or to hold power accountable) would somehow have willingly given away their power and replaced it with a benevolent dictatorship.

Nobody would have the responsibility to do the hard work of industrial production, but it would somehow still be done. Artificial intelligence, owned, developed and maintained either by no-one or by the all-seeing benevolent dictator, would be able to do your shopping. It would know your preferences better than you, so why would you do it yourself?

In this utopia/dystopia, Ms Auken imagines individuals being disturbed by the lack of privacy and hopes that nobody will use it against them. Of course, as mentioned, she envisages that artificial intelligence would know individuals better than they know themselves. As a result, artificial intelligence will be aware of their concerns and one imagines that those individuals’ filter bubbles would be adapted accordingly, in order to assuage their fears.

The luddites, the ones who would want a society based on the freedom to evolve, to challenge and to question without the “sharing economy” disenfranchising, disappropriating and commodifying them, would live outside the city. City dwellers would worry for the welfare of these self-sufficient societies, living with privacy in an adaptable and changeable society.

Ms Auken explains that she wrote the post as a means of starting a debate, because the issues she raises are already on the horizon (hopefully in a less extreme form). Her rather provocative piece is, therefore, an important spur for some much-needed debate.

Her post raises questions such as…

  • If knowledge is power, and the vast imbalance between the knowledge of the surveillance economy and the knowledge of citizens continues to grow, can that power be held accountable?
  • Can a society evolve in such circumstances?
  • Can democracy exist when monopolies of that surveillance economy are (now, already) being asked to monitor and filter our communications, building on their existing, profitable, filter bubbles?

Welcome to 2030. I own nothing, have no privacy, and life has never been better (11.11.2016)

(Contribution by Joe McNamee, EDRi)