14 Jun 2017

Access to e-evidence: Inevitable sacrifice of our right to privacy?

By Guest author

What do you do when human rights “get in the way” of tackling crime and terrorism? You smash those pillars of your democratic values – the very ones you are supposedly protecting. Give up your right to privacy – it is a fair price to pay for the guarantee of your security! This is the mantra that populist politicians have repeated over and over during the past decades – never mind that gambling with our rights actually helps very little in that fight.

One of the bargaining chips in the debate on privacy versus security is access to e-evidence.

E-evidence refers to digital or electronic evidence, such as the contents of social media, emails, messaging services, or data held in the “cloud”. Access to these data is often required in criminal investigations. Since geographical borders are often blurred in the digital environment, investigations require cross-border cooperation between public authorities and the private sector.

Thorough police investigations are indeed of utmost importance. However, access to people’s personal data must be proportionate and necessary for the aim of the investigation, and provided for by law.

Just as the police cannot enter your home without a court warrant, they are not supposed to look into your private communications without permission, right? Not really.

The EU is working towards easing access to e-evidence for law enforcement authorities. The European Commission plans to propose new rules on sharing evidence, including the possibility for authorities to request e-evidence directly from technology companies. One of the proposed options is that police would be able to access data directly from cloud-based services.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

This means that Facebook, Google, Microsoft, providers of messaging services, and other companies which collect and store data of millions of EU citizens, would be obliged to provide this data to the authorities, even when stored in the cloud in another EU Member State. The types of data that might fall within the scope of the law range from metadata (such as location, time, sender and recipient of the message and other non-content data) to the content of our personal communications.

But surely there must be safeguards to protect people’s right to privacy, right? Not necessarily, especially when “voluntary” cooperation between companies and law enforcement is being pushed. These kinds of arrangements often lack accountability and predictability. This is why any new measures on e-evidence must comply with international human rights and data protection standards. Member States must remain able to regulate access to data in their jurisdiction and on their citizens and residents, in particular by foreign law enforcement and national security agencies. Individuals must also be able to seek protection and redress in their own country.

Access to e-evidence is also being discussed beyond EU borders. The Council of Europe (CoE) is preparing to adopt a new protocol to the so-called Budapest Convention – the Convention on Cybercrime of the Council of Europe. The Convention covers not only CoE Member States, but all 53 countries that have ratified it. This means not all of them are bound by data protection or human rights conventions. EDRi is following this process attentively and has submitted input on several occasions.

The European Commission’s initiative establishes the framework for a new legislative proposal, scheduled to be presented at the beginning of 2018. On 8 June 2017, the Commission presented options for practical and legislative measures to the EU ministers. EDRi is participating in expert discussions on the suggested way forward.

It is crucial that safeguards ensuring data protection and the rule of law are built into the new legislation. Otherwise, it will come at the cost of citizens’ human rights.

----------------------------------------------------------------- Support our work with a one-off-donation! https://edri.org/donate/ -----------------------------------------------------------------

RightsCon session on cross-border access to e-evidence – key interventions (10.05.2017) https://edri.org/rightscon-session-on-cross-border-access-to-e-evidence-key-interventions/

EDRi’s position paper on cross-border access to electronic evidence in the Cybercrime Convention (17.01.2017)
https://edri.org/files/surveillance/cybercrime_accesstoevidence_positionpaper_20170117.pdf

EDRi’s letter to the Council of Europe on the T-CY Cloud Evidence Group Report on criminal justice access to evidence in the cloud (10.11.2016)
https://edri.org/files/surveillance/letter_coe_t-cy_accesstoe-evidence_cloud_20161110.pdf

Professor Douwe Korff’s analysis on the T-CY Cloud Evidence Group Report on criminal justice access to evidence in the cloud (10.11.2016)
https://edri.org/files/surveillance/korff_note_coereport_leaaccesstocloud%20data_final.pdf

European Commission: e-evidence
https://ec.europa.eu/home-affairs/what-we-do/policies/organized-crime-and-human-trafficking/e-evidence_en

(Contribution by Zarja Protner, EDRi intern)

19 May 2017

Looking back on our 2016 victories

By EDRi

Technological advancements in the digital world create new opportunities, but also new challenges for human rights. In the past year especially, the fear of extremism on one side and extreme measures on the other resulted in a desire for swift political action, and made defending citizens’ rights and freedoms online a difficult task. In 2016, our European network faced demands for increased state surveillance, restrictions on freedom of expression by private companies, and decreased protection of personal data and privacy. Our annual report 2016 (pdf) gives you an overview of EDRi’s campaigns across European countries and our key actions at EU level.

Despite our struggles, our members, observers, national and international partners, supported by many individuals who contributed to our work, successfully protected digital rights in a number of areas.

#privacy
We successfully advocated for a reform of privacy rules in electronic communications (ePrivacy) and played a key role in the civil society efforts that led to the adoption of the EU’s General Data Protection Regulation (GDPR) in April 2016.

#netneutrality
We scored a big success in our top priority issue and secured net neutrality in Europe. This victory was the outcome of more than five years of hard work and the input of over half a million citizens who responded to the net neutrality consultation in 2016.

#dataprotection
We released influential analysis that contains implementation guidelines for the General Data Protection Regulation. We published two documents highlighting the numerous, unpredictable flexibilities in the legislation and how they should be implemented.

#saferinternet
We published “Digital Defenders”, a comic booklet to help kids make safer and more informed choices about what to share online and how to share it. It turned out to be a huge success – the original English version of the booklet has been downloaded from our website over 25 000 times and published in Serbian, Turkish, German, Greek, Spanish and Italian, with more translations on the way.

#anti-terrorism
While we regret the adoption of an ambiguous Directive, we successfully requested the deletion of many harmful parts that were proposed in the course of the legislative discussions and the clarification of some of the ambiguous language.

#privacyshield
Our criticism of the new so-called Privacy Shield was echoed by many experts in the European institutions and bodies (the European Parliament, the European Data Protection Supervisor, and the European Ombudsman) and led to mainly negative press coverage for the Commission and continued pressure for a more credible solution.

Read more in our Annual Report 2016!

Our finances can be found on pages 43-44.


17 May 2017

UK Digital Economy Act: Millions of websites could be blocked

By Guest author

The Digital Economy Act has become law in the United Kingdom. This wide-ranging law has several areas of concern for digital rights, and could seriously affect privacy and freedom of expression of internet users.


One of the main concerns is that it will compel legal pornographic websites to verify the age of their users. The British Board of Film Classification (BBFC) has been given the power to fine or instruct ISPs to block websites that fail to provide age verification, which could mean that thousands of websites containing legal content could be censored.

On 10 May 2017, EDRi member Open Rights Group (ORG) received a response to their Freedom of Information (FOI) request on the correspondence between BBFC and MindGeek, the company developing age verification technology. The response revealed that for the Digital Economy Bill to be effective in preventing children from accessing pornography, the government would need to block over four million websites.

The law will also extend the maximum prison sentence for online copyright infringement to ten years. ORG has raised concerns that the wording of this offence is too broad and could in theory be used against file sharers. It could also be exploited by “copyright trolls” – law firms that send letters threatening users suspected of unauthorised downloading of copyrighted works with legal proceedings, even though there may be no evidence to support the accusation.

The Digital Economy Act also gives the police the power to disable mobile phones that they believe might be used for crimes. ORG has criticised this power, as it pre-empts criminal behaviour.

Finally, the Act includes new powers for sharing data across government departments. Even if the definitions of these new powers were improved during the parliamentary process, they are still too broad, and leave room for practices that dramatically threaten citizens’ fundamental rights to privacy.

The UK Digital Economy Bill: Threat to free speech and privacy
https://edri.org/the-uk-digital-economy-bill-threat-to-free-speech-and-privacy/

FOI response reveals porn company’s proposals for UK to block millions of porn sites
https://www.openrightsgroup.org/press/releases/2017/foi-response-reveals-porn-blocking-proposals

Digital Economy Act: UK police could soon disable phones, even if users don’t commit a crime
http://www.independent.co.uk/life-style/gadgets-and-tech/news/digital-economy-act-uk-police-disable-phones-before-crime-users-privacy-snooping-charter-a7717126.html

(Contribution by Pam Cowburn, EDRi member Open Rights Group, the United Kingdom)

17 May 2017

Big Data for Big Impact – but not only a positive one

By Guest author

Technology has changed, and keeps dramatically changing, our everyday life, transforming human communities into advanced networked societies. To celebrate this digital revolution, 17 May is dedicated to the “World Telecommunication and Information Society Day” (WTISD-17).

The theme for this year’s celebration is “Big Data for Big Impact”. Not so surprisingly, the buzzword “big data” echoes through our daily journeys across the online world. The chosen theme focuses on harnessing the power of big data to turn complex and imperfect pieces of data into a meaningful and actionable source of information for the social good.

Big data has a potential to improve society – much like electricity or antibiotics. From health care and education to urban planning and protecting the environment, the applications of big data are remarkable. However, big data comes with big negative impacts. Big data can be used – by both advertisers and government agencies – to violate privacy. The power of big data can be exploited to monitor every single detail of people’s activities globally.

With 29 million streaming customers, Netflix is one of the largest providers of commercial media in the world. It has also become a trove of data for advertisers as it collects data on users’ activities – what, when and where they are watching, what device they are using, when they fast-forward, pause or stop. Just imagine a representative of Netflix sitting behind your couch, looking over your shoulder and making notes whenever you turn on the service. This applies to many online services, such as Google, Amazon, Facebook or YouTube.

Mass surveillance initiatives by intelligence agencies such as the US National Security Agency (NSA) and the UK Government Communications Headquarters (GCHQ) take this power to the next level to knock down every bit of personal space. Without big data, the scale at which such profiling is done today would not be possible.

It is very tempting to use the benefits of big data for all sorts of purposes: hiring new employees based on their social media activities, granting insurance based on fitness tracker data, airport security checks, and future crime predictions based on mobile phone call logs, to mention a few. But there are some fundamental problems with applying big data to services.

The first problem is that, knowingly or unknowingly, we all have biases when making decisions. If decisions made by millions of employers, police officers or judges over a long period are collected together, all those biases are brought in, on a bigger scale. Big data may just be a large chunk of unstructured data, but the insights deduced from it rely on machine learning – which accumulates all possible biases, such as those of gender and race. Algorithmic decision-making could turn out to be more biased than ever before, which would have a terrible effect on society.

The second problem is error rates: a study on automatic face recognition software found that error rates can vary between 3% and 20%. This means that the next time you go to the airport, your face could match one in a database of potential terrorists, and you could be pulled out for questioning or get into even more trouble. This happens in international airport transit on a daily basis. It is not possible to create 100% accurate models, and whenever assumptions are made about a missing data sample, errors are inevitable.
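To see why even the low end of that error range causes trouble at scale, consider a back-of-the-envelope base-rate calculation. The passenger and target numbers below are illustrative assumptions, not figures from the study; only the 3% lower-bound error rate comes from the text above.

```python
# Base-rate sketch: false matches from automatic face recognition at an airport.
# Assumed numbers (illustrative): 100,000 passengers screened per day,
# 50 genuine persons of interest among them, and the study's lower-bound
# error rate of 3% applied as a false-positive rate.

passengers_per_day = 100_000
persons_of_interest = 50
false_positive_rate = 0.03

innocent = passengers_per_day - persons_of_interest
false_alarms = innocent * false_positive_rate

print(f"False alarms per day: {false_alarms:.0f}")
print(f"Genuine targets:      {persons_of_interest}")
# Even at the 3% lower bound, false alarms outnumber genuine matches
# roughly 60 to 1 - most people flagged at the gate are innocent.
```

Because persons of interest are rare relative to ordinary travellers, almost everyone the system flags is a false match – the statistical core of the problem described above.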

Therefore, when dealing with big data, it is crucial to be extremely cautious about the quality and sources of the data, as well as about who can access it, and to what extent. If a data set stemming from diverse sources is handled with special care and anonymised thoroughly to protect privacy rights, big data can be used to solve complex societal problems. But if it is left unregulated or poorly regulated, and not tested for fairness and bias, it can pose a serious threat to our human rights and fundamental freedoms.

EDRi has fought for the EU General Data Protection Regulation (GDPR) to regulate this practice. Now EU Member States are implementing the GDPR, and it is up to them not to abuse the weak points of the Regulation to undermine the protection of the European citizens’ data.

Video by EDRi member Privacy International: Big Data
https://www.privacyinternational.org/node/572

Creating a Big Impact with Big Data
https://researchmatters.in/article/creating-big-impact-big-data

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)

03 May 2017

Encryption – debunking the myths

By Guest author

How to send a sensitive message protecting it from spying eyes? Encrypt it. You think your message is not sensitive or that no one is spying on you? Encrypt it anyway.

When you send your message encrypted, no-one else but the intended recipient can read it. Even if someone manages to catch the message when it’s on its way to the recipient, they will not be able to read its contents – they can only see something that looks like a random set of characters.
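As a toy illustration of that property, here is a minimal one-time-pad sketch in Python (standard library only). The message and scheme are invented for illustration – real tools rely on vetted protocols such as OpenPGP or the Signal protocol, never hand-rolled ciphers.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with a key byte (one-time pad)."""
    assert len(key) >= len(message), "one-time pad key must cover the message"
    return bytes(m ^ k for m, k in zip(message, key))

# Decryption is the same XOR operation applied a second time.
decrypt = encrypt

message = b"Meet me at noon"
key = secrets.token_bytes(len(message))  # random key, used only once

ciphertext = encrypt(message, key)
print(ciphertext)                # looks like random bytes to an eavesdropper
print(decrypt(ciphertext, key))  # the intended recipient recovers the message
```

Anyone who intercepts only the ciphertext sees bytes indistinguishable from random noise; only the holder of the key can invert the operation.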

Encryption is essential for the protection of our digital infrastructure and communications, but it is still burdened by some myths that keep on surfacing in discussions.

1. For spies and geeks only

Not only spies, criminals and privacy geeks use encryption. In fact, everyone benefits from it on a daily basis, even if not everyone is aware of it. Encryption not only guarantees the confidentiality of our communications, it also makes our lives easier and enables the digitalisation of society.

Electronic banking? Encryption is what makes our transactions safe and secure. The same goes for any online activities of businesses protecting themselves against fraud. Citizens submit digital tax returns, the intelligence community encrypts state secrets, the army sends orders securely in order to avoid compromising military operations, and civil servants negotiate trade deals by sending messages that only the addressee can read (or so they should!). Journalists rely on it to protect their sources and information when investigating confidential or potentially dangerous issues of crime, corruption, or other highly sensitive topics, performing their role as democratic watchdogs. Without encryption ensuring the authenticity, integrity, and confidentiality of information, all this could be compromised.

2. Who cares?

Encryption enables us to collect information and communicate with others without outside interference. It ensures the confidentiality of our communications, for example with our doctors, lawyers, or partners. It is an increasingly important building block for freedom of expression and respect for privacy. When you achieve privacy through the confidentiality of your communication, you are able to express yourself more freely. People prefer to use messaging apps like Signal and WhatsApp, which protect the privacy of their communications by employing end-to-end encryption. In a survey requested by the European Commission, nine out of ten respondents agreed they should be able to encrypt their messages and calls so that only the intended recipient can read them. No matter whether you are making dinner plans, sharing an intimate message or dealing with state secrets, whether you are a president, a pop star or just an ordinary citizen, the right to control your private communication and protect it from hackers and government surveillance matters.


3. Criminals, terrorists, and the old “privacy versus security”

How do you make sure encryption is not used with bad intentions? It’s simple – you cannot. But this does not mean it makes sense for governments to weaken encryption in order to fight terrorism and cybercrime. That only opens Pandora’s box – while supposedly making sure that terrorists have no place to hide, we expose ourselves at the same time.

From a technical point of view, encryption cannot be weakened “just a little” without introducing additional vulnerabilities, even unintentionally. When there is a vulnerability, anyone can take advantage of it – not just the police investigators or intelligence services of a specific country, when necessary. Sooner or later, a secret vulnerability will be cracked by a malicious actor, perhaps the very one it was meant to safeguard us from.

Therefore, weakening or banning encryption in order to monitor people’s communications and activities is a bad idea. Criminals have a vast number of ways to evade government-ordered restrictions on encryption. Knowledge of encryption already exists, and its further development and use cannot be prevented. As a result, only innocent individuals, companies, and governments would suffer from weak encryption standards.


EDRi: Position paper on encryption (25.01.2016)
https://www.edri.org/files/20160125-edri-crypto-position-paper.pdf

EDRi paper: How the internet works?, page 6: Encryption
https://edri.org/files/2012EDRiPapers/how_the_internet_works.pdf

Surveillance Self-Defense: What Is Encryption?
https://ssd.eff.org/en/module/what-encryption

Winning the debate on encryption — a 101 guide for politicians (21.04.2017)
https://medium.com/@privacyint/winning-the-debate-on-encryption-a-101-guide-for-politicians-4ff4353d427

(Contribution by Zarja Protner, EDRi intern)

22 Mar 2017

Hakuna Metadata – Exploring the browsing history

By Guest author

Metadata is data about data. In an email, the data is the content of the email, and the metadata is information about the email: who sent it and to whom, the date and time, the subject, network information, and so on. When we browse the internet, the data is the content of the websites we visit, while the metadata includes the website addresses (so-called “URLs”), the time and number of visits, network information, and so on.
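A minimal sketch with Python’s standard “email” module makes the split concrete. The sample message below is invented for illustration.

```python
from email import message_from_string

raw = """\
From: alice@example.org
To: bob@example.org
Subject: Lunch plans
Date: Wed, 14 Jun 2017 10:30:00 +0000

Shall we meet at noon?
"""

msg = message_from_string(raw)

# Metadata: header fields, visible to every server relaying the message,
# even when the body itself is encrypted.
metadata = {h: msg[h] for h in ("From", "To", "Subject", "Date")}

# Data: the content, which end-to-end encryption can protect.
content = msg.get_payload()

print(metadata)
print(content)
```

Note how the headers alone already reveal who talked to whom, when, and about what – before a single word of the body is read.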


Data is often considered sensitive, and it is possible to protect it using encryption technologies. Metadata, however, is generally not treated as sensitive, and is also very difficult to encrypt. For example, if we encrypted the address information on an email, mail servers would not know where to deliver it.

Metadata was not invented to help privacy invaders. It was intended to speed up the classification and indexing of bulk data of any kind, without looking at the data itself. By definition, metadata supports data protection by letting someone process data without even looking at the content inside – rather like an envelope in traditional postal services. However, metadata is also the fastest way to profile internet users – by analysing the number and nature of their communications with particular people and websites, their locations, their keywords. Although profiling based on metadata can serve a number of purposes, the exploitation of its power for advertising and surveillance is its most common and controversial use.

Browsers store browsing history to provide a more user-friendly experience. By default, they store the history of previously visited websites, cached copies of those websites, form-filling history, cookies, and bookmarks. Depending on the operating system and the browser, this information is stored in a lightweight database at a specific location on your computer’s hard disk. Browsing history has usability advantages, such as automatic completion of previously visited URLs and locally cached copies of previously visited websites to boost browsing speed.
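Firefox, for instance, keeps this history in an SQLite database (“places.sqlite” in the profile folder). The sketch below queries a tiny in-memory stand-in for its “moz_places” table; the schema is simplified and varies between browser versions, so treat the column layout as an assumption.

```python
import sqlite3

# In-memory stand-in for Firefox's places.sqlite (simplified moz_places table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE moz_places (url TEXT, title TEXT, visit_count INTEGER)")
conn.executemany(
    "INSERT INTO moz_places VALUES (?, ?, ?)",
    [
        ("https://edri.org/", "EDRi", 12),
        ("https://example.org/news", "News", 3),
    ],
)

# The same kind of query works against a copy of a real places.sqlite file -
# which is exactly why anything with disk access can read your history.
rows = conn.execute(
    "SELECT url, visit_count FROM moz_places ORDER BY visit_count DESC"
).fetchall()
for url, count in rows:
    print(count, url)
```

The point is not the query itself but how little stands between this file and anyone – or any program – with access to the disk.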

Who can access our browsing metadata? Our browsing history is accessible to our browsers, which is why it is highly recommended to use trustworthy open-source browsers such as Mozilla Firefox, which protects and respects your privacy. If you use browsers made by companies that are themselves data brokers and advertisers, you end up giving away your browsing history and being tracked. Even when we can trust our browsers, other actors have access to our browsing history. Full access can be gained through a Wi-Fi hotspot, especially a public one, or through malware on the computer. Internet Service Providers (ISPs) have almost full access to your browsing history, even when the traffic is encrypted. Partial access is available to Domain Name System (DNS) providers, to companies that track, advertise, and profile users through cookies, browser fingerprinting and the like, and to the websites you visit.

In spite of the clear privacy implications, there is no clarity under the law about whether browsing history is to be protected as content or non-content metadata.

Hakuna Metadata, a project to analyse metadata by EDRi’s Ford-Mozilla Open Web Fellow Sid Rao, shows how metadata can reveal a surprising amount about our daily interactions online. Just from browsing metadata, it is possible to learn a person’s working hours, sleep time, work-related travel and holiday schedules, interests and related keywords, who their friends are, and much more. You can read more about the project and the results of the analysis here, and download the open-source browsing history visualisation tool here.

Hakuna Metadata – Exploring the browsing history (28.03.2017)
http://www.privacypies.org/blog/metadata/2017/02/28/hakuna-metadata-1.html

Hakuna Metadata – Browsing history visualization for Linux + Firefox combo
https://github.com/sidtechnical/hakuna-metadata-1

Metadata Investigation: Inside Hacking Team (29.10.2015)
https://labs.rs/en/metadata/

(Contribution by Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)

06 Mar 2017

Are net neutrality and privacy Europe’s brilliant way of trumping destructionism?

By Joe McNamee

For the online economy to work, trust and competition are needed. Trust to drive take-up of services and competition to drive down prices and drive up innovation.

Privacy

The 2016 Eurobarometer (pdf) survey found that nearly 60% of individuals in the EU had avoided certain websites for privacy reasons, while 82% were in favour of restrictions on cookies. This shows how important clear privacy rules are for individuals and for trust in the online economy. The European Union has addressed this problem head-on, by proposing and adopting the General Data Protection Regulation (GDPR) and, more recently, proposing the e-Privacy Regulation.

Clear rules, with effective enforcement, generate trust and provide a harmonised market for companies serving individuals in Europe.

The US national telecoms regulator, the Federal Communications Commission (FCC), also saw the danger from the “wild west” of personal data exploitation online. The danger was illustrated when the National Telecommunications and Information Administration carried out a survey in 2016. This study found that – in the previous 12 months – 19% of internet-using households had suffered an online security breach, while 45% had refrained from an online activity due to privacy and security fears. Faced with this compelling evidence of the damage caused by a lack of trust and security, the FCC tried to act in October 2016. It passed ground-breaking privacy rules (by 3 votes to 2) to protect broadband users and improve trust. However, it was not possible to enshrine the rules in law, meaning that they remain contingent on the whims of the Commissioners. The appointment of a new FCC Chairman by the incoming president makes it almost certain that US citizens – and the US online economy – will be robbed of this essential protection… unless they use European services, of course.

Far from GDPR and e-Privacy being European protectionism, the US laissez-faire approach appears to be self-inflicted US destructionism.

Net neutrality

In 2013, the EU was faced with increasing evidence of internet access companies seeking to undermine innovation and competition online. It faced calls to legislate to protect discriminatory “specialised services”, which would have allowed big telecoms operators to sell “fast lanes” to large online companies seeking privileged access to their customer base. Not only did the European Union not give in to this huge lobbying effort, it legislated in favour of rules that prevent big telecoms operators from becoming gatekeepers who stop the full internet from being accessible to their customers. It legislated for openness and innovation with a binding EU-wide regulation.

The Federal Communications Commission saw the same danger as the European Union. However, it was not possible to enshrine net neutrality in law. All the FCC could do was to adapt its own implementation of its own rules and powers to defend the online environment from big telecoms operators, in a market that was already less competitive than the one in Europe. As a result, those rules are contingent on the whims of the Commissioners. The appointment of a new FCC Chairman by the incoming president makes it almost certain that US citizens and online businesses will be robbed of this essential protection.

Europe has legislated for open, innovative, better value online services. If the US abandons net neutrality and privacy, it will be opting for self-inflicted destructionism.

Only the EU could have adopted positive, exemplary legislation on this scale to protect individuals and businesses. And it did.

02 Mar 2017

Privacy Camp 2017 in video

By EDRi

On 24 January, the fifth annual Privacy Camp, co-organised by EDRi, Privacy Salon, Université Saint-Louis (USL-B) and the interdisciplinary Research Group on Law Science Technology & Society of the Vrije Universiteit Brussel (VUB-LSTS) took place in Brussels.

Did you miss our #PrivacyCamp17: Controlling data, controlling machines? Now you can watch all the sessions or relive some of the precious moments of insightful debates.

Community building workshop: Societal impacts of big data and the role of civil society
Moderator:
Rocco Bellanova, University of Amsterdam and USL-B
Speakers:
Hans Lammerant, VUB and BYTE
Diego Naranjo, EDRi
Estelle Massé, AccessNow
Christian D’Cunha, EDPS


Link: https://youtu.be/QpXaW5Rcbgc

Owning the web together: Peer production and sharing
Moderator:
Seda Gürses, KULeuven
Speakers:
Ela Kagel, Supermarkt
Shermin Voshmgir, BlockchainHub
Tim Jordan, University of Sussex


Link: https://www.youtube.com/watch?v=Z9Z9ewyhI0A&t=19s

Instant big data targeting: Programmatic ad tech & beyond
Moderator:
Anna Fielder, Privacy International
Speakers:
Jeff Chester, Center for Digital Democracy
Wolfie Christl, Cracked Labs
Frederik Borgesius, University of Amsterdam


Link: https://www.youtube.com/watch?v=ge0Q1hlhUpI

The Internet of Things, security, and privacy 
Moderator:
Sid Rao, Mozilla Advocacy Open Web Fellow at EDRi
Speakers:
Finn Myrstad, Norwegian Consumer Council
Katitza Rodriguez, Electronic Frontier Foundation
Andreas Krisch, EDRi and Forum Datenschutz
Fieke Jansen, Tactical Tech


Link: https://www.youtube.com/watch?v=f4VKJJUz2Yw

Surveillance tech export and human rights law
Moderator:
Lucie Krahulcova, AccessNow
Speakers:
Joshua Franco, Amnesty International and CAUSE
Renata Avila, World Wide Web Foundation and Courage Foundation
Walter van Holst, Vrijschrift
Ellen Desmet, UGent and HRI Network


Link: https://www.youtube.com/watch?v=hdDSoNYkOV4

Lightning talks:

Alexander Czadilek and Christof Tschohl, epicenter.works, Austria: Presentation of HEAT – Handbook for the Evaluation of Anti-Terrorism legislation


Link: https://www.youtube.com/watch?v=Xh_hG1iLBiQ&t=9s

Eva Lievens, Ghent University: Youth in the data deluge: How can the General Data Protection Regulation protect their privacy while fostering their autonomy


Link: https://www.youtube.com/watch?v=vJWbZFNKUZ0

Katarzyna Szymielewicz, Panoptykon, Poland: How to ensure a strong General Data Protection Regulation implementation


Link: https://www.youtube.com/watch?v=RnXVaK3cCvM

Kirsten Fiedler, EDRi: Presentation of Digital Defenders: privacy for kids comic booklet


Link: https://www.youtube.com/watch?v=DK9_mT51JJ4&t=3s

Arne Hintz, Cardiff University: Presentation of Data Justice Lab


Link: https://www.youtube.com/watch?v=BP0Rs-2m6vo

Theresia Reinhold: Presentation of documentary Information. What are they looking at?


Link: https://www.youtube.com/watch?v=7j3tBG60GPI&t=48s

Ali Lange, Center for Democracy & Technology, USA: The right to explainability


Link: https://www.youtube.com/watch?v=8r-ftqFuoJc&t=1s

22 Feb 2017

The UK Digital Economy Bill: Threat to free speech and privacy

By Guest author

The Digital Economy Bill is being debated by the House of Lords in the United Kingdom. This is a far-reaching bill that covers a range of digital issues, including better broadband coverage across the UK. However, from the digital rights point of view, there are three main areas of concern.

Age verification:
The bill includes proposals to force porn sites to verify the age of their users with no requirements to protect their privacy. During the debate on 6 February 2017, the UK government said no privacy safeguards were necessary. In order to force foreign websites to comply with the proposals, the government has proposed that a regulator could instruct Internet Service Providers (ISPs) to block websites that fail to provide age verification. This could mean that thousands of websites containing legal content could be censored. These proposals have implications for privacy and free speech rights in the UK and EDRi member Open Rights Group (ORG) is campaigning to amend the bill.

Data sharing:
There are worrying proposals to make it easier to share data not only across government departments, but also with private companies. ORG has been involved in government discussions about these measures, but the concerns raised have not been addressed in the bill. The main concerns are that the bill lacks sufficient privacy safeguards; that ministers have too much power without scrutiny; that data on births, deaths, and marriages can be shared without any restrictions beyond those found in other legislation; and that the codes of practice are not legally binding.

Copyright:
There are proposals to increase the maximum prison sentence for online copyright infringement to ten years – to bring it in line with offline infringement. ORG is concerned that the definition of infringement is too broad and will catch large numbers of internet users. ORG is trying to amend the bill to ensure that such severe sentences are given only to those guilty of serious commercial infringement.

ORG has made a submission explaining the huge threat to free speech and why these proposals should be dropped. It launched a spoof recruitment campaign for "Internet Censors" to help classify the web for age verification. Over 23 000 people have signed a petition calling for the proposals to be rejected.


ORG’s submission
https://www.openrightsgroup.org/ourwork/reports/written-evidence-to-house-of-commons-public-bill-committee-on-the-digital-economy-bill

Spoof recruitment campaign
https://www.newgovernmentjobs.co.uk

Petition about the proposals
https://www.newgovernmentjobs.co.uk/petition/say-no-to-censorship-of-legal-content/

(Contribution by Pam Cowburn, EDRi member Open Rights Group, the United Kingdom)

22 Feb 2017

What does your browsing history say about you?

By Guest author

An average internet user visits dozens of websites and hundreds of web pages every day, most of which are kept in the history of our internet browsers. But what if someone took this massive database of visited web pages and cross-referenced them? A collaboration between Tactical Tech and SHARE Lab researchers focused on discovering the intentions, desires, needs, and preferences of a person based on their browsing history.


A Swiss journalist, referred to as Mr J for the purposes of the research, visited the Tactical Tech office in Berlin in June 2015 and provided a sample of his web history, on which this research was based. By analysing large sets of web addresses (Uniform Resource Locators, or URLs), especially from popular services such as Google Maps, Google Search or YouTube, the researchers were able to create a picture of Mr J's everyday routine, including his interests and intentions, and even the apartments he rented via Airbnb while travelling abroad. Also, since Facebook has a "real-name policy", it is quite easy to link a person's web history to their profile, as well as to create a social graph of their Facebook friends and connections, based on the Facebook URLs they visited.
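The kind of URL analysis described above can be sketched in a few lines of Python using only the standard library. This is a minimal illustration of the general technique, not the SHARE Lab methodology; the sample URLs and the assumption that Google Search queries sit in the "q" parameter are conventions used here for demonstration.

```python
# Minimal sketch: group a browsing history by visited service and
# extract search queries that reveal a person's interests.
from collections import Counter
from urllib.parse import urlparse, parse_qs

def profile_history(urls):
    """Count visits per domain and pull out Google Search query terms."""
    domains = Counter()
    queries = []
    for url in urls:
        parts = urlparse(url)
        domains[parts.netloc] += 1
        # Google Search conventionally carries the query in the "q" parameter
        if "google." in parts.netloc and parts.path == "/search":
            q = parse_qs(parts.query).get("q")
            if q:
                queries.extend(q)
    return domains, queries

# Hypothetical sample history, for illustration only
history = [
    "https://www.google.com/search?q=apartments+berlin",
    "https://www.youtube.com/watch?v=abc123",
    "https://www.google.com/search?q=flight+zurich+berlin",
]
domains, queries = profile_history(history)
print(domains.most_common(1))  # the most visited service
print(queries)                 # search terms hinting at plans and interests
```

Even this toy version shows how quickly raw URLs turn into a behavioural profile; the actual research applied far richer parsing across many services.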

As the websites Mr J visited contain many trackers – small pieces of code used for collecting behavioural information about users – the experiment also showed which companies extract the most data on Mr J. Google, Facebook and Twitter were, unsurprisingly, among the companies with the largest number of trackers. It was also interesting to "read" sample web pages Mr J visited as a machine would. This is possible with Google's Cloud Natural Language tool, which is attached to its deep learning platform and can be used to extract information about people, places, events, and much more, mentioned in text documents, news articles or blog posts. It recognised important events, names, and places based on keywords it picked up from the web pages.

All these findings lead to the conclusion that if someone – a private company, the state, or law enforcement – were to apply these techniques to the web history of a large segment of the population, it would be a frightening step towards a "thought police" that arrests individuals suspected of crimes they have not yet committed.

SHARE Lab: Browsing Histories – Metadata Explorations
https://labs.rs/en/browsing-histories/

(Contribution by Bojan Perkov, EDRi observer SHARE Foundation, Serbia)
