security & surveillance

While offering vast opportunities for exercising and enhancing fundamental rights, the digital environment also creates opportunities to commit new offences and to impose new restrictions on our online rights. Measures such as filtering, blocking and untargeted surveillance are often easy to implement and extremely difficult to rectify. EDRi therefore works to ensure that all security and surveillance measures are necessary, proportionate and based on solid evidence.

25 Jul 2018

Member in the spotlight: Hermes Center

By Hermes Center

In this edition of the “Member in the spotlight” series, EDRi is proud to present new member Hermes Center.

Who are you and what is your organisation’s goal and mission?

Hermes Center for Transparency and Digital Human Rights is an Italian civil rights organisation that promotes awareness of and attention to transparency, accountability and freedom of speech online and, more generally, the protection of rights and personal freedoms in a connected world.

How did it all begin, and how did your organisation develop its work?

The origins of Hermes can be traced back to the start of the GlobaLeaks project and its fundamental need for an organisation that advocates for privacy and digital human rights in Italy in an organised manner.

The Center was formally registered in 2012, bringing together long-time active members of the Italian privacy community, all of them part of the “Project Winston Smith” (PWS), the former “Anonymous Organisation”.

From its creation, Hermes brought together a unique mix of activists, lawyers and hackers operating in national and international contexts. Its core activities range from developing whistleblowing and anonymity software to digital rights advocacy: supporting journalists, organising awareness-raising events, writing op-eds for national media, and contributing to policy making.

Its work has been supported over the years by volunteers and by private donations (conference presentations, free workshops, and the organisation of conferences and panels).

Funding has mostly been provided by the Open Technology Fund in Washington and the Hivos foundation in The Hague, in the form of research grants for its software development projects, GlobaLeaks and Tor2web.

The biggest opportunity created by advancements in information and communication technology is…

… the free flow of information, the possibility to connect instantaneously with people from all over the world to discuss and organise on common topics. Furthermore, these technologies can foster more transparency and accountability of the government and help citizens to defend their rights.

The biggest threat created by advancements in information and communication technology is…

… the pervasive state of surveillance, which presents two different and somewhat overlapping faces: surveillance capitalism by companies and police surveillance.

Which are the biggest victories/successes/achievements of your organisation?

We advocated for the adoption by the Italian Anti-Corruption Authority (ANAC) of an online whistleblowing platform using onion services, giving whistleblowers a secure way to report illegal activity while protecting their identities.

With the same goals and similar activities, through a network of EU partners and mostly in collaboration with Xnet, the Center supported Barcelona City Hall, the Anti-Fraud Authority of Catalonia, the Anti-Fraud Authority of Valencia and Madrid City Hall in adopting free and open source technologies, first among them the aforementioned GlobaLeaks whistleblowing framework created by the Center, which is now used worldwide by more than 300 organisations.

Additionally, together with other Italian organisations, we managed to push the Italian Ministry of Economic Development (MISE) to revoke the export license of Area SpA, a well-known surveillance company.

If your organisation could now change one thing in your country, what would that be?

We would like to change the approach by politicians to the discussion around digital rights, and achieve greater civil society involvement.

What is the biggest challenge your organisation is currently facing in your country?

Last year, the Italian government introduced a new data retention law, extending the collection and retention of phone and telecommunications metadata to a period of 72 months. This measure clearly contravenes CJEU case law on data retention. At the same time, we are concerned about government surveillance, which lacks a clear legal framework, and we firmly oppose the adoption of the electronic voting systems being discussed by the government.

How can one get in touch with you if they want to help as a volunteer, or donate to support your work?

You can reach us on Twitter, on Facebook, or by visiting our website, where you can find our official contacts and those of our members. There you can also make a donation or sign up to volunteer.

Read more:

EDRi member in the spotlight series
https://edri.org/member-in-the-spotlight/


25 Jul 2018

ENDitorial: The fake fight against fake news

By Guest author

The new danger is no longer yellow, but red once more: fake news. It helped get Trump elected. It paved the highway to Brexit. Even local European elections are not safe. The greatest danger to our democracy in modern times must be fought by all possible means.


Fortunately, we have the European Commission. Laws are gradually replaced by terms of use, and courts of law by advertising agencies. The latter are tipped off by Europol and media companies when users break their rules. Trials are no longer necessary. Progress is only measured by how much and how quickly online content gets deleted. The Commission keeps up the pace with a rapid-fire of communications, covenants, press releases and directive proposals, and sees that all is well.

Unfortunately, the previous paragraph is not an attempt at satire. The only incorrect part is that the fake news hype is not the cause of this evolution. It does, however, fit in seamlessly.

Fake news avant la lettre

Fake news is old news. In his book “The Attention Merchants”, Tim Wu links its emergence to the rise of the tabloids in the 1830s. While most newspapers cost six cents, the New York Sun cost only one cent. It quickly grew in popularity by publishing an abundance of lurid details about court cases, and was mainly financed by ads for “patent medicine” – commercial medicine based on quackery.

The rise of radio tells a similar tale. RCA, the maker of the first radios, launched NBC so its clients could listen to something with their new device. CBS, which started broadcasting later, nevertheless quickly grew much bigger thanks to easy-listening programming coupled with an expansive franchise model that let local stations share in the ad revenue. Television reran the same story, with Fox News managing to reach a broad audience with little previous exposure to TV ads.

Tall stories, half truths, and sensational headlines are tried and tested methods used by media companies to sell more ads. On the internet, every click on “Five Tips You Won’t Believe” banners also earns money for the ad agencies. However, so do visits to “Hillary’s Secret Concentration Camps”. In this sense, the distribution of fake news through Facebook and Google has always been a natural part of their business model.

The doctor is expensive

Disinformation about a person is handled by defamation law. For specific historical events, like the Holocaust, most European countries have laws that make denial a criminal offence. Spreading certain other kinds of wrong information, however, such as claiming that the Earth is flat, is not illegal.

Laws in this field are always contentious, given the tension with the right to free speech and the freedom of the press. Deciding what takes precedence is rarely obvious, so normally a judge has the final word as to whether censorship is appropriate.

However, the courts are overloaded, money to expand them is lacking, and the amount of rubbish on the internet is gargantuan. Therefore legislators are eagerly looking at alternatives.

Let’s try self-medication

In recent years, the approach at the European level to relieve the courts has been one of administrative measures and self-regulation.

In 2011, article 25 of the Directive on Combating Sexual Exploitation of Children introduced voluntary website blocking lists at the European level. The goal was to make websites related to sexual child abuse unreachable in case closing them down or arresting the criminals behind them turned out unfeasible.

The 2010 revision of the Directive on Audiovisual Media Services (AVMS), originally intended for TV broadcasters and media companies, was broadened to also partially cover services like YouTube. It requires sites that enable video sharing, and only those sites, to take measures against, among other things, hate speech. A procedure to broaden this required policing is ongoing.

This fight was intensified by means of a Code of Conduct on Online Hate Speech, which the European Commission agreed on in 2016 with Facebook, Microsoft, Twitter and YouTube. These companies agreed to take the lead in combating this kind of unwanted behaviour.

The Europol regulation, also from May 2016, complements this code of conduct. It formalised Europol’s “Internet Referral Unit” (IRU) in article 4(m). Europol itself cannot take enforcement actions. As such, the IRU is limited to reporting unwanted content to the online platforms themselves “for their voluntary consideration of the compatibility of the referred internet content with their own terms and conditions.” The reported content need not be illegal.

The European Commission’s communication on Tackling Illegal Online Content from 2017 subsequently focused on how online platforms can remove reported content as quickly as possible. Its core proposal consists of creating lists of “trusted flaggers” whose reports on certain topics should be assumed valid, so that platforms can check them less thoroughly before removing flagged content.

The new Copyright Directive, voted down for now, would make video sharing sites themselves liable for copyright infringements by their users, both directly and indirectly. This would force them to install upload filters. Negotiations between the institutions on this topic will resume in September.

Concerning fake news, the European Commission’s working document Tackling Online Disinformation: a European Approach from 2017 contains an extensive section on self-regulation. In January 2018, the Commission created a “High Level Working Group on Fake News and Online Disinformation”, composed of various external parties. Their final report proposes that a coalition of journalists, ad agencies, fact checkers, and so on, be formed to write a voluntary code of conduct. Finally, the Report on the Public Consultation from April 2018 also mentions a clear preference by the respondents for self-regulation.

Follow-up of the symptoms

At the end of 2016 (a year late), the European Commission published its first evaluation of the directive against the sexual exploitation of children. It includes a separate report on the blocking lists, but it does not contain any data on their effectiveness nor side effects. This prompted a damning resolution by the European Parliament in which it “deplores” the complete lack of statistical information regarding blocking, removing websites, or problems experienced by law enforcement due to erased or removed information. It asked the European Commission to do its homework better in the future.

However, for the Commission this appears to be business as usual. In January 2018, four months after its Communication on Tackling Illegal Online Content, it sent out a press release that called for “more efforts and quicker progress” without any evaluation of what had already been done. Moreover, the original document did not contain any concrete goals or evaluation metrics, which raises the question: more and quicker than what, exactly? This was followed in March 2018 by a recommendation from that same European Commission in which everyone, except the Commission and the Member States themselves, was called upon to further increase their efforts. The Commission now wants to launch a Directive on this topic in September 2018, undoubtedly with requirements for everyone but themselves to do even more, even more quickly.

Referral to the pathogen

Online platforms have the right, within the boundaries of the law, to implement and enforce terms of use. What is happening now, however, goes quite a bit further.

More and more decisions on what is illegal are systematically outsourced to online platforms. Next, covenants between government bodies and these platforms include the removal of content that is not illegal. The public resources and authority of Europol are used to detect such content and report it. Finally, the platforms are encouraged to perform fewer fact checks on reports from certain groups, and there are attempts to make the platforms themselves liable for their users’ behaviour. This would only make them more inclined to pre-emptively erase controversial content.

When a governmental body institutes measures, these are always put to the test of the European Charter: proportionality, effectiveness and subsidiarity need to be respected in the light of fundamental rights such as the right to free speech, the prohibition of arbitrary application of the law, and the right to a fair trial. Not prosecuting certain categories of unwanted behaviour, or not even making them illegal, and instead “recommending” that online platforms take action against them, undercuts these fundamentals of our rule of law in a rather blunt way.

Moreover, these online platforms are not just random companies. As the founders of Google wrote in their original academic article on the search engine:

For example, in our prototype search engine one of the top results for cellular phone is ‘The Effect of Cellular Phone Use Upon Driver Attention’… It is clear that a search engine which was taking money for showing cellular phone ads would have difficulty justifying the page that our system returned to its paying advertisers. For this type of reason and historical experience with other media, we expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.

Meanwhile, Google has become one of the largest advertising agencies in the world. Facebook and Twitter also obtain the majority of their revenue from selling ads. If the Commission were to ask whether we should outsource the enforcement of our fundamental rights to ad agencies, the response would presumably be quite different than to the umpteenth hollow announcement about how internet platforms should address illegal online content more quickly and more thoroughly.

A difficult diagnosis

If we look specifically at fake news, there are two additional problems: no definition and few studies about it in its current form. Even the public consultation on fake news by the European Commission caused confusion by giving hate speech, terrorism and child abuse as examples of “already illegal fake news”. However, none of these terms refer to concepts that are necessarily fake or news. This makes it hard to draw conclusions from the answers, because it is unknown what the respondents understood by the term. The Eurobarometer survey on this topic has similar problems.

This does not mean that no information exists. The German NGO Stiftung Neue Verantwortung performed an experiment by spreading fake news through bought Twitter-followers. They drew some interesting conclusions:

– Fake news is like memes: its essence is not its existence, but how well it is picked up. This means that blocking the source of popular fake news will not stop it from spreading;
– Fake news is spread by strongly linked users, while debunking it is done by a much more varied group of people. Hence, people with similar opinions seem to share the same fake news, but it seems to influence the general public less strongly.

An analysis by EDRi member Panoptykon on artificially increasing the popularity of Twitter messages on Polish politics led to compatible conclusions. There are bubbles of people that interact very little with each other. Each bubble contains influencers that talk about the same topics, but they seldom talk directly to each other. Prominent figures and established organisations, rather than robots (fake accounts), steer the discussions. Robots can be used to amplify a message, but by themselves do not create or change trends.

It is virtually impossible to distinguish robots from professional accounts by looking only at their network and activity. It is therefore very hard to automatically identify such accounts for the purpose of studying or blocking them. These are only small-scale studies, and one has to be careful about drawing general conclusions from them. They certainly do not claim that fake news has no influence, or that we should just ignore it. That said, they do contain more concrete information than all the pages on this topic published to date by the European Commission.

So what should we do? Miracle cures are in short supply, but there are a few interesting examples from the past. The year 1905 saw a backlash against patent medicine after investigative journalists exposed its dangers. Later, in the 1950s, TV quizzes were found to favour telegenic candidates because of their beneficial effect on ratings. Revenue steering content has been around forever. Independent media should therefore be an important part of the solution.

The cure and the disease

The spectacular failure of the political establishment in both the US and the UK could not possibly have been of their own doing, so a different explanation was called for. Forget the ruckus back in the day about Obama’s birth certificate and his alleged secret Muslim agenda, David Cameron’s desperate ploy to cling to power, and the UK tradition of tabloids and their made-up stories. This is something completely different. Flavour everything with the dangers of online content and present yourself as the digital knight on the white horse who will set things straight. Or rather, who orders the ones making money from sensation and clickbait (such as fake news) to set things straight as they see fit.

The above is oversimplified, but it is incredible how this European Commission is casually promoting the Facebooks and Googles of this world to become the keepers of European fundamental rights. Protecting democracy and the rule of law is not a business model. It is a calling. One that few will attribute to Mark Zuckerberg.

This article originally appeared in Dutch on Apache.be.
Original Article: De nepstrijd tegen het nepnieuws


Read more:

Press Release: “Fake news” strategy needs to be based on real evidence, not assumption (26.04.2018)
https://edri.org/press-release-fake-news-strategy-needs-based-real-evidence-not-assumption/

ENDitorial: Fake news about fake news being news (08.02.2017)
https://edri.org/enditorial-fake-news-about-fake-news-being-news/

(Contribution by Jonas Maebe, EDRi observer)


25 Jul 2018

New Protocol on cybercrime: a recipe for human rights abuse?

By EDRi

From 11 to 13 July 2018, the Electronic Frontier Foundation (EFF) and European Digital Rights (EDRi) took part in the Octopus Conference 2018 at the Council of Europe together with Access Now to present the views of a global coalition of civil society groups on the negotiations of more than 60 countries on access to electronic data by law enforcement in the context of criminal investigations.


There is a global consensus that mutual legal assistance among countries needs to be improved. However, recognising its inefficiencies should not translate into bypassing Mutual Legal Assistance Treaties (MLATs) by going to service providers directly, thereby losing the procedural and human rights safeguards embedded in them. Some of the issues with MLATs can be solved by, for example, technical training for law enforcement authorities, simplification and standardisation of forms, single points of contact, or increased resources. For instance, thanks to a recent US “MLAT reform programme” that increased the resources for handling MLATs, the US Department of Justice reduced the number of pending cases by a third.

There is a worrisome legislative trend emerging through the US CLOUD Act and the European Commission’s “e-evidence” proposals to access data directly from service providers. This trend risks creating a race to the bottom in terms of due process, court checks, fair trials, privacy and other human rights safeguards.

If the current Council of Europe negotiations on cybercrime focused on improving mutual legal assistance, they could offer an opportunity to create a human rights-respecting alternative to the dangerous shortcuts proposed in the US CLOUD Act or the EU proposals. However, civil rights groups have serious concerns from both a procedural and a substantive perspective.

This process is being conducted without regular and inclusive participation of civil society or data protection authorities. Nearly 100 NGOs wrote to the Council of Europe’s Secretary General in April 2018 because they were not duly included in the process. While the Council of Europe issued a response, civil society groups reiterated that civil society participation and inclusion goes beyond a public consultation, participation in a conference and comments on texts preliminarily agreed by States. Human rights NGOs should be present in drafting meetings, both to learn from the law enforcement expertise of the 60+ countries and to provide human rights expert input in a timely manner.

From a substantive point of view, the process is being built on the faulty premise that anticipated signatories to the Convention on cybercrime (“the Budapest Convention”) share a common understanding on basic protections of human rights and legal safeguards. As a result of this presumption, it is unclear how the proposed Protocol can provide for strong data protection and critical human rights vetting mechanisms that are embedded in the current MLAT system.

One of the biggest challenges in the Council of Europe process to draft an additional protocol to the Cybercrime convention – a challenge that was evident in the initial Cybercrime convention itself and in its article 15 in particular – is the assumption that signatory Parties share (and will continue to share) a common baseline of understanding with respect to the scope and nature of human rights protections, including privacy.

Unfortunately, there is neither a harmonised legal framework among the countries participating in the negotiations nor a shared human rights understanding. Experience shows that countries need to bridge the gap between national legal frameworks and practices on the one hand, and the human rights standards established by the case law of the highest courts on the other. For example, the Court of Justice of the European Union (CJEU) has held on several occasions that blanket data retention is illegal under EU law. Yet the majority of EU Member States still have blanket data retention laws in place. Other states involved in the protocol negotiations, such as Australia, Mexico and Colombia, have implemented precisely the type of sweeping, unchecked and indiscriminate data retention regimes that the CJEU ruled against.

As a result of this lack of harmonised human rights protections and legal safeguards, the forthcoming protocol proposals risk:

– Bypassing critical human rights vetting mechanisms inherent in the current MLAT system that are currently used to, among other things, navigate conflicts in fundamental human rights and legal safeguards that inevitably arise between countries;

– Seeking to encode practices that fall below minimum standards being established in various jurisdictions by ignoring human rights safeguards established primarily by the case law of the European Court of Human Rights, the Court of Justice of the European Union, among others;

– Including few substantial limits, relying instead on the legal systems of signatories to provide enough safeguards to ensure that human rights are not violated in cross-border access situations, together with a general, non-specific and unenforced requirement that signatories ensure adequate safeguards (see Article 15 of the Cybercrime Convention).

Parties to the negotiations should render human rights safeguards operational, as human rights are the cornerstone of our society. As a starting point, the NGOs urge countries to sign, ratify and diligently implement Convention 108+ on data protection. In this sense, EDRi and EFF welcome the comments of the Council of Europe’s Convention 108 Committee.

Finally, civil society groups urge that the forthcoming protocol not include a mandatory or voluntary mechanism for direct access to data from companies without appropriate safeguards. While the proposals seem to be limited to subscriber data, there are serious risks that the interpretation of what constitutes subscriber data will be expanded so as to lower safeguards, including access to metadata directly from providers through non-judicial requests or demands.

This can conflict with clear court rulings from the European Court of Human Rights, such as Benedik v. Slovenia, and even with national case law, such as that of Canada’s Supreme Court. The global NGO coalition therefore reiterates that the focus should be on making mutual legal assistance among countries more efficient.

Civil society is ready to engage in the negotiations. Until now, however, the future of the second additional protocol to the Cybercrime Convention remains unclear, raising many concerns and questions.

Read more:

Joint civil society response to discussion guide on a 2nd Additional Protocol to the Budapest Convention on Cybercrime (28.06.2018)
https://edri.org/files/consultations/globalcoalition-civilsocietyresponse_coe-t-cy_20180628.pdf

How law enforcement can access data across borders — without crushing human rights (04.07.2018)
https://ifex.org/digital_rights/2018/07/04/coe_convention_185_2ndamend_supletter/

Nearly 100 public interest organisations urge Council of Europe to ensure high transparency standards for cybercrime negotiations (03.04.2018)
https://edri.org/global-letter-cybercrime-negotiations-transparency/

A Tale of Two Poorly Designed Cross-Border Data Access Regimes (25.04.2018)
https://www.eff.org/deeplinks/2018/04/tale-two-poorly-designed-cross-border-data-access-regimes

Cross-border access to data has to respect human rights principles (20.09.2017)
https://edri.org/crossborder-access-to-data-has-to-respect-human-rights-principles/

(Contribution by Maryant Fernández Pérez, EDRi, and Katitza Rodríguez, EFF)


25 Jul 2018

The EU gets another opportunity to improve copyright rules

By Guest author

The EU gets another opportunity to improve a copyright proposal that would have threatened the open web. On 5 July 2018, Members of the European Parliament (MEPs) rejected the mandate to proceed with a flawed proposal for an EU Copyright Directive that would have had detrimental effects on internet freedom, access to knowledge, and collaboration online. This vote means that the text of the draft Directive is open again for amendments to be proposed and voted on by Parliament, likely at the next plenary session on 12 September 2018.


The Wikimedia Foundation applauds the rejection of the mandate and the opportunity this offers for a wider discussion to create a balanced, modern copyright system for Europe. Prior to the vote on 5 July 2018, the Wikimedia Foundation Board of Trustees, more than ten Wikipedia language communities, and many Wikimedia chapters in Europe took a stand against the draft Directive. Some Wikimedians even took to the streets to protest and educate the public.

Wikipedia’s chapters ran messages asking European users to contact their MEPs or went dark for a day to protest the proposal. Unlike other online platforms, Wikipedia’s global community of volunteer editors makes decisions about how they want the site to be involved in policy debates that impact Wikipedia, access to knowledge, and the free and open internet. The banners and blackouts protesting the draft Directive were the direct result of Wikipedia’s volunteer editors making their voices heard through the site’s democratic decision-making.

Of course, the Wikimedia movement was not alone in its protest against the Directive. Countless civil society stakeholders, technologists, creators and human rights defenders also spoke out against the proposal.

Wikimedia believes strongly that the draft Directive went too far in expanding copyright and failed to provide critical protections for material that should be both free and public. The Foundation fears that the overbroad requirements in the draft Directive would actually lead to further dominance of the internet by large companies that have the resources to build burdensome filtering systems. This runs contrary to the vision of an open internet that fostered Wikipedia’s creation.

Now that the proposed copyright Directive is open for amendments once again, it is time to support improvements that harmonise copyright across the EU and preserve basic online freedoms. A groundswell of concern over this proposal led to its rejection in Parliament.

Let’s make sure that the next round of amendments takes these concerns into account and results in a copyright directive which will truly deliver a free and open internet for all.

Read more:

How the EU copyright proposal will hurt the web and Wikipedia (02.07.2018)
https://edri.org/how-the-eu-copyright-proposal-will-hurt-the-web-and-wikipedia/

Wikipedia blackout (04.07.2018)
https://commons.wikimedia.org/wiki/File:Wikipedia_blackout,_04.07.2018,_en.wp.jpg

Press Release: EU Parliamentarians support an open, democratic debate on Copyright Directive (05.07.2018)
https://edri.org/press-release-eu-parliamentarians-support-open-democratic-debate-around-copyright-directive/

(Contribution by Jan Gerlach, EDRi-member Wikimedia Foundation)


25 Jul 2018

Censoring Wikipedia in Turkey is censoring our collective knowledge

By Guest author

2018 will be a pivotal year for the internet. For the first time in human history, over half of the world’s population will be online. But new developments threaten the sustainability of the world’s largest information source.


By the end of this year, for the first time, one out of every two humans will have access to the internet. This tremendous milestone represents over 40 years of continuous investment in creating an open space where a unique culture of collaboration and sharing has thrived – the grand sum of the biggest human experiment to democratize and disperse knowledge in our history. In this exciting age of the internet, new business models have emerged, while our perspective on how we work, live and play has been re-shaped to match the growth of the web. But there are some worrying developments taking place on the global stage right now that could threaten the natural growth of the internet as it grows into its role as a great equalizer of access.

More censorship in many places

Everywhere we look, online censorship is becoming more prevalent, increasingly insidious, and worryingly surreptitious.

In the last few months, we have seen the Russian and Iranian governments both continue with numerous attempts to block Telegram, arguing that free speech on the platform enables discussions unfavourable to their governments. Vietnam just passed a law that will further limit people’s ability to use the internet as an avenue for free discourse, while in Uganda, a new law passed last month will tax social media users for ‘gossip’ on social channels. As the world’s largest collaboratively-built platform, with over 46 million articles, Wikipedia has been monitoring these developments closely. We are increasingly concerned that, for the last 12 months, Wikipedia has remained banned in Turkey, denying the world information about one of the world’s most vibrant economies.

Turkey block is far-reaching

The Turkish block is the most expansive government ban ever imposed on Wikipedia, and includes Wikipedias across nearly 300 languages. The court order imposing the ban is based on two articles in English Wikipedia which the court said damaged the reputation and prestige of the Republic of Turkey. While we respectfully disagreed with the court’s decision as it applied to the articles at the time the block was imposed, we also wish to point out that those articles remain open to editing, according to Wikipedia’s neutral editorial policies, and have been changed substantially by Wikipedia volunteer editors since the block was imposed.

Critically, the ban brings into focus the fact that all the content on Wikipedia is built through the collaborative efforts of millions of people who create content for the benefit of the world at large.
The unique value of Wikipedia is that it is a collaboration of hundreds of thousands of people across the globe. Together they make decisions about what information to include in Wikipedia and how that information is presented. The editing process proceeds according to policies developed and overseen by these independent volunteer editors. Wikipedia’s policies require reliable sources to verify information included in Wikipedia, and neutrality, especially when covering controversies in which there are differing views. This is an ongoing process and means that Wikipedia articles are under constant improvement. It is a process that benefits from more editors and differing perspectives, which is one of the reasons why ending the block in Turkey is so important.

This manner of creating and improving content remains to this day the most powerful and unique contribution of Wikipedia to the internet. The more people participate on Wikipedia, the more neutral, reliable, and accurate its articles become. The Wikimedia Foundation and independent Wikipedia volunteer editors have offered to provide open, public training on Wikipedia in Turkey once the block is lifted, as we have done in other countries, with the goal of increasing the number of editors and perspectives on Wikipedia.

Lift the ban!

This is why we continue to respectfully request that the government lift the ban so that Wikipedia can return to serving as a valuable, free educational resource on a wide range of topics, including science, engineering, art, and culture. We would like to see the Turkish people able to contribute to the global conversation, including Turkish topics, on Wikipedia. As we have repeatedly noted, we are all made poorer for the absence of contributions by the Turkish people.

And at a pivotal moment such as this, when more people are getting online in search of freely accessible information, these kinds of actions can only set back the remarkable progress that the internet has made over the last 40 years. When we censor information on Turkey, we censor our collective history as humans.

This article was written by Eileen Hershenov, General Counsel for the Wikimedia Foundation, and published at netzpolitik.org under a Creative Commons BY-NC-SA 4.0 licence.

Original article: In Censoring Wikipedia in Turkey, We Are Censoring Our Collective Knowledge (15.07.2018)

Read more:

Half the world’s population is still offline. Here’s why that matters. (30.05.2018)
https://news.itu.int/half-the-worlds-population-is-still-offline-heres-why-that-matters/

Viet Nam: New Cybersecurity law a devastating blow for freedom of expression (12.06.2018)
https://www.amnesty.org/en/press-releases/2018/06/viet-nam-cybersecurity-law-devastating-blow-freedom-of-expression/

Social media use taxed in Uganda to tackle ‘gossip’ (01.06.2018)
https://www.theguardian.com/world/2018/jun/01/social-media-use-taxed-in-uganda-to-tackle-gossip

Wikimedia Foundation urges Turkish authorities to restore access to Wikipedia (30.04.2017)
https://blog.wikimedia.org/2017/04/30/turkish-authorities-block-wikipedia/

25 Jul 2018

EU Council considers undermining ePrivacy

By IT-Pol

On 19 October 2017, the European Parliament’s LIBE Committee adopted its report on the ePrivacy Regulation. The amendments improve the original proposal by strengthening confidentiality requirements for electronic communication services, and by adding a ban on tracking walls, legally binding signals for giving or refusing consent to online tracking, and privacy-by-design requirements for web browsers and apps.


Before trialogue negotiations can start, the Council of the European Union (the Member States’ governments) must adopt its general approach. This process is still ongoing with no immediate end in sight. An analysis of the proposed amendments in Council documents so far shows that the Council is planning to significantly weaken the ePrivacy text compared to the Commission proposal and, especially, the LIBE report.

Metadata for electronic communications should be regarded as sensitive personal data, similar to the categories listed in Article 9 of the General Data Protection Regulation (GDPR). Under the ePrivacy Directive (current legal framework), necessary metadata may be processed for purposes of subscriber billing and interconnection payments, and, with consent of the user, for value added services. Apart from data retention requirements in national law, no other processing is allowed. In the ePrivacy Regulation, the Commission proposal and the LIBE text both uphold the principle of only allowing processing of electronic communications metadata for specific purposes laid down in law or with consent of the end-user. As a new specific purpose, processing for monitoring quality of service requirements and maintaining the availability of electronic communications networks can be done without consent.

The Council proposals significantly expand the permitted processing of metadata without consent by the electronic communications service (ECS) provider. The billing/interconnection purpose is extended to include processing when it is necessary “for the performance of a contract to which the end-user is party”. This will allow the ECS provider to process metadata not directly related to billing through provisions in the contract with the end-user. Service offerings by ECS providers are generally moving towards simpler products with increased reliance on flat rate tariffs, which should reduce the processing and storage of metadata necessary for billing purposes. These privacy benefits will be lost with the Council text.

In December 2017, the Council proposed further processing of metadata without consent for scientific research or statistical purposes based on Union or Member State law. Despite the mandatory safeguards, which include encryption and pseudonymisation, this is a very problematic amendment since a potentially large amount of metadata, which would otherwise be deleted or anonymised, will be retained and stored in identifiable form. Data breaches and law enforcement access are two very specific data protection risks created by this amendment.

The latest text from the Austrian Presidency (Council document 10975/18) goes even further than this by proposing a new general provision for further processing of metadata for compatible purposes inspired by Article 6(4) of the GDPR. This comes very close to introducing “legitimate interest” as a legal basis for processing metadata by the ECS provider, something that has previously been ruled out because metadata for electronic communications is comparable to sensitive personal data under the case law of the Court of Justice of the European Union (CJEU). GDPR Article 9 does not permit the processing of sensitive personal data with legitimate interest as the legal basis. In March 2018, the former Bulgarian Presidency specifically noted that it is highly doubtful whether a non-specific provision for permitted processing would, given the sensitive nature of the data involved, be in line with the case-law of the CJEU.

The LIBE Committee adopted amendments to ensure that electronic communications content was protected under the ePrivacy Regulation during transmission and if the content is subsequently stored by the ECS provider. This is important because storage of electronic communications content is an integral part of many modern electronic communications services, such as webmail and messenger services. However, the Council amendments limit the protection under the ePrivacy Regulation to the transmission of the communication, a period which may be a fraction of a second. After the receipt of the message, the processing falls under the GDPR which could allow processing of personal data in electronic communications content (such as scanning email messages) based on legitimate interest rather than consent of the end-user. As suggested by the Council recital, the end-user can avoid this by deleting the message after receipt, but this would entirely defeat the purpose of many modern electronic communications services.

In Article 8 of the draft ePrivacy Regulation, the LIBE Committee adopted a general ban on tracking walls. This refers to the practice of making access to a website dependent on end-user consent to processing of personal data through tracking cookies (or device fingerprinting) that is not necessary for the provision of the website service requested by the end-user. This practice is currently widespread since many websites display cookie consent banners where it is only possible to click ‘accept’ or ‘OK’.

The Council text goes in the opposite direction with proposed wording in a recital which authorises tracking walls, in particular if a payment option is available that does not involve access to the terminal equipment (e.g. tracking cookies). This amounts to a monetisation of fundamental rights, as EU citizens will be forced to decide whether to pay for access to websites with money or by being profiled, tracked and abandoning their fundamental right to protection of personal data. This inherently contradicts the GDPR since consent to processing of personal data can become the counter-performance for access to a website, contrary to the aim of Article 7(4) of the GDPR.

Finally, the latest text from the Austrian Presidency proposes to completely delete Article 10 on privacy settings. Article 10 requires web browsers and other software permitting electronic communications to offer privacy settings which prevent third parties from accessing and storing information in the terminal equipment, and to inform the end-user of these privacy settings when installing the software. An example of this could be an option to block third party cookies in web browsers. Such privacy settings are absolutely critical for preventing leakage of personal data to unwanted third parties and for protecting end-user privacy when consent to tracking is coerced through tracking walls. The recent Cambridge Analytica scandal should remind everyone, including EU Member States’ governments, of the often highly undesirable consequences of data disclosures to unknown third parties.
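To make the “third party” distinction concrete: a blocking feature of the kind Article 10 envisages must decide, for each cookie, whether its domain belongs to the site the user actually visited. The sketch below is a deliberately simplified, hypothetical check, not how any real browser implements it (real browsers additionally consult the Public Suffix List and handle many edge cases):

```python
def is_third_party(cookie_domain: str, page_domain: str) -> bool:
    """Naive check: a cookie is third-party when its domain is neither
    the visited page's domain nor a parent domain of it.

    Simplified illustration only: real browsers also consult the
    Public Suffix List so that e.g. "co.uk" is never treated as a
    shared parent domain.
    """
    # Cookie domains are often written with a leading dot ("."example.org).
    cookie_domain = cookie_domain.lstrip(".")
    return not (page_domain == cookie_domain
                or page_domain.endswith("." + cookie_domain))

# A tracker's cookie on a news site is third-party; the site's own is not.
print(is_third_party("tracker.example", "news.example.org"))  # True
print(is_third_party(".example.org", "news.example.org"))     # False
```

A privacy-by-default setting of the sort the LIBE report requires would refuse to store any cookie for which such a check returns true, unless the end-user actively chooses otherwise.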

If Article 10 is deleted, it will be possible to offer software products that are set to track and invade individuals’ confidential communications by design and by default, with no possibility for the individual to change this by selecting a privacy-friendly option that blocks data access by third parties. This goes in the completely opposite direction of the LIBE report, which contains amendments to strengthen the principle of privacy by design by requiring that access by third parties is prevented by default, and that upon installation the end-user is asked to either confirm this or select another, possibly less privacy-friendly, option.

The rationale for deleting Article 10 given by the Austrian Presidency is the burden on software vendors and consent fatigue for end-users. The latter is somewhat ironic since technical solutions, such as genuine privacy by design requirements and innovative ways to give or refuse consent, like a mandatory Do Not Track (DNT) standard, are needed to reduce the number of consent requests in the online environment. The Council amendments for articles 8 and 10 would aggravate the current situation, where end-users on countless websites are forced to give essentially meaningless consent to tracking because the cookie banner only provides the option of clicking ‘accept’.
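As an illustration of what honouring a DNT signal would involve, the hypothetical helper below shows the kind of check a website’s tracking logic could gate on. It relies only on the standard `DNT` request header, where a value of `"1"` expresses the user’s objection to tracking; the function name and structure are this article’s own sketch, not a prescribed implementation:

```python
def may_set_tracking_cookie(request_headers: dict) -> bool:
    """Return True only when the request carries no Do Not Track opt-out.

    Under the DNT convention, the header value "1" means the user has
    opted out of tracking; an absent header expresses no preference.
    A legally binding DNT standard would oblige sites to honour this
    signal instead of showing a consent banner on every visit.
    """
    return request_headers.get("DNT") != "1"

print(may_set_tracking_cookie({"DNT": "1"}))  # False: user opted out
print(may_set_tracking_cookie({}))            # True: no preference expressed
```

One site-wide signal of this kind, set once in the browser, is exactly the sort of technical solution that could replace countless per-site consent requests.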

If the ePrivacy amendments in 10975/18 and earlier Council documents are adopted as the general approach, Council will enter trialogue negotiations with a position that completely undermines the ePrivacy Regulation by watering down all provisions which provide stronger protection than the GDPR. This would put a lot of pressure on the European Parliament negotiators to defend the privacy rights of European citizens. For telecommunications services, which presently enjoy the strong protection of the ePrivacy Directive, the lower level of protection will be particularly severe, even before considering the dark horse of mandatory data retention that EU Member States are trying to uphold, in part through amendments to the ePrivacy Regulation.

EDRi, along with EDRi members Access Now, Privacy International and IT-Pol Denmark, has communicated its concerns about the proposed Council amendments through letters to WP TELE, as well as at a civil society meeting with Council representatives on 31 May 2018 organised by the Dutch Permanent Representation and the Bulgarian Council Presidency.

Read more:

e-Privacy: What happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)
https://edri.org/eu-member-states-fight-to-retain-data-retention-in-place-despite-cjeu-rulings/

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)
https://edri.org/eprivacy-civil-society-letter-calls-to-ensure-privacy-and-reject-data-retention/

Civil society calls for protection of communications confidentiality (13.06.2018)
https://edri.org/civil-society-calls-for-protection-of-communications-confidentiality/

Civil society letter to WP TELE on the ePrivacy amendments in Council document 10975/18 (13.07.2018)
https://edri.org/civil-society-calls-for-protection-of-privacy-in-eprivacy/

(Contribution by Jesper Lund, EDRi-member IT-Pol)

25 Jul 2018

Smart Borders: the challenges remain a year after its adoption

By Maria Roson

After the initial rejection of the Smart Borders package in 2013, the European Parliament voted again on 25 October 2017 and finally adopted it, including the Entry/Exit System (EES) and amendments integrating it into the Schengen Borders Code. The package will replace the manual stamping of passports with the automation of certain preparatory border control procedures, and will store the biometric information of non-EU visitors crossing the borders of the Schengen area.


This new system foresees the recording of information such as the name, travel document, fingerprints, facial image, and date and place of birth of non-EU nationals when entering, exiting or being refused entry into the Schengen area. The data will be retained for three years, or five years for those who overstay. The system will be interconnected with the Visa Information System (VIS) database, and will allow law enforcement authorities to access the database for criminal identification and intelligence to prevent serious crime and terrorism. The package was signed on 30 November 2017, and the Entry/Exit System is scheduled to become fully operational by 2020.

Unlike the previous Smart Borders package from 2013, the text gathered a broad consensus among policy-makers. However, several stakeholders have expressed concerns about the risks to human rights. The European Data Protection Supervisor (EDPS) had already warned in 2013 against potential violations of the EU Charter of Fundamental Rights, particularly Articles 7 and 8 on the respect for private and family life and the protection of personal data, and recommended additional improvements to the revised 2016 proposal. In particular, the EDPS raised concerns about the significant collection of data from non-EU nationals, which seriously affects their rights and freedoms, and recommended stronger safeguards regarding retention periods, the collection of facial images and fingerprints, and the use of biometric data in general. The Fundamental Rights Agency also warned, in its report “Fundamental rights and the interoperability of EU information systems: borders and security”, about the disproportionate effects on the rights of migrants in an irregular situation and about the fundamental rights risks of unlawful access to or use of personal data. Similarly, EDRi, responding to a public consultation in 2015, considered the Smart Borders initiative disproportionate and unnecessary as regards its data collection plans. Finally, the package was criticised in a study commissioned by the European Parliament in 2016, which concluded that the large-scale collection and storage of personal data, including biometric data, interfere with the EU Charter of Fundamental Rights and the right to private life under the European Convention on Human Rights (ECHR).

Beyond the criticisms regarding the respect of human rights, the Smart Borders package is likely to be very costly. The European Commission estimated the cost at around 480 million euros – a figure criticised by civil liberties groups such as Statewatch, which defined the package as “the most expensive exercise to collect migration statistics in the history of the world”. Statewatch also mentions in its report that “the EU’s budgets for law enforcement, counter-terrorism, border control and security research amounted to €3.8 billion in the period between 2007 and 2013. In the current period, which runs from 2014 to 2020, they have grown to a total of some €11 billion (…) a significant development given that a decade ago the bloc had no dedicated budgets for security, justice or home affairs.” The organisation warns about the dangers of the “surveillance society”, considering that many of the controversial measures of the Smart Borders package – mainly the collection of biometric data – are being “tested” on migrants and asylum seekers at the borders, taking advantage of their vulnerable position and of the “legitimate interest” ground used to collect this information.

We still cannot foresee the consequences of the entry into operation of the Smart Borders package, but the risks for both European and non-European citizens are predicted to be very high. Therefore, although we will have to wait until 2020 to see the real impact of the package, it is important not to lose sight of it in order to respond to the challenges that may arise.

Read more:

Fundamental Rights Agency: “Fundamental rights and the interoperability of EU information systems: borders and security” (07.2017)
http://fra.europa.eu/en/publication/2017/fundamental-rights-interoperability

Statewatch: “Market Forces: the development of the EU Security-Industrial Complex” (31.08.2017)
http://statewatch.org/analyses/marketforces.pdf

EDRi-gram: Smart Borders package: Unproportionate & unnecessary data collection (04.11.2015)
https://edri.org/smart-borders-package-unproportionate-unnecessary-data-collection/

(Contribution by Maria Roson, EDRi intern)

23 Jul 2018

EP Plenary on the Copyright Directive – Who voted what?

By EDRi

Many of you have been wondering who voted for and who voted against JURI’s mandate to enter into negotiations with the EU Council on the Copyright Directive on 5 July. We have prepared a summary of the most relevant data to help you understand the dynamics in the European Parliament.

Countries’ position

France led the vote for the mandate, with 82.43% of its MEPs voting against our campaign and in favour of starting negotiations with the EU Council. Luxembourg and Malta also voted mainly in favour of the mandate (66.67% of their MEPs each), as did Romania and Latvia (62.50%) and Portugal (57.14%). On the other side, 83.33% of Estonian and 80% of Swedish MEPs voted against the mandate. Other countries that mainly opposed it were Poland (74.51%), Finland (69.23%) and the Netherlands (65.38%). It is also remarkable that a large number of MEPs abstained or were not present at the vote, with percentages close to or greater than 30% in countries such as Slovakia, the Czech Republic, Lithuania, Spain and Bulgaria.

Political groups’ position

With regard to the political groups, the European People’s Party (EPP) and the Progressive Alliance of Socialists and Democrats (S&D) were the ones that mostly voted in favour of entering into negotiations with the Council, with 58.90% of EPP votes in favour (129 MEPs) and 42.33% of S&D votes (80 MEPs). Compared to the EPP, a slightly higher percentage of S&D MEPs voted against the mandate (43.39%). On the other side, the political groups voting overwhelmingly against entering into negotiations were the Group of the Greens/European Free Alliance with 79.92% of their votes, followed by the European United Left/Nordic Green Left (GUE/NGL) with 72.55% and the Europe of Freedom and Direct Democracy Group with 67.44%.

The next plenary vote, on the tabled amendments, will take place on 12 September.

Read more:

Press Release: EU Parliamentarians support an open, democratic debate on Copyright Directive (05.07.2018)
https://edri.org/press-release-eu-parliamentarians-support-open-democratic-debate-around-copyright-directive/

Action plan against the first obligatory EU internet filter (28.06.2018)
https://edri.org/strategy-against-the-first-obligatory-eu-internet-filter/

Moving Parliament’s copyright discussions into the public domain (27.06.2018)
https://edri.org/moving-parliaments-copyright-discussions-into-the-public-domain-2-0/


(Contribution by Maria Roson, EDRi intern)

13 Jul 2018

Civil society calls for the protection of privacy in ePrivacy

By EDRi

On 13 July 2018, EDRi, together with Privacy International, IT-Politisk Forening and Access Now, sent a letter to the TELE Working Party of the EU Council regarding the dangers to privacy protections posed by the latest proposals brought forward by the Austrian Council Presidency. The letter comments on these developments and argues that the changes could undermine the spirit of the proposed ePrivacy Regulation. The core of the Regulation is attacked in several ways: through the introduction of a new legal basis for processing metadata for “compatible purposes”, by allowing tracking walls, and by eliminating the provision ensuring privacy by design and privacy by default – a key element of the ePrivacy Regulation.

Read more:

EU Council considers undermining ePrivacy (25.07.2018)
https://edri.org/eu-council-considers-undermining-eprivacy/

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)
https://edri.org/eprivacy-civil-society-letter-calls-to-ensure-privacy-and-reject-data-retention/

ePrivacy: Mythbusting
https://edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

ePrivacy: Document Pool
https://edri.org/eprivacy-directive-document-pool/

11 Jul 2018

German police raids privacy group’s premises

By Anamarija Tomicic

In the early morning of 20 June 2018, German police forces raided several locations – the headquarters of the privacy group Zwiebelfreunde, and the homes of three of its board members, as well as the association OpenLab, which is part of EDRi member Chaos Computer Club (CCC) in Augsburg. Zwiebelfreunde promotes and creates privacy enhancing technologies, and educates the public in their use. The board members are not considered suspects but witnesses in an ongoing investigation.


The locations were raided because of an alleged connection to an anonymous internet website, which called for protests against the far-right Alternative for Germany (AfD) party. The author of the blog published “calls for protest”, which the police believes included incitement to violence. During the investigation, a riseup.net e-mail address was found connected to the blog. Zwiebelfreunde facilitates European donations to Riseup, a non-commercial alternative to popular e-mail services based in the US. In other words, the police raids were based solely on Zwiebelfreunde’s connection to the e-mail provider which was used by the author of the blog.

In addition, the judicial warrant seems not to have been respected during the searches. Some items and equipment taken were not covered by the warrant, including items belonging to family members of Zwiebelfreunde’s board members. The confiscated documentation goes back to 2011, although the warrant covered only the period after January 2018. The representatives of the group said: “We argue that even the original warrants and seizures were a clear overreach, and that this was used as an excuse to get access to members’ and donors’ data. We have nothing to do with Riseup’s infrastructure. During the raids, the police forces clearly gave the impression that they knew we had nothing to do with either Riseup or the blog. None of us had even heard of that blog before!”

During these raids, the premises of Open Lab, used by members of the CCC, were searched as well. While searching for information on members and bank accounts, the police allegedly came across chemicals, chemical formulas, and other equipment believed to be used in the production of explosives. Three people were arrested, and the premises were further searched without a court order. The suspicions turned out to be false, as the “dangerous” chemicals were identified as computer circuit cleaning products. The “dangerous explosive” can be seen in action in a YouTube video.

The searches followed the adoption of the widely criticised Bavarian Police Act that went into effect in May 2018. This law gives the Bavarian police extended powers. “This is a textbook example of how easy the fundamental rights of completely innocent citizens and their families can be violated as a result of artificially constructed evidence chains, no matter how ridiculous. To be drawn into this case as a witness on the basis of such patently unsustainable reasoning is questionable to say the least. The recent introduction of draconian Bavarian laws governing police authority has clearly led to a culture where those responsible no longer feel bound by any sense of proportionality of their actions”, said Frank Rieger, spokesperson of CCC.

Zwiebelfreunde plans to take legal action against the overreaches of police power. You can follow future developments on their website.

Read more:

Bavarians protest against vastly extended police powers (16.05.2018)
https://edri.org/bavarians-protest-against-vastly-extended-police-powers/

Police searches homes of „Zwiebelfreunde“ board members as well as „OpenLab“ in Augsburg (04.07.2018)
https://www.ccc.de/en/updates/2018/hausdurchsuchungen-bei-vereinsvorstanden-der-zwiebelfreunde-und-im-openlab-augsburg

Coordinated raids of Zwiebelfreunde at various locations in Germany (04.07.2018)
https://blog.torservers.net/20180704/coordinated-raids-of-zwiebelfreunde-at-various-locations-in-germany.html

Chaos Computer Club kritisiert Vorgehen der Polizei (04.07.2018) (Available only in German)
http://www.spiegel.de/netzwelt/web/hausdurchsuchungen-bei-netzaktivisten-chaos-computer-club-kritisiert-polizeivorgehen-a-1216463.html
