24 Apr 2019

Strategic litigation against civil rights violations in police laws

By Gesellschaft für Freiheitsrechte

Almost every German state has expanded or is preparing to expand police powers. The police authorities are now more often allowed to interfere with civil rights, even before a specific danger has been identified. They are also given new means to conduct secret surveillance online. EDRi member Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights) is taking legal action against all changes in police powers that violate civil rights. GFF has already lodged constitutional complaints against the police laws in the states of Bavaria and Baden-Württemberg.

In Germany, police powers are defined at the state level, not the federal level. At the moment, there is a clear trend to expand these powers across nearly all German federal states. The development has been pioneered by Bavaria, where in May 2018 the police were endowed with powers nearly comparable to those of secret services. The amendment in question introduced the concept of “impending danger”, meaning that the police may encroach on civil rights in various ways merely on the assumption that a dangerous situation could develop, an assumption that can virtually always be justified. The police can thus use far-reaching measures such as online searches and telecommunications surveillance as preventive instruments.

Trend towards expanded police powers

While Bavaria is the most blatant example, several other states have since introduced police laws that encroach on civil rights. Baden-Württemberg, Saxony-Anhalt, Rhineland-Palatinate, Hesse, Mecklenburg-Western Pomerania, North Rhine-Westphalia, and Brandenburg have already amended their police laws.

The amendments differ, but all of them introduce questionable measures that police authorities may now use. Many federal states have introduced online searches and telecommunications surveillance. This is an unprecedented encroachment on the fundamental right to confidentiality and integrity of information technology systems. At the same time, it means that police authorities may exploit security gaps and thereby destabilise IT security in general.

Other new police powers include the use of electronic ankle tags and bodycams, the extension of video surveillance in public places, the possibility of extended DNA analysis, the extension of maximum detention periods, and the technical upgrading of the police (including hand grenades, stun guns and drones).

Legal action against excessive expansion of police powers

GFF and its partners have already filed constitutional complaints against the new police laws in Bavaria and Baden-Württemberg and are currently investigating possible action against the changes in the police laws of the states of North Rhine-Westphalia and Hesse. GFF is also critically involved in the reform debates in the other state parliaments and plans to take legal action against the further expansion of police powers in Germany.

Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)
https://freiheitsrechte.org/english/

Germany: New police law proposals threaten civil rights (05.12.2018)
https://edri.org/germany-new-police-law-proposals-threaten-civil-rights/

Overview of police law changes in the German states prepared by Amnesty International and GFF (only in German)
https://freiheitsrechte.org/home/wp-content/uploads/2019/04/2019-03_Uebersicht_neue_Polizeigesetze_GFF_Amnesty.pdf

(Contribution by EDRi member Gesellschaft für Freiheitsrechte, Germany)

24 Apr 2019

Will Serbia adjust its data protection framework to GDPR?

By SHARE Foundation

After a process that took more than five years, the National Assembly of Serbia finally adopted a new Law on Personal Data Protection in November 2018. The law closely follows the EU’s General Data Protection Regulation (GDPR), almost to the point of literally translating some parts of the text into Serbian. This was expected, given Serbia’s EU membership candidacy. However, it seems it will be very difficult to implement the new legislation in practice – and thereby actually make a difference – as numerous flaws were overlooked when the law was drafted and enacted.

There is not a high level of privacy culture in Serbia, and therefore the majority of people are not aware of how the state and the private sector collect and handle their personal data. The recent controversy over new high-tech surveillance cameras in Serbia’s capital Belgrade, which were supplied by Huawei and have facial and vehicle license plate recognition capabilities, shows how little thought is given to how intrusive technologies might impact citizens’ privacy and everyday lives. The highest-ranking state officials for internal affairs, the Minister of Interior and the Director of Police, announced in the media that these cameras were yet to be installed in Belgrade, while a use case study on Huawei’s official website claimed that the cameras were already operational. Soon after EDRi member SHARE Foundation, a Serbian non-profit organisation dedicated to protecting and improving human rights in the digital environment, published an article with information found in Huawei’s “Safeguard Serbia” use case, the study miraculously disappeared from the company’s website. However, an archived version of the page is still available.

Considering that the adaptation period provided in the law is only nine months after its coming into force – compared to two years under the GDPR – the general feeling is that both the public and the private sector will have many difficulties in adjusting their practices to the provisions of the new law.

In recent years, we have witnessed many cases of personal data breaches and abuse, the largest one undoubtedly being the case of the now defunct Privatization Agency, when more than five million people – almost the entire adult population of Serbia – had their personal data, such as names and unique master citizen numbers, exposed on the internet. The agency was ultimately shut down by the government, and no one was held accountable, as the legal proceedings were not completed in time (see PDF of the Commissioner’s report, 2017, p. 59).

Although the Serbian law contains key elements of the GDPR, such as the principles relating to the processing of personal data and data subjects’ rights, its text is very difficult to understand and interpret, even for lawyers. One of the main reasons for this is that the law also contains provisions related to matters within the scope of EU Directive 2016/680, the so-called “Police Directive”, which deals with the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and with the free movement of such data. The law also fails to cover video surveillance, a particularly important aspect of personal data processing. The Commissioner for Information of Public Importance and Personal Data Protection (Serbia’s Data Protection Authority) and civil society organisations have pointed out these and other flaws on several occasions (see, among others, the former Commissioner’s comments), but the Ministry of Justice ignored them.

In addition to filing a complaint with the Commissioner, citizens are also allowed under the law to seek court protection of their rights, creating a “parallel system” of protection that can lead to legal uncertainty and uneven practice in the protection of citizens’ rights. Regarding data subjects’ rights, the final text of the law includes an article setting out limitations to these rights, but it omits the requirement that they may only be restricted by law. In practice, this means that state institutions or private companies processing citizens’ personal data may arbitrarily restrict their rights as data subjects.

To make matters even more complicated, the Serbian National Assembly still has not appointed a new Commissioner, the head of the key institution for personal data protection reform. The term of the previous Commissioner ended in December 2018, and the public is still in the dark as to who will be appointed and when. There are also fears, including among civil society and experts on the topic, that the new Commissioner might not be up to the task in terms of expertise and political independence.

New and improved data protection legislation, adapted to a world of mass data collection and processing via artificial intelligence technologies, is a key component of a successful digital transformation of society. In Serbia, however, it is usually regarded as merely a procedural step towards joining the EU. A personal data protection framework that meets the high standards set by the GDPR in practice is of great importance for the digital economy, particularly for Serbia’s growing IT sector. If all entities processing personal data can demonstrate that they are indeed GDPR-compliant in their everyday practices, and not just “on paper”, there will be more opportunities for investment in Serbia’s digital economy and for Serbian companies to compete in the European digital market.

It will take a lot of effort to improve the standards of data protection in Serbia, especially with a data protection law that will be difficult to implement in practice. It is therefore of the utmost importance that the National Assembly appoints a person with sufficient expertise and professional integrity as the new Commissioner, so that the process of preparing both the private and the public sector for the new rules can be expedited. As the new Law on Personal Data Protection starts to apply in August 2019, this should be regarded as just the beginning of a new relationship with citizens’ data, one that will require a lot of hard work. Otherwise, the law will remain just a piece of paper with no practical effect.

This article was originally published at https://policyreview.info/articles/news/will-serbia-adjust-its-data-protection-framework-gdpr-practice/1391

SHARE Foundation
https://www.sharefoundation.info/en/

Law on Personal Data Protection (only in Serbian, 13.11.2018)
http://www.pravno-informacioni-sistem.rs/SlGlasnikPortal/eli/rep/sgrs/skupstina/zakon/2018/87/13/

Outgoing Serbia’s Commissioner warns about shortcomings in draft law on data protection (23.10.2018)
http://rs.n1info.com/English/NEWS/a430066/Outgoing-Serbia-s-Commissiner-warns-about-shortcomings-in-draft-law-on-data-protection.html

Serbian Data Protection Commissioner: NGOs call for transparency (04.12.2018)
https://edri.org/ngos-transparency-dpc-serbia/

(Contribution by Bojan Perkov, EDRi member SHARE Foundation, Serbia)

24 Apr 2019

EDRi is looking for a new Head of Policy

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 digital human rights organisations. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, personal data protection, freedom of expression, and access to information.

EDRi is looking for an experienced, strategic and dedicated Head of Policy to join EDRi’s team in Brussels. This is a unique opportunity to be part of the growth of a well-respected network of NGOs making a tangible difference in the defence and promotion of online rights and freedoms in Europe and beyond. The deadline to apply has been extended until 16 June 2019. This is a full-time, permanent position.

The Head of Policy will provide strategic leadership to EDRi’s Policy Team and design policy and advocacy strategies in line with EDRi’s strategic objectives and in consultation with member organisations. S/he is expected to bring a strategic vision on human rights in the digital environment as well as solid experience in human rights advocacy and digital rights. The successful candidate will have a strong track record in policy development and strategic planning, in addition to an excellent understanding of working in an EU or national policy/advocacy environment.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive to have a diverse and inclusive working environment. We encourage individual members of groups at risk of racism or other forms of discrimination to apply for this post.

Job title: Head of Policy
Reports to: Executive Director
Location: EDRi Office, Brussels, Belgium
Line management: The Head of Policy leads the advocacy work of the Policy Team (4 persons), while the team is line-managed by the Executive Director. The Head of Policy will participate in the Policy staff members’ appraisal and objective-setting meetings. With the future growth of the organisation, and in consultation with employees, the position may come to include line management responsibilities.

RESPONSIBILITIES:

As Head of Policy, your main tasks will be to:

  • Advocate for the protection of digital rights in areas such as data protection, privacy, freedom of expression, platform regulation, surveillance and law enforcement, telecommunications, and digital trade;
  • Contribute to and evaluate progress towards EDRi policy strategic outcomes and develop activities in response to the external environment and in partnership with the team, members and the Board;
  • Provide the Policy Team with strategic advice and lead on advocacy strategies, including by coordinating, designing, and executing policy strategies and workplans in line with EDRi’s overall strategic objectives;
  • Draft and oversee the production of all policy documents, such as briefings, position papers, amendments, advocacy one-pagers, letters, blogposts, and EDRi-gram articles;
  • Support and work closely with EDRi colleagues including policy, communications, and campaigns – ensuring smooth working relations between the Policy Team and other teams – and report to the Executive Director;
  • Coordinate and collaborate with EDRi members on relevant legislative processes in the EU, including coordinating working groups, developing policy positions and campaign messages;
  • Collaborate with the EDRi team to communicate to the public about relevant legislative processes and EDRi’s activities;
  • Provide policy-makers with expert, timely, and accurate input and organise and participate in expert meetings;
  • Develop and strengthen relationships with civil society partners, EU institutions, government and institutional officials, academics and industry representatives working on related issues;
  • Represent – when relevant and in collaboration with the Executive Director and the Policy Team – the organisation as a spokesperson at public events, meetings and to the media.

QUALIFICATIONS AND EXPERIENCE:

  • Passion for digital rights and enthusiasm to work within a small team to make a big difference;
  • Minimum 6 years of relevant experience in a similar role;
  • A university degree in law, EU affairs, policy, human rights or related field or equivalent experience;
  • Demonstrable knowledge of, and interest in human rights, in particular privacy, net neutrality, digital trade, surveillance and law enforcement, freedom of expression, as well as other internet policy issues;
  • Knowledge and understanding of the EU, its institutions and its role in digital rights policies;
  • Experience in leading advocacy efforts and creating networks of influence;
  • Exceptional written and oral communications skills;
  • Technical IT skills and knowledge of free and open source operating systems and software are a plus;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Experience of organising events and/or workshops;
  • Ability to work in English. Other European languages an advantage.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in .pdf format to applications(at)edri.org by 16 June 2019.

Please note that only shortlisted candidates will be contacted.

17 Apr 2019

Press Release: EU Parliament deletes the worst threats to freedom of expression proposed in the Terrorist Content Regulation

By EDRi

Today, 17 April 2019, the European Parliament (EP) adopted its Report on the proposed Terrorist Content Regulation. Although it has been questioned whether this additional piece of law is necessary to combat the dissemination of terrorist content online, the European Union (EU) institutions are determined to make sure it sees the light of day. The Regulation defines what “terrorist content” is and what the take-down process should look like. Fortunately, Members of the European Parliament (MEPs) have decided to include some necessary safeguards to protect fundamental rights against overbroad and disproportionate censorship measures. The adopted text follows suggestions from other EP committees (IMCO and CULT), the EU’s Fundamental Rights Agency, and UN Special Rapporteurs.

“The European Parliament has fixed most of the highest risks that the original proposal posed for fundamental rights online,”

said Diego Naranjo, Senior Policy Advisor at EDRi.

“We will closely follow developments in the next stages, since any change to today’s Report could pose a threat to freedom of expression under the guise of unsubstantiated ‘counter-terrorism’ policies,”

he added.

European Digital Rights (EDRi) and Access Now welcome the improvements to the initial European Commission (EC) proposal on this file. Nevertheless, we doubt that the proposal’s objectives will be achieved, and point out that no meaningful evidence has yet been presented on the need for a new European counter-terrorism instrument. Across Europe, the inflation of counter-terrorism policies has had a disproportionate impact on journalists, artists, human rights defenders and innocent groups at risk of racism.

“The proposed legislation is another worrying example of a law that looks good politically in an election period, because its stated objective is to prevent horrendous terrorist content from spreading online. But the law runs a severe risk of undermining freedoms and fundamental rights online without any convincing proof that it will achieve its objectives,”

said Fanny Hidvegi, Europe Policy Manager at Access Now.

“During the rest of the process, the very least the EU co-legislator must do is to maintain the basic human rights safeguards provided by the European Parliament’s adopted text,”

she added.

The next step in the process is trilogue negotiations between the European Commission, the European Parliament and the Member States. These negotiations are expected to start in September or October 2019.

Read more:

Terrorist Content Regulation: Successful “damage control” by LIBE Committee (08.04.2019)
https://edri.org/terrorist-content-libe-vote/

CULT: Fundamental rights missing in the Terrorist Content Regulation
https://edri.org/cult-fundamental-rights-missing-in-the-terrorist-content-regulation/

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

Terrorist Content Regulation: Document pool
https://edri.org/terrorist-content-regulation-document-pool/

16 Apr 2019

EDRi is looking for an interim Executive Director (6 months maternity cover)

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 civil society organisations. We defend and promote human rights and freedoms in the digital environment, such as the right to privacy, freedom of expression, and access to information.

We are looking for an interim Executive Director to replace our current Executive Director during her maternity leave (6 months from mid-July 2019 to mid-January 2020).

The Executive Director provides overall leadership and management of the strategy, policy, resources, operations, and communications of EDRi. The Executive Director is responsible for the management of the organisation and all aspects of its operations. While the Interim Executive Director is not expected to be a specialist in specific operations (campaigns, fundraising, HR, administration, finance, etc.), s/he must have a sufficient grasp of all these domains to ensure that staff members can achieve their objectives and that they and the EDRi members can work well together to achieve the organisation’s mission.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive to have a diverse and inclusive working environment. We encourage individual members of groups at risk of racism or other forms of discrimination to apply for this post.

Job title: Interim Executive Director
Start and end dates: 15 July 2019 – 15 January 2020
Reports to: Board of Directors (human resources task force)
Line-manages: policy, advocacy, campaigning, communications, fundraising, and organisational support teams
Scope: 10 staff members, annual budget of approx. 830,000 euro

RESPONSIBILITIES

1. Leadership, organisation mission and strategy

  • steer the consultation phase of the strategic planning process
  • provide leadership and management for the organisation
  • implement the annual work plan and ensure rigorous evaluation
  • start preparations for the 2020 General Assembly
  • support the Board, and prepare quarterly financial and narrative reports
  • represent the organisation at events as necessary
  • support development of policy strategy and taking of tactical decisions

2. Financial sustainability and oversight

  • prepare the yearly budget, oversee expenditure
  • oversee and contribute to the raising of funds from foundations, corporations and individual donors
  • maintain good relations with donors and oversee reporting to them
  • oversee fiscal management, operating within the approved budget
  • ensure that sound book-keeping and accounting procedures are followed
  • ensure that the organisation complies with relevant legislation and grant contracts

3. Organisation operations

  • ensure the implementation of Board decisions
  • ensure that the Board is made aware of all matters requiring a Board decision
  • inform the Board of all developments of major significance to the organisation
  • oversee internal human resources policies and ensure staff retention
  • provide oversight of all staff and organise weekly meetings with staff
  • foster effective teamwork and establish a positive work environment
  • evaluate the individual objectives with staff members
  • undertake regular one to one meetings with all staff
  • sign contracts and other agreements on behalf of EDRi
  • give or refuse final approval for any unforeseen use of resources

QUALIFICATIONS

  • senior management experience preferably in a non-governmental organisation
  • solid, hands-on financial and budget management skills
  • strong organisational abilities, especially for planning, delegation and project management
  • ability to convey the vision of EDRi’s strategic future to staff, Board, network and donors
  • ability to build trusted relationships with, and to collaborate with and oversee all staff
  • knowledge of EU policy-making processes
  • knowledge and/or experience of the NGO sector
  • awareness and knowledge of the EU’s political environment
  • knowledge of the human rights and digital rights field and affinity with EDRi’s values and mission
  • knowledge and/or experience in the field of human resources management
  • knowledge and/or experience in fundraising specific to the non-profit sector
  • knowledge and/or experience in conflict resolution
  • public speaking skills
  • ability to interface and engage with EDRi’s main stakeholders

Attitude

Passionate, idealistic, enduring, team player, diplomatic, discreet, patient, mission-driven, self-directed, and committed to knowledge-sharing and high-integrity leadership.

Technical

  • fluency in written and spoken English
  • strong written and verbal communication skills
  • budgeting (oversight, presenting, monitoring)
  • knowledge of free and open source operating systems and software is a plus

HOW TO APPLY

To apply, please send a maximum one-page cover letter and a maximum two-page CV (only PDFs are accepted) by email to applications[at]edri.org. The closing date for applications is 30 April 2019. Interviews with selected candidates will take place around mid-May, with a start date of (ideally) 15 July.

15 Apr 2019

EU Member States give green light for copyright censorship

By Diego Naranjo

Today, on 15 April 2019, European Union Member States gave their final approval to the text of the copyright Directive as it was adopted by the European Parliament on 26 March. This vote in the Council of the EU was the last procedural requirement in the EU law-making process. Now the Directive, once translated and published in the Official Journal of the EU, will become law.

19 Member States voted in favour of the Directive, effectively ignoring the hundreds of thousands of people who took to the streets across Europe to protest against upload filters, as well as a petition signed by five million people. Six Member States (Finland, Italy, Luxembourg, the Netherlands, Poland, and Sweden) voted against the Directive text, while three (Belgium, Estonia, and Slovenia) abstained, showing how controversial the text is. You can find the full results of the vote on the Save Your Internet campaign website.

Member States will now have two years to implement the Directive in their legislation. The only way to prevent, in practice, upload filters for copyright purposes in the EU is to influence the national level implementation. To do this, we encourage you to support civil rights groups working to defend digital rights in your country!

Read more:

Filters Incorporated (19.04.2019)
https://edri.org/filters-inc/

Censorship machine takes over internet (26.03.2019)
https://edri.org/censorship-machine-takes-over-eu-internet/

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

12 Apr 2019

Cross-border access to data for law enforcement: Document pool

By EDRi

In April 2018, the European Commission proposed a Regulation on cross-border access to and preservation of electronic data held by service providers, and a Directive requiring service providers to appoint a legal representative within the EU. Since then, the legislative process to adopt them has been fast-tracked, which has prevented any proper assessment of these measures from being carried out.

On 7 December 2018, the Council of the European Union reached a general approach on the text, that is to say, a political agreement on the negotiating position with which it will enter negotiations with the European Parliament. The Parliament’s Civil Liberties, Justice and Home Affairs (LIBE) Committee decided to first carry out a thorough assessment of the Commission’s proposal before adopting its position. An introductory document was published first to identify a number of questions for discussion, and it will be followed up by topical working documents.

Meanwhile, the United States rushed to adopt its own piece of legislation, the Clarifying Lawful Overseas Use of Data (CLOUD) Act. The act allows US authorities to access data stored outside of US territory while bypassing the legal safeguards of traditional international cooperation frameworks.

In parallel, the Council of Europe (CoE) has also been preparing a new protocol to the Convention on Cybercrime (also known as “the Budapest Convention”) on cross-border access to data by law enforcement authorities. This Second Protocol is expected to be finalised by the end of 2019.

EDRi has been making submissions to all the institutions involved, asking for human rights to be respected. In this document pool, you will find the relevant information, documents and analyses on the “e-evidence” proposals. We will keep updating this document pool as the process advances.

Last update: 25 April 2019.

EDRi’s analysis and recommendations
Legislative documents
EDRi’s blogposts and press releases
Other
Process

EDRi’s analysis and recommendations:

Legislative documents:

More information can be found in EUR-Lex (the EU database of preparatory acts) and in OEIL (the European Parliament’s Legislative Observatory)

EDRi’s blogposts and press releases:

RightsCon session on cross-border access to e-evidence – key interventions (10.04.2017)
Access to e-evidence: Inevitable sacrifice of our right to privacy? (14.06.2017)
Cross-border access to data: EDRi delivers international NGO position to Council of Europe (18.09.2017)
Cross-border access to data has to respect human rights principles (20.09.2017)
CLOUD Act: Civil society urges US Congress to consider global implications (19.03.2018)
Nearly 100 public interest organisations urge Council of Europe to ensure high transparency standards for cybercrime negotiations (03.04.2018)
EU “e-evidence” proposals turn service providers into judicial authorities (17.04.2018)
Independent study reveals the pitfalls of “e-evidence” proposals (10.10.2018)
Growing concerns on “e-evidence”: Council publishes its draft general approach (05.12.2018)
EU Council’s general approach on “e-evidence”: From bad to worse (19.12.2018)

Other:

Joint Civil society letter to the Members of the US Congress on the US CLOUD Act (19.03.2018)
Joint Civil society letter to the Secretary General of the Council of Europe on the draft Second Additional Protocol to the Convention on Cybercrime (03.04.2018)
Joint Civil Society Response to Discussion Guide on a 2nd Additional Protocol to the Budapest Convention on Cybercrime (28.06.2018)
European Parliament Research Service’s assessment of the Commission’s proposals on electronic evidence (09.2018)
Joint Civil society letter to Member States about their draft position on “e-evidence” (05.12.2018)

Legislative process:

10 Apr 2019

UK: Online Harms Strategy must “design in” fundamental rights

By Open Rights Group

After months of waiting and speculation, the United Kingdom government Department for Digital, Culture, Media and Sport (DCMS) has finally published its White Paper on Online Harms – now appearing as a joint publication with the Home Office. The expected duty of care proposal is present, but substantive detail on what this actually means remains sparse: it would perhaps be more accurate to describe this paper as pasty green.

Increasingly over the past year, DCMS has become fixated on the idea of imposing a duty of care on social media platforms, seeing this as a flexible and de-politicised way to emphasise the dangers of exposing children and young people to certain online content and to make Facebook in particular liable for the uglier and darker side of its user-generated material.

DCMS talks a lot about the “harm” that social media causes, but its proposals fail to explain how harmful impacts on free expression would be avoided.

On the positive side, the paper lists free expression online as a core value to be protected and addressed by the regulator. However, despite the apparent prominence of this value, the mechanisms to deliver this protection and the issues at play are not explored in any detail at all.

In many cases, online platforms already act as though they have a duty of care towards their users. Though the efficacy of such measures in practice is open to debate, terms and conditions, active moderation of posts, and algorithmic choices about what content is pushed or downgraded are all geared towards rooting out illegal activity and creating open and welcoming shared spaces. In the White Paper, DCMS has not elaborated on what its proposed duty would entail. If it is drawn narrowly, so that it only bites when there is clear evidence of real, tangible harm and a reason to intervene, nothing much will change. However, if it is drawn widely, sweeping up too much content, it will start to act as a justification for widespread internet censorship.

If platforms are required to prevent potentially harmful content from being posted, this incentivises widespread prior restraint. Platforms can’t always know in advance the real-world harm that online content might cause, nor can they accurately predict what people will say or do when on their platform. The only way to avoid liability is to impose wide-sweeping upload filters. Scaled implementation of this relies on automated decision-making and algorithms, which risks even greater speech restrictions given that machines are incapable of making nuanced distinctions or recognising parody or sarcasm.
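
To make the concern about automation concrete, here is a toy sketch – not taken from the White Paper or from any real platform, and written purely for illustration – of an upload filter based on exact fingerprint matching. The blocklist entry and the example texts are invented; the point is that the filter compares bytes, not meaning, so a verbatim quote in a news report or a parody is treated exactly like the original harmful post.

```python
# Toy illustration only: a naive upload filter that blocks any content whose
# fingerprint (here, a simple SHA-256 hash) appears on a blocklist.
# Real filters use fuzzier matching, but the core limitation is the same:
# the code sees bytes, not context, intent, parody or news value.

import hashlib

def fingerprint(text: str) -> str:
    """Return a stable fingerprint for a piece of content."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Hypothetical blocklist of previously flagged content.
BLOCKLIST = {fingerprint("some banned slogan")}

def allow_upload(text: str) -> bool:
    """Allow the upload only if its fingerprint is not on the blocklist."""
    return fingerprint(text) not in BLOCKLIST

# A journalist quoting the slogan verbatim is blocked just like the
# original post, because the filter cannot tell the two apart.
print(allow_upload("some banned slogan"))            # False: blocked
print(allow_upload("a commentary about the slogan")) # True: allowed
```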

DCMS’s policy is underpinned by societally-positive intentions, but in its drive to make the internet “safe”, the government seems not to recognise that ultimately its proposals don’t regulate social media companies, they regulate social media users. The duty of care is ostensibly aimed at shielding children from danger and harm but it will in practice bite on adults too, wrapping society in cotton wool and curtailing a whole host of legal expression.

Although the scheme will have a statutory footing, its detail will depend on codes of practice drafted by the regulator. This makes it difficult to assess how the duty of care framework will ultimately play out.

The duty of care seems to be broadly about whether systemic interventions reduce overall “risk”. But must the risk always be to an identifiable individual, or can it be broader – to identifiable vulnerable groups? To society as a whole? What evidence of harm will be required before platforms should intervene? These are all questions that presently remain unanswered.

DCMS’s approach appears to be that it will be up to the regulator to answer these questions. But whilst a sensible regulator could take a minimalist view of the extent to which commercial decisions made by platforms should be interfered with, allowing the government to distance itself from taking full responsibility for the fine detail of this proposed scheme is a dangerous principle. It takes conversations about how to police the internet out of public view and democratic forums. It enables the government to opt not to create a transparent, judicially reviewable legislative framework. And it permits DCMS to light the touch-paper on a deeply problematic policy idea without having to wrestle with the practical reality of how that scheme will affect UK citizens’ free speech, both in the immediate future and for years to come.

How the government decides to legislate and regulate in this instance will set a global norm.

The UK government is clearly keen to lead international efforts to regulate online content. It knows that if the outcome of the duty of care is to change the way social media platforms work, that change will apply worldwide. But to be a global leader, DCMS needs to stop basing policy on isolated issues and anecdotes, and engage with a broader conversation about how we as a society want the internet to look. Otherwise, governments both repressive and democratic are likely to use the policy and regulatory model that emerges from this process as a blueprint for more widespread internet censorship.

The UK House of Lords report on the future of the internet, published in early March 2019, set out ten principles it considered should underpin digital policy-making, including the importance of protecting free expression. The consultation that this White Paper introduces offers a positive opportunity to collectively reflect, across industry, civil society, academia and government, on how the negative aspects of social media can be addressed and risks mitigated. If the government were to use this process to emphasise its support for the fundamental right to freedom of expression – and in a way that goes beyond mere expression of principle – this would also reverberate around the world, particularly at a time when press and journalistic freedom is under attack.

The White Paper expresses a clear desire for tech companies to “design in safety”. As the process of consultation now begins, EDRi member Open Rights Group (ORG) calls on DCMS to “design in fundamental rights”. Freedom of expression is itself a framework, and must not be lightly glossed over. ORG welcomes the opportunity to engage with DCMS further on this topic: before policy ideas become entrenched, the government should consider deeply whether these will truly achieve outcomes that are good for everyone.

Open Rights Group
https://www.openrightsgroup.org

The DCMS Online Harms Strategy must “design in” fundamental rights (08.04.2019)
https://www.openrightsgroup.org/blog/2019/the-dcms-online-harms-strategy-must-design-in-fundamental-rights

Online Harms White Paper – Executive summary (08.04.2019)
https://www.gov.uk/government/consultations/online-harms-white-paper/online-harms-white-paper-executive-summary–2

Online Harms – White Paper
https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/793360/Online_Harms_White_Paper.pdf

Open consultation: Online Harms White Paper
https://www.gov.uk/government/consultations/online-harms-white-paper

House of Lords: Regulating in a digital world (09.03.2019)
https://publications.parliament.uk/pa/ld201719/ldselect/ldcomuni/299/299.pdf

(Contribution by Jim Killock and Amy Shepherd, EDRi member Open Rights Group, the United Kingdom)

10 Apr 2019

Public campaigns on digital rights: Mapping the needs

By Claire Fernandez

In February 2019, the Digital Freedom Fund (DFF) strategy meeting took place in Berlin. The meeting was the perfect occasion for experts, activists, and litigators from the broad digital and human rights movement to explore ways of working together and of levelling up the field.

The group held discussions on several methods and avenues for social change in our field, such as advocacy and litigation. Public campaigning came up as an interesting option – many organisations want to achieve massive mobilisation, while few have managed to develop the tools and means needed for fulfilling this goal. One of the breakout group discussions therefore focused on mapping the needs for pan-European campaigns on digital rights.

First, we need to define our own way of doing campaigns, which might differ from that of other movements. A values-based campaigning method should look into questions such as: Who funds us? Do we take money from big tech companies, and if so, under what conditions and up to what amount? Who are we partnering with: a large, friendly civil society and industry coalition, or a restricted core group of digital rights experts? Are we paying for advertising campaigns on social media, or do we rely on privacy-friendly mobilising techniques? It was agreed that being clear on how we campaign and what our joint message is are crucial elements for the success of a campaign. A risk-management system should also be put in place to anticipate criticism and attacks.

Second, proper field mapping is important. Pre- and post-campaign public opinion polls and focus groups are useful. Too often, we tend to go ahead with our own plans without consulting affected groups, such as those affected by online hate speech or child abuse.

Third, unsurprisingly, the need for staff and resources was ranked as a priority. These include professional campaigners, support staff, graphic designers, project managers and coordinators, communication consultants and a central hub for a pan-European campaign.

Finally, we need to build and share campaigning tools that include visuals, software, websites, videos, celebrities and media contacts. Participants also mentioned the need for a safe communication infrastructure to exchange tools and coordinate actions.

At EDRi, all of the above resonates as we embark on the journey of building our campaigning capacity to lead multiple pan-European campaigns. For instance, one of the current campaigns we have been involved in – the SaveYourInternet.eu campaign on the European Union Copyright Directive – has revealed the importance of fulfilling these needs. Throughout this particular campaign, human rights activists have faced unprecedented accusations of being paid by Google and similar actors, and of being against the principle of fair remuneration for artists. Despite disinformation waves, distraction tactics and our limited resources, the wide mobilisation of the public against problematic parts of the Directive, such as upload filters, has been truly impressive. We witnessed over five million petition signatures, over 170 000 protesters across Europe, dozens of activists meeting Members of the European Parliament, and impressive engagement rates on social media. The European Parliament’s vote in favour of the whole Copyright Directive, including its controversial articles, was won only by a very narrow margin, which shows the impact of the campaign.

The EDRi network and the broader movement need to learn lessons from the Copyright campaign and properly build our campaign capacity. EDRi started this process during its General Assembly on 7-8 April in London. The DFF strategy workshop held in Berlin gave us a lot of food for thought for this process.

This article was first published by Digital Freedom Fund (DFF): https://digitalfreedomfund.org/public-campaigns-on-digital-rights-mapping-the-needs/

(Contribution by Claire Fernandez, EDRi)

09 Apr 2019

Filters Incorporated

By Diego Naranjo

On 26 March 2019, the European Parliament (EP) adopted the new copyright Directive. The music industry and collecting societies celebrated it as a victory for authors and creators, despite actual authors (along with civil society groups) being worried about the outcome.

Article 17 of the Directive (referred to as Article 13 in previous drafts) changes platforms’ liability in a way that will lead to the implementation of upload filters on a vast number of internet platforms. In effect, Article 17 represents a threat to our fundamental right to freedom of expression.

We tried hard to stop the legalisation of the first EU internet filter. Below is a summary of what happened.

It all started in 2002

EDRi has been involved in copyright discussions since the beginning of our network’s existence. We have promoted a positive agenda aimed at fixing the main problems within the existing framework, and supported a copyright reform that included the demand that authors and artists receive fair remuneration for their work. We published handbooks and series of blogposts, responded to public consultations, spoke at numerous public events, and met with all key policy-makers in Brussels and at the national level. We participated in various joint actions and were involved in the inception and development of SaveYourInternet.eu along with Copyright for Creativity (C4C).

Civic engagement vs industry lobby

During the debates, the participation of individuals and civil society groups was crucial to counterbalance the massive lobbying efforts of industry. In July 2018, thanks to the pressure of thousands of people calling their Members of the European Parliament (MEPs), the European Parliament rejected the mandate to proceed with a flawed proposal. This gave us hope that citizens’ voices can be heard, if we shout loud enough.

During the Copyright Action Week in March 2019, ahead of the final vote on the Directive in the European Parliament, a team of 17 people from all across Europe made it all the way to Brussels and Strasbourg. They put their studies or jobs on hold for a few days to meet their elected representatives and make a final push to delete upload filters from the copyright Directive. We were impressed with their dedication and their thorough knowledge of the consequences Article 13 could have for the internet. In addition, hundreds of thousands of people took to the streets across Europe to protest against upload filters.

The latest actions taken by all of those opposing internet filters were not in vain. In the vote adopting the Directive on 26 March, 55 MEPs who had supported Article 17 (former Article 13) in September 2018 changed their position and were willing to delete it from the final text of the Directive. The deletion could have happened through an amendment proposed by several MEPs. In order for this amendment to be adopted and Article 17 deleted, a vote on whether the text should first be opened to amendments took place during the March 2019 plenary.

The vote: Blue pill, or red pill?

On 26 March, the possibility to have a discussion on the amendments to remove Articles 11 and 13 (15 and 17 in the final text) was voted down by a margin of five votes. Thirteen MEPs claimed that they had wished to open the debate to remove both Articles, but were confused by the last-minute change in the order of the votes and the obvious lack of clarity with which this procedural vote was introduced, and therefore failed to vote “yes”. Their votes have been corrected in the records, but this does not affect the actual result of the vote. After this “mistake”, which made it impossible for MEPs to vote on deleting Article 13/17, the text of the Directive (including Article 13/17) was adopted with 338 votes in favour, 283 against and 36 abstentions, with 93 MEPs not attending the session.

Despite some policy-makers repeatedly stating that the Directive would not lead to upload filters, it turned out it was all about filters. The day after the Directive was adopted, France hurried to declare that it will ensure that “content recognition technologies” are a key aspect of the upcoming laws implementing the Directive.

With the adoption of Article 17 as part of the copyright Directive, the European Union is setting a terrible precedent for the rest of the world by encouraging the implementation of upload filters. Initially introduced under the pretext of fighting copyright infringement, filters are now also being discussed in the framework of online “terrorist content”.

Next steps: EU Council and implementation

The final vote in the Council of the European Union, where EU Member States are represented, is scheduled for 15 April. This is traditionally a merely procedural vote – after all, the Council already agreed, before the European Parliament’s final vote, to the text on which it will be voting. However, this is technically the last chance to get rid of the upload filters. If the Member States currently opposing the “censorship machine” (Finland, Luxembourg, Poland, the Netherlands, Italy and perhaps Sweden) remain on the side of their citizens, the only remaining hope is that a country representing around 9.5% of the population of the whole EU rejects the text. Of the countries large enough to do so (Germany, France, Spain), the only realistic candidate is Germany. Will the German government respect the coalition agreement that prohibits it from implementing upload filters? Will other EU countries stand up for their citizens, taking into consideration the upcoming European Parliament (and some national) elections? We will find out soon.
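
For readers wondering where the “around 9.5%” figure comes from: under the Council’s qualified-majority rule (Article 16(4) TEU), a blocking minority needs at least four Member States representing more than 35% of the EU population. The sketch below works through that arithmetic; the population shares are rough 2019 approximations added for illustration only and are not taken from this article.

```python
# Back-of-the-envelope check of the Council blocking-minority arithmetic
# (Art. 16(4) TEU): at least four Member States representing more than
# 35 % of the EU population are needed to block a qualified-majority vote.
# Population shares are rough 2019 approximations (illustrative assumption).

opposing_shares = {        # approximate % of the total EU-28 population
    "Italy": 11.8,
    "Poland": 7.4,
    "Netherlands": 3.4,
    "Sweden": 2.0,
    "Finland": 1.1,
    "Luxembourg": 0.1,
}

BLOCKING_THRESHOLD = 35.0  # % of the EU population needed to block

current_share = sum(opposing_shares.values())      # roughly 25-26 %
extra_needed = BLOCKING_THRESHOLD - current_share  # roughly 9-10 %

print(f"Opposing states together: ~{current_share:.1f} % of the EU population")
print(f"Additional share needed to block: ~{extra_needed:.1f} %")
# A single additional large Member State (Germany, France or Spain) would be
# enough to push the total past the 35 % threshold.
```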

If the copyright Directive becomes law, civil rights groups are set to fight against upload filters in the national implementation phase. Planned actions include potential referrals to the Court of Justice of the European Union (CJEU).

Read more:

Censorship machine takes over EU’s internet (26.03.2019)
https://edri.org/censorship-machine-takes-over-eu-internet/

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

(Contribution by Diego Naranjo, EDRi)
