09 Mar 2017

Copyright Directive: Lead MEP partly deletes the “censorship machine”

By Diego Naranjo

Note: We updated this article on 20 March 2017, removing references to the leak once they were no longer necessary and updating the number of amendments below. The rest of the analysis remains relevant and has not been modified.

On 8 March, we gained an insight into a leaked Draft Report from the Legal Affairs Committee (JURI) of the European Parliament (EP) on the proposal for a Copyright Directive. On 17 March, the official text was published, with some changes compared to the leaked version.

Member of the European Parliament (MEP) Therese Comodini Cachia, the Rapporteur for JURI on this file, has proposed a number of changes to the original broken and extreme proposal of the European Commission (EC). Ms Comodini has taken a reasonable approach and has amended the worst sections of the proposal: the “censorship machine” (aka upload filter) proposal in Article 13, and the suggestion to expand to the entire EU the “ancillary copyright” (aka “link tax”) that has already failed in Germany and Spain.

What does Ms Comodini make of the “censorship machine”?

As we described in a previous blogpost, the EC proposed a mandatory “censorship machine” to filter all uploads from every user in the EU. As this video describes, the proposal of the Commission would require private companies to police the internet, in direct contradiction to two separate European Court rulings. The proposal would eliminate our freedoms to remix and to parody, among others, in explicit breach of the EU’s obligations contained in the Charter of Fundamental Rights of the EU. In her proposed amendments, Ms Comodini has deleted key aspects of this section of the draft Directive and amended the proposal in a way that would minimise the worst aspects of the censorship machine. Moreover, she has correctly restated the liability rules that exist in current EU legislation (the e-Commerce Directive). She advocates for the licensing agreements that were the ostensible goal of the European Commission in the first place.
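To illustrate the core limitation of the kind of automated filtering the proposal would require, here is a minimal, purely hypothetical sketch of a fingerprint-matching upload filter. All names and data are invented, and real systems (audio or video fingerprinting, for instance) are far more sophisticated; the point is simply that such a filter can only detect that protected material is present, not whether a particular use is covered by an exception such as parody or quotation.

```python
# Hypothetical sketch of a fingerprint-based upload filter (invented data).
# It can only answer "does this upload contain known protected content?",
# never "is this particular use covered by an exception such as parody or quotation?".

import hashlib

# Fingerprints of works that rightsholders have asked to be blocked (invented).
BLOCKED_FINGERPRINTS = {
    hashlib.sha256(b"protected film clip").hexdigest(),
    hashlib.sha256(b"protected song excerpt").hexdigest(),
}

def fingerprint(chunk: bytes) -> str:
    """Stand-in for a real audio/video fingerprinting algorithm."""
    return hashlib.sha256(chunk).hexdigest()

def filter_upload(upload_chunks: list) -> bool:
    """Return True if the upload is blocked.

    Note what is missing: the filter has no way of knowing whether the
    matching material is used in a parody, a quotation, a review or any
    other lawful context. It blocks on the match alone.
    """
    return any(fingerprint(chunk) in BLOCKED_FINGERPRINTS for chunk in upload_chunks)

# A parody video quoting a short excerpt of the protected clip is blocked
# exactly like a verbatim infringing copy.
parody_upload = [b"original commentary", b"protected film clip"]
print(filter_upload(parody_upload))  # True -> blocked, despite being potentially lawful
```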

Ms Comodini has also deleted the useless reference to the redress mechanism in the original proposal. She proposes a mechanism that allows users to communicate “rapidly and in an effective manner with the rightsholders” regarding the “measures” proposed in Article 13.1 (AM 59) that are meant to ensure the functioning of agreements concluded between rightsholders and internet service providers (ISPs). This redresses the imbalance in the Commission’s proposal, which suggests, first, that rightsholders can demand that internet companies delete certain content and, second, that those internet companies should set up a complaint mechanism for customers who have been unfairly treated as a result of the rightsholders’ decisions. Another positive change is AM 57, where Ms Comodini says that information about the “accuracy” of the measures employed is needed to “ensure the continued use of (…) exceptions and limitations, which are based on public interest concerns”.

We remain convinced, however, that Article 13 is fundamentally broken and that its deletion is the preferred option. The alternative is for policy makers from different factions in the European Parliament to negotiate between various efforts to fix (or further break) the Commission’s text in order to reach a compromise. The 28 EU Member States would have to do the same thing in parallel, and then the Parliament and Member States could negotiate an agreement based on the outcomes of their respective negotiations. The Commission’s broken proposal simply does not merit this amount of work, nor the risk of an imperfect outcome that damages fundamental rights, innovation, and competition in Europe.

Is the mad idea of the “link tax” still on?

The proposal to expand the rights of publishers via the creation of an ancillary copyright has received so much criticism that the Rapporteur of the EP Committee on Internal Market and Consumer Protection (IMCO), Catherine Stihler MEP, had already asked for its deletion in her draft Opinion.

Instead of the full deletion of the article in question, Ms Comodini has tried to solve the enforcement problem raised by press publishers. She proposes (in AM 52) the establishment of a presumption of representation for press publishers, related to “authors of literary works contained therein and the legal capacity to sue in their own name when defending the rights of such authors”. When explaining this amendment, she rightly points out that “(w)ithin this context it is also important to consider that plurality of news and opinions and wide access to these news and opinions is important for public debate in a democratic society. Similarly, non-commercial sharing of such news or opinions is also important in modern democratic societies.”

The approach of the two key MEPs, from the two largest political groups, puts the more extreme measures proposed by the Commission in doubt. If the EP remains on the right track, we can start preparing our farewell to the idea of a censorship machine in the Copyright Directive.

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

The copyright reform: A guide for the perplexed (02.11.2016)
https://edri.org/copyright-reform-guide-for-the-perplexed/

A positive step forward against the “censorship machine” in the Copyright Directive (28.02.2017)
https://edri.org/positive-step-forward-censorship-machine-copyright-directive/

Video: Stop #CensorshipMachine: EU #copyright threatens our freedoms (07.03.2017)
https://www.youtube.com/watch?v=qAcTeYtUzQY

Campaign: Save The Meme! By EDRi member Bits of Freedom
https://savethememe.net/

08 Mar 2017

Danish Defence Intelligence Service will get access to PNR data

By Guest author

Denmark does not take part in the EU Passenger Name Record (PNR) Directive, since Denmark has an opt-out from the Justice and Home Affairs (JHA) area of the European Union. Instead, Denmark has a national PNR system, which has been developed gradually on the legislative side since 2006. The practical implementation by the Danish authorities has not kept pace with the political willingness to legislate on PNR, however. Even the basic system for automatic pre-travel checks at external borders with Advance Passenger Information (API), a subset of PNR, has not yet been implemented. This is the case even though this part is based on the API Directive, which was transposed into Danish law in 2006. It was noted as an urgent issue in a 2016 EU evaluation of Danish border control under the Schengen acquis (of which Denmark is part).


Despite a ten-year history of stumbling blocks with airline reservation systems and PNR data formats that turned out to be more complicated than expected, the Danish Ministry of Justice is now convinced that all technical problems will soon be solved. The current Danish PNR framework is based on a 2015 law under which the Danish Customs and Tax Administration (SKAT) receives PNR data from airlines. SKAT uses the PNR data for its own purpose of customs control and functions as a data warehouse for other Danish authorities that have a legal basis for collecting PNR data via SKAT. The automated PNR data exchange between SKAT and airlines with flights to Denmark should be in place by 1 July 2017; at least, this is what SKAT is demanding from airlines. Whether this July deadline for automated data exchange, completely independent of the future EU PNR system, is feasible remains to be seen.

One major disadvantage of this technical setup is that the concept of purpose limitation gets blurred. Airlines and their passengers may believe that PNR data is provided to SKAT for customs control, whereas in reality the PNR data is also used for different purposes by other authorities, and new purposes can be added at any time, even retroactively. Currently, the Danish Security and Intelligence Service (PET) can collect PNR data for prevention and prosecution of terrorism offences via SKAT, and two extensions of the PNR access scheme are already in the pipeline. The problems with receiving API data for border control will be handled by giving the Danish National Police access to the SKAT PNR database. A draft law for this, which also includes access to API data for non-systematic intra-Schengen border checks, is currently in consultation.

Much more concerning from a privacy and data protection point of view is the newly proposed law, which will give the Danish Defence Intelligence Service (DDIS) blanket access to the PNR database held by SKAT. Information about Danish citizens is excluded from the DDIS access, but it is very unclear how this distinction will be made for flights within Schengen, where the PNR data does not include passport numbers. The explanatory notes to the proposed law mention using passenger contact information such as phone numbers and email addresses, which are somewhat unreliable indicators of nationality. For non-Danish citizens, DDIS can use the PNR data for any intelligence purpose directed against conditions abroad. Besides preventing threats against national security (such as terrorism), the explanatory notes specifically mention monitoring the travel patterns of persons who may act on behalf of foreign states, and even using PNR data to facilitate the recruitment of foreign agents.
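To illustrate why contact details are a weak proxy for citizenship, here is a small, entirely hypothetical sketch of the kind of heuristic the explanatory notes appear to imply. The rules and data are invented for illustration only; a Danish resident using a foreign SIM card, or a foreign citizen with a Danish number or a .dk email address, would be misclassified either way.

```python
# Hypothetical sketch: guessing "Danish / not Danish" from PNR contact fields.
# Contact details say nothing definitive about citizenship, which is exactly
# why this kind of inference is unreliable.

def looks_danish(phone, email):
    """Crude heuristic: Danish country code or a .dk email domain."""
    if phone and phone.replace(" ", "").startswith("+45"):
        return True
    if email and email.lower().endswith(".dk"):
        return True
    return False

passengers = [
    {"name": "A", "phone": "+45 12345678", "email": None},              # Danish number, could be anyone
    {"name": "B", "phone": "+49 15112345678", "email": "b@gmail.com"},  # Danish citizen living abroad? Unknown.
]

for p in passengers:
    guess = "exclude (assumed Danish)" if looks_danish(p["phone"], p["email"]) else "include"
    print(p["name"], "->", guess)
```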

The activities of DDIS generally fall under the national security exemption in the EU Treaties, but the Ministry of Defence states that the PNR access by DDIS is subject to EU law. This is rather unusual for the processing of personal data by a defence intelligence agency. In the present case of access to PNR data, it raises several data protection issues based on case law from the Court of Justice of the European Union (CJEU), in particular the Schrems judgment and the upcoming ruling on the EU-Canada PNR agreement, in which the Advocate General’s opinion was published on 8 September 2016.

Under Article 52(1) of the Charter of Fundamental Rights of the European Union, any limitation of fundamental rights must respect the essence of these rights, besides meeting the requirements of necessity and proportionality. According to paragraph 95 of the Schrems judgment, the essence of the fundamental right to effective judicial protection, as enshrined in Article 47 of the Charter, includes legal remedies allowing citizens to access personal data about them and to obtain the rectification or erasure of such data. For Danish citizens and residents of Denmark (Danish persons), some legal remedies exist through the Danish Intelligence Oversight Board (TET). Direct access to personal data held by DDIS is not available, but upon request, TET will check whether personal data about a Danish person is being processed unlawfully (called “indirect access” under the Danish law governing DDIS operations). This option is not available for non-Danish persons, so it is highly questionable whether the essence of the fundamental right to effective judicial protection is respected.

The fact that DDIS only gets access to PNR data on non-Danish citizens could also be seen as illegal discrimination under EU law: the CJEU has ruled that the right to non-discrimination between EU nationals precludes, for the purpose of fighting crime, a system for processing personal data that is specific to Union citizens who are not nationals of the Member State concerned. Finally, when viewed against the points raised by the Advocate General in the EU-Canada PNR case, there are multiple potential deficiencies in the proposed law giving DDIS access to PNR data. For example, a purpose that includes the recruitment of foreign agents is hardly limited to what is strictly necessary. Moreover, for non-Danish persons, there are no limitations on profiling, no independent data protection oversight, and the PNR data can be retained indefinitely, without any restrictions on its further transfer to intelligence services in third countries, for example the National Security Agency (NSA) of the United States.

All of these issues were raised in the consultation response to the draft law by EDRi member IT-Pol Denmark. The Ministry of Defence maintains that the proposed law complies with EU law and the Charter of Fundamental Rights. In response to the specific criticism raised by IT-Pol, the Ministry of Defence merely notes that the necessary legal remedies exist for non-Danish persons, since the Danish constitution allows everyone to sue DDIS in the ordinary courts and everyone can file a complaint with the Danish Parliamentary Ombudsman. However, it seems very unlikely that the courts or the Ombudsman would be able to examine personal data held by DDIS; the TET staff, by contrast, have special security clearances for this task.

The new PNR law is likely to be swiftly adopted by the Danish Parliament. At the initial public debate in the Parliament on 2 March 2017, there was a sizeable majority in favour of giving DDIS access to PNR data and very limited recognition of the privacy and data protection problems that this law would create for non-Danish citizens if they travel to Denmark by air or transit through Danish airports en route to other destinations.


Council Implementing Decision setting out a recommendation on addressing the deficiencies identified in the 2016 evaluation of Denmark on the application of the Schengen acquis in the field of the management of the external border (28.10.2016)
http://www.ft.dk/samling/20161/almdel/euu/bilag/66/1681189.pdf

EDRi: New Danish PNR system will rival the EU PNR Directive (22.04.2015)
https://edri.org/new-danish-pnr-system-will-rival-the-eu-pnr-directive/

Consultation response by IT-Pol Denmark on draft law to give DDIS access to Danish PNR data (only in Danish, 30.01.2017)
https://itpol.dk/hoeringssvar/fe-adgang-til-pnr-og-regler-om-sletning

Proposed law L 146 on access to PNR data by the Danish Defence Intelligence Agency (only in Danish, 24.02.2017)
http://www.ft.dk/samling/20161/lovforslag/L146/som_fremsat.htm

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

08 Mar 2017

The tale of the fight for transparency in the EU Internet Forum

By Guest author

Chapter One: The dark knights at a secret meeting

It was the beginning of the year 2014 when the European Commission first announced the creation of an “EU Internet Forum”. But it would take almost two years and several meetings before its official launch on 3 December 2015.

The Forum’s mission: to “counter terrorist content and hate speech online.”

Its actors: something of a “secret society” consisting almost exclusively of US internet companies (Microsoft, Facebook, Twitter, Google and Ask.fm), government officials and law enforcement agencies.

Its modus operandi: meetings behind closed doors to discuss undefined “terrorist material” and badly defined “hate speech”, creating pressure on industry to monitor and censor online communications without any accountability for the outcome.

Behind the curtain, the Commission’s initiative encourages internet companies to take “voluntary” action. It is supposed to trigger a response to a very diverse range of online activity, some of it possibly illegal, some of it legal but unwanted. In practice, this appears to mean the monitoring of European citizens and the censoring of online content by private companies without appropriate oversight.

Chapter two: The battle for transparency and freedom of expression

No information about the Internet Forum was made public from the start of the initiative. So, in order to make sure that the European Commission keeps this project in line with its fundamental rights obligations, EDRi made several access-to-documents requests via the AskTheEU.org portal. At the end of April 2015, we submitted a request for information about the participating businesses, the objectives and tasks of the Forum, the minutes of meetings, a list of upcoming meetings, and the timeline of the Forum’s work. The Commission repeatedly denied us access to the documents discussed by the Forum, or provided us only with heavily redacted documents.

After months of obstruction by the European Commission, we made a complaint to the European Ombudsman for being wrongly denied full access to two documents. As a result, in April 2016 the Ombudsman launched an investigation into the European Commission’s failure to disclose information about the EU Internet Forum.

The Commission argued that it had to refuse access to these documents to protect public security and its decision-making process. In response to the Ombudsman’s investigation, the Commission explained that if details of the initiative became public, it “would allow them [terrorist groups] to circumvent counter-terrorism measures”. This statement clearly confirms that the Commission discussed potential restrictions on the freedom of communication with industry representatives.

The saga continued: we responded in a letter to the Ombudsman, arguing that, as the initiative has an impact on the fundamental rights of citizens, the Commission is obliged to at least make the underlying legal basis or reasoning public.


Chapter three: The poisonous fruit

While no civil society organisations were invited to attend the discussions on terrorism, EDRi was invited to take part in some of the initial discussions on online hate speech. However, we were systematically excluded from the negotiations that led to the voluntary Code of Conduct for IT companies.

With the publication of this official document in May 2016, the Commission handed the leading role in countering hate speech over to private companies, with the power to arbitrarily implement their terms of service. This will lead, almost automatically, to the deletion of legal content as a result of this voluntary and unaccountable take-down mechanism, and indeed already has.

This agreement between only a handful of companies and the European Commission creates serious risks for freedom of expression and is likely in breach of the EU Charter of Fundamental Rights, under which restrictions on rights should be provided for by law, and “genuinely achieve objectives of general interest” (which cannot be assessed due to lack of oversight and meaningful reporting obligations).

EDRi took the stand that countering hate speech online requires open and transparent discussions to ensure compliance with human rights obligations. For this reason, we decided not to take part in future discussions of the Forum, confirming that we do not have confidence in the ill-considered Code of Conduct.

Epilogue

There is, in fact, no epilogue yet; the complaint case is not yet closed, and the future of the EU Internet Forum is still uncertain. The last meeting took place on 6 December 2016, again behind closed doors. Nevertheless, rights defenders came out of the battle victorious: we managed to shed light on the activities of the Forum with the help of the AskTheEU.org portal, and thereby to enable a public debate. The next initiative in this framework, an “Internet Forum Plus” project on cross-border access to electronic evidence, has been diligent, has carefully developed a problem definition, and has been responsive and transparent so far.

The article was originally published at http://blog.asktheeu.org/2017/03/the-tale-of-edris-fight-for-transparency-in-the-eu-internet-forum/.


EU Internet Forum against terrorist content and hate speech online: Document pool
https://edri.org/eu-internet-forum-document-pool/

EU Commission under investigation for EU Internet Forum documents
https://edri.org/commission-under-investigation-eu-internet-forum/

Commission responds to Ombudsman investigation on EU Internet Forum
https://edri.org/commission-responds-ombudsman-investigation-eu-internet-forum/

Code of Conduct on Countering Illegal Hate Speech Online
http://ec.europa.eu/justice/fundamental-rights/files/hate_speech_code_of_conduct_en.pdf

(Contribution by Zarja Protner, EDRi intern)

08 Mar 2017

Is Telefónica offering real transparency and control?

By EDRi

Our data is extremely precious to technology companies. Internet and telecommunications services host and process huge amounts of their clients’ personal data, based on often vague and confusing terms of service. Clients are rarely properly informed about what their data is being used for.

On 27 February, at the Mobile World Congress (MWC), the Spanish telecommunications service provider Telefónica presented its project AURA, which it hopes to use to grab its share of data. AURA is an app that gives Telefónica’s clients “the possibility of managing their relationship with the company based on cognitive intelligence”. It processes the personal data of the company’s clients and creates profiles of them. According to Telefónica, its clients will be able to access their data through the AURA app to check it and to decide whether they want to give permission to share it with other internet giants.

The fact that clients can access and consult their data is positive. However, according to Spanish data protection law, we should also be able to demand that such data is not processed at all without our consent.

As a telecommunications service provider, Telefónica collects data on its clients’ bills, messages and calls, payments, and so on. It also has access to data from the masts to which clients’ devices connect when they use their mobiles (thereby producing location data), to which web pages and services they visit and for how long, to how many and which devices are connected to their router, and in some cases also to which TV channels they watch and which series and films they prefer. By processing the collected data with the artificial intelligence it has developed in collaboration with Microsoft, Telefónica can build profiles of its clients. By combining and analysing this data, it can create completely new data and draw new conclusions about its clients’ potential and probable behaviour.
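As a purely illustrative sketch of how separately collected records can be combined into new, inferred data, consider the following example. The categories, thresholds and data are invented; Telefónica has not published how AURA’s profiling actually works.

```python
# Hypothetical sketch: deriving new "profile" data from records a telecoms
# operator already holds (browsing and location). Invented data and rules.

from collections import Counter

browsing_log = [  # (domain, minutes per day), invented
    ("sports-news.example", 40),
    ("travel-booking.example", 25),
    ("video-streaming.example", 120),
]
cell_masts_seen = ["mast_home", "mast_home", "mast_airport", "mast_home"]  # location data

def build_profile(browsing, masts):
    """Combine raw records into inferred traits that exist in no single record."""
    minutes = Counter()
    for domain, m in browsing:
        minutes[domain] += m
    return {
        "likely_interests": [d for d, _ in minutes.most_common(2)],
        "heavy_video_user": minutes["video-streaming.example"] > 60,
        # A single airport mast sighting plus a travel site becomes a travel inference:
        "probably_travelling_soon": "mast_airport" in masts
                                    and any("travel" in d for d, _ in browsing),
    }

print(build_profile(browsing_log, cell_masts_seen))
```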

Telefónica claims that its clients have the power to choose whether or not to share their data with third parties. It remains to be seen how it will ask for this consent: in a clear and transparent manner, or by pushing clients aggressively to accept the terms under which the data is shared, in exchange for attractive features or services. There is also a huge difference between sharing and accessing the raw data, on the one hand, and having access to the outputs of the analysis of that data, on the other.

Telefónica naturally tries to highlight the benefits of AURA. However, much will depend on the real choices offered to individuals, on the transparency provided to them, and on the control they can actually exercise.


Aura: Telefónica processes and trades your data while proclaiming itself the “warden” of your personal information
https://xnet-x.net/aura-telefonica-procesa-negocia-tus-datos/

Telefónica presents AURA, a pioneering way in the industry to interact with customers based on cognitive intelligence
https://www.telefonica.com/es/web/press-office/-/telefonica-presents-aura-a-pioneering-way-in-the-industry-to-interact-with-customers-based-on-cognitive-intelligence

08 Mar 2017

Denmark: Our data retention law is illegal, but we keep it for now

By Guest author

On 2 March 2017, the Danish Minister of Justice appeared before the Legal Affairs Committee of the Danish Parliament to answer questions about the implications of the Tele2 data retention ruling (joined cases C-203/15 and C-698/15) from the Court of Justice of the European Union (CJEU).

In his statement to the committee, the Minister started by noting that the Danish government is still analysing the consequences of the judgment, but that two conclusions are already clear. First, EU law precludes a general and undifferentiated data retention scheme covering all subscribers. Second, EU law does not preclude a targeted data retention scheme for the purpose of fighting serious crime. The Minister of Justice then noted that the Danish data retention law covers all subscribers, like the data retention laws in the other Member States that currently have data retention. The unavoidable implication is that the current Danish data retention law does not comply with EU law, which the Minister of Justice admitted before the committee.


It is definitely noteworthy that this conclusion comes from the Danish Ministry of Justice after two months of undoubtedly very intensive internal analysis of the Tele2 judgment and, presumably, consultations with other Member States. In June 2014, there was also a meeting in the Legal Affairs Committee of the Danish Parliament, two months after the CJEU had ruled on 8 April 2014 that the Data Retention Directive (2006/24/EC) was invalid. At that meeting, however, the Minister of Justice was able to get away with presenting a legal analysis based on a very narrow interpretation of the 2014 CJEU judgment, which allowed him to conclude that there was no reason to assume that the Danish data retention law was in conflict with the Charter of Fundamental Rights. At the committee meeting on 2 March 2017, no such doubt about the interpretation of the new Tele2 judgment was possible: blanket data retention is illegal in the European Union.

In this situation, a country committed to the rule of law would take immediate steps to repeal the illegal legislation. In Denmark, this can be done very easily, since the Danish data retention law authorises the Minister of Justice to lay down the specific data retention requirements in an administrative order. A simple executive decision by the Minister of Justice, repealing the illegal data retention administrative order (”logningsbekendtgørelsen”), would suffice to uphold the rule of law in Denmark.

However, this will not happen in the immediate future. Despite having twice failed to convince the Court of Justice of the EU with this argument, the Minister of Justice still argues that data retention is simply too valuable for the Danish police. Therefore, the current blanket data retention will simply continue without any change until new rules for targeted data retention have been fully implemented. The Minister of Justice claims that the European Commission has not demanded that the Danish government repeal the current (illegal) data retention rules.

The projected timeline for the future process is somewhat unclear, although the next parliamentary year was mentioned tentatively at the meeting. Currently, the Danish government is consulting the other Member States and the European Commission on the interpretation of the Tele2 ruling, and in particular on how targeted data retention should be defined. Another requirement for the Minister of Justice is that the targeted data retention scheme be technically feasible for the telecommunications operators, and consultations with the telecommunications industry on this are ongoing. When a technically feasible targeted data retention plan is available, the Minister of Justice will present a legislative proposal to the Danish Parliament, which will eventually lead to the illegal data retention scheme being replaced with a new, hopefully legal, one.

The Minister of Justice made it clear that the future targeted data retention rules will even include the extensions for internet traffic that were planned under blanket data retention until just before the Tele2 judgment, possibly including internet connection records (introduced in the United Kingdom by the Investigatory Powers Act). Retention of internet connection records was a massive failure when it was used in Denmark between 2007 and 2014 (under the old name “session logging”). Just before the Tele2 judgment in December 2016, the working plan of the Ministry of Justice was to re-introduce internet connection records for subscribers with Carrier Grade Network Address Translation (CG-NAT) connections.
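To give a sense of what internet connection records involve under CG-NAT, here is a schematic, hypothetical example of the kind of record an operator would have to retain. Because many subscribers share a single public IP address, attributing a connection to a subscriber requires logging every NAT port mapping together with the destination and timestamps, which is what makes the data volume so large. Field names and values are illustrative only.

```python
# Hypothetical sketch of CG-NAT connection records ("session logging").
# Many subscribers share one public IP, so answering "who was behind this
# connection?" requires retaining a record per connection: the NAT mapping
# plus the destination and timestamps.

from dataclasses import dataclass

@dataclass
class ConnectionRecord:
    subscriber_id: str   # internal customer reference
    private_ip: str      # address on the operator's internal network
    public_ip: str       # shared CG-NAT address seen by the outside world
    public_port: int     # the port that distinguishes subscribers sharing the IP
    dest_ip: str         # server contacted
    dest_port: int
    start: str           # timestamps (ISO 8601 strings, for simplicity)
    end: str

records = [
    ConnectionRecord("cust-001", "100.64.1.10", "203.0.113.7", 40001,
                     "198.51.100.5", 443, "2017-03-01T08:00:00", "2017-03-01T08:05:00"),
    ConnectionRecord("cust-002", "100.64.2.20", "203.0.113.7", 40002,
                     "198.51.100.5", 443, "2017-03-01T08:00:00", "2017-03-01T08:01:00"),
]

def who_contacted(records, public_ip, public_port, timestamp):
    """Identifying a subscriber from a destination sighting needs the full log."""
    for r in records:
        if (r.public_ip == public_ip and r.public_port == public_port
                and r.start <= timestamp <= r.end):
            return r.subscriber_id
    return None

print(who_contacted(records, "203.0.113.7", 40002, "2017-03-01T08:00:30"))  # cust-002
```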

At the committee meeting on 2 March 2017, the Minister of Justice described the future process towards targeted data retention as an “adjustment of the current data retention rules”, and he emphasised the importance of ensuring that the police and intelligence services would continue to have the necessary tools to protect the population, as had been the case for the past ten years with data retention. Here, the Minister of Justice is clearly confusing the use of telecommunications metadata in police investigations with mandatory data retention: the Danish police have systematically used available telecommunications metadata in investigations for the past 20 years, whereas mandatory data retention only took effect on 15 September 2007.

While the Minister of Justice repeatedly referred to the ongoing EU consultations and to the fact that the European Commission is currently preparing guidelines for targeted data retention, there was also some informal discussion of the issue among the Members of Parliament (MPs) who participated in the committee meeting. MPs in favour of data retention were clear about their intentions: data retention should allow the police to look into the past for suspects who were unknown at the time of the crime. This sounds very much like blanket data retention, and the word “targeted” is only used because the CJEU has made it very clear that EU law only allows targeted data retention. There seems to be little doubt that the Danish government, backed by a clear majority in Parliament, will push the scope of targeted data retention, once this concept has been defined, to the very limit of what EU law allows in the future revision of the Danish data retention rules.


EDRi: Denmark: Data retention is here to stay despite the CJEU ruling (04.06.2014)
https://edri.org/denmark-data-retention-stay-despite-cjeu-ruling/

Webcast of meeting in the Legal Affairs Committee of the Danish parliament (in Danish, 02.03.2017)
http://mobiltv.ft.dk/video/20161/reu/td.1380023

EDRi: Danish government postpones plans to re-introduce session logging (23.03.2016)
https://edri.org/danish-government-postpones-plans-to-re-introduce-session-logging/

Minister of Justice continues illegal surveillance, Information (in Danish, 03.03.2017)
https://www.information.dk/indland/2017/03/pape-viderefoerer-ulovlig-overvaagning

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

08 Mar 2017

German intelligence agency violates freedom of the press

By Guest author

EDRi observer Reporters Without Borders Germany is appalled by the apparently targeted surveillance of foreign journalists by the Bundesnachrichtendienst (BND), Germany’s foreign intelligence agency. As reported by the Spiegel, the BND spied on at least 50 telephone numbers, fax numbers and email addresses belonging to journalists or newsrooms around the world in the years following 1999.


“We have long feared that the BND has monitored journalists as part of its massive filtering of communications data. The targeted surveillance revealed by the Spiegel investigation is a massive violation of the freedom of the press,” said Christian Mihr, executive director of Reporters Without Borders Germany. Press freedom “is not a right granted by the graciousness of the German government, it is an inviolable human right that also applies to foreign journalists.”

According to documents seen by Spiegel, among the targets were the British BBC in Afghanistan and London, the New York Times in Afghanistan, as well as mobile and satellite telephones of the news agency Reuters in Afghanistan, Pakistan and Nigeria.

In October 2016 the German Parliament (Bundestag) passed the new law governing the BND. Exemptions protecting journalists, such as those in paragraph 3 of Germany’s so-called G10 law – a law specifying the restrictions that can be placed on the constitutional right to the confidentiality of email and telecommunications – are completely absent from the law.

The BND law allows the German foreign intelligence agency to carry out mass surveillance and to monitor Europeans, with certain restrictions, and citizens of third countries whenever this can ensure the “capacity for action” of Germany or bring “new findings of significance to foreign and security policy”. Foreign journalists can thus quickly be targeted by the German foreign intelligence service – especially when they exchange information about politically sensitive issues. The law allows, for example, the BND to place the New York Times under surveillance if the newspaper receives confidential information that the German authorities regard as sensitive. This means that the new BND law legalises what the foreign intelligence agency previously did illegally, that is, spying on foreign journalists, as revealed by the Spiegel. “The reform of the BND bill was already a clear breach of the constitution. It does not alter the current practice of monitoring journalists,” said Christian Mihr.

Together with other journalist associations and under the leadership of the Society for Civil Rights, Reporters Without Borders is preparing a constitutional challenge to the new BND law.


BND violates freedom of the press
https://www.reporter-ohne-grenzen.de/presse/pressemitteilungen/meldung/bnd-ueberwachung-ist-verstoss-gegen-pressefreiheit/

Documents Indicate Germany Spied on Foreign Journalists
http://www.spiegel.de/international/germany/german-intelligence-spied-on-foreign-journalists-for-years-a-1136188.html

(Contribution by EDRi observer Reporters Without Borders Germany)

08 Mar 2017

People with disabilities do not want just any Accessibility Act

By Guest author

On 6 March 2017, the European Disability Forum (EDF), the umbrella organisation representing persons with disabilities at EU level, joined forces with its members and other civil society organisations to protest in front of the European Parliament in Brussels. Over a hundred people gathered together calling for a meaningful European Accessibility Act (EAA), to guarantee better accessibility to services, including online services, for everyone.

The organisation of this demonstration, as well as the publication of an open letter to Members of the European Parliament (MEPs), was triggered by the initial draft report of the European Parliament Committee on the Internal Market and Consumer Protection (IMCO). The draft did not meet the expectations of the 80 million persons with disabilities in Europe who still face barriers in their everyday lives. Afterwards, the discussion inside the IMCO Committee was more positive on the issues raised by the disability activists than had previously been the case.


Accessibility is a human right enshrined in the United Nations Convention on the Rights of Persons with Disabilities, which has been ratified by the EU and 27 of its Member States. Accessibility is also a precondition for enjoying other fundamental rights, participating in all aspects of our societies, and living independently. In practice, however, many daily errands, such as withdrawing money from a cash machine, using a computer and accessing online services, calling a friend, or watching TV, remain impossible for many people.

In 2015, the European Commission presented the long-awaited Accessibility Act, the proposal for a Directive harmonising accessibility requirements for a set of mainstream products and services within the EU internal market. The Act has a very strong information and communications technology (ICT) component, including computers and operating systems, smartphones, TVs, e-books, e-commerce, self-service terminals and ATMs, e-banking services, telephony and audiovisual services.

The IMCO Committee draft report suggested deleting some of the accessibility requirements, especially for ICT products and services, and replacing them with more general functional performance criteria, which do not recognise the specific characteristics of the different products and services. Some products and services, such as those relating to web accessibility, can share some requirements. However, certain products and services need more specific requirements. For example, the requirements for TVs should include the possibility of enabling audio description, to allow visually impaired people to access the content. In contrast, the requirements for telephony services should ensure real-time text, to give hearing-impaired people the possibility to use the services.

Being too vague about the accessibility requirements could lead to situations in which improvements are required, but are not sufficient to fulfil users’ needs. For example, enabling a “better use of limited vision” could mean simply enlarging the font size by one point, which is not necessarily sufficient. Companies should be allowed to come up with innovative solutions, but in order to serve its purpose, legislation should be comprehensive, clear, and detailed.

To make sure that persons with disabilities are not left behind in the battle for our digital rights, equal access to online contents and services for everyone should be one of the very first priorities for the EU.


Activists demonstrate for a strong law on accessibility in Europe
http://www.edf-feph.org/newsroom/news/activists-demonstrate-strong-law-accessibility-europe

Open letter to Members of the European Parliament concerning the European Accessibility Act
http://www.edf-feph.org/newsroom/news/open-letter-members-european-parliament-concerning-european-accessibility-act

United Nations Convention on the Rights of Persons with Disabilities
http://www.ohchr.org/EN/HRBodies/CRPD/Pages/ConventionRightsPersonsWithDisabilities.aspx#9

European Accessibility Act
http://ec.europa.eu/social/main.jsp?catId=1202

US Access Board: Information and Communication Technology (ICT) Final Standards and Guidelines
https://www.access-board.gov/guidelines-and-standards/communications-and-it/about-the-ict-refresh

(Contribution by Alejandro Moledo, European Disability Forum)

08 Mar 2017

Audiovisual Media Services Directive – is it good enough to be a law?

By Joe McNamee

The worst examples of bigotry, ignorance, and hatred have become more visible in our public discourse in recent months and years. All reasonable people are appalled at wilful ignorance and almost visceral hate. We need to take the necessary steps to fight ignorance and hatred.

But we need to do so in a way that is effective and in a way that is not counter-productive. Exacerbating the problem by blunt censorship that makes martyrs of bigots will help nobody.

We also need to defend our society. We need to defend our democracy. We need to defend the rule of law. In order to achieve any public policy goal, we need to know what it is we are trying to achieve and if the means we are using are appropriate. We need to protect the core pillars of our society.

And we need to remember that all regions of the world decided to put free speech into the Universal Declaration of Human Rights in 1948. And Europe chose to anchor free speech in the European Convention on Human Rights in 1950. Crucially, the EU reaffirmed these rights in the Charter of Fundamental Rights in 2009. Now, there are two crucial questions to be answered.

Does the Audiovisual Media Services Directive (AVMSD) approach respect the Charter of Fundamental Rights and the European Convention on Human Rights?

We need to look at the substance and the law. We need to recognise that when we say we support free speech, we do not mean that we only support uncontroversial speech. Free speech also protects speech that is, in the words of the United Nations (UN) Human Rights Committee, “deeply offensive”; in the words of the European Court of Human Rights, it is applicable not only to “information” or “ideas” that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the State or any sector of the population.

When there are commercial interests at stake in industry codes of conduct, on what basis can we believe – on what basis could we even hope – that such information or ideas are safe? What’s the basis to believe that terms of service are better suited to judge speech than democratically drafted laws?

In the absence of meaningful reporting obligations, and bearing in mind the commercial and political pressures on companies, on what basis do we think that, in the words of Article 52 of the Charter, the restrictions being proposed are necessary, when we don’t know and can’t know how much legal content will be deleted under the proposed private enforcement of a non-harmonised hate speech law by private companies – and we know that legal content is being affected? On what basis do we think that the restrictions will genuinely achieve objectives of general interest when, for example, the Commission’s code of conduct has absolutely no mechanism for establishing whether the measures are proportionate or, indeed, might be counterproductive? This is what the law needs to uphold in practice.

Are the proposed rules in the AVMSD good enough to respect the “prescribed by law” obligations of the European Court of Human Rights (ECHR) or “provided for by law” of the Charter?

Firstly, the notions of “self-” and “co-regulation” are not clear. Most of the Directive is about traditional broadcasters, where “self-regulation” means broadcasters regulating themselves. However, “self-regulation” in the online world generally involves internet companies regulating their users. This is a fundamentally different thing and creates a first layer of ambiguity.

Underlying the measures on “hate speech” we have the Framework Decision, whose definition is already very broad and not in line with international standards. This is made more unpredictable by very varied transpositions in the Member States, and it is further undermined by a failure to enforce in some Member States. This adds a second layer of ambiguity.

The Commission’s AVMSD proposal builds on this weak background. It mixes potentially harmful material with material that would be illegal under the Framework Decision.

It creates confusion by saying, in Recital 8, that the definitions of “incitement to hatred and violence” should only be aligned “to the appropriate extent” with the law. So the transposing law restricting hate speech in the AVMS context would not necessarily have to be in line with the existing law on hate speech.

Recital 30 and Article 28a.5 say that, with respect to these restrictions on content, Member States shall not be precluded from imposing stricter measures with respect to “illegal content”… which implies that legal content is being restricted. This adds a third layer of ambiguity.

So, we have a Racism and Xenophobia Framework Decision, which is patchily transposed and patchily implemented. We have a draft AVMS Directive which requires action to be taken against material that incites hatred or violence, where the definition of such content should be aligned, where appropriate, to the law.

And we have private companies enforcing the law based on so-called “self-regulation” or “co-regulation”, while Member States can implement stricter rules than those applied to… content that is supposed to be illegal? Illegal according to whom? To the Member States that are unable to agree on what illegal means? To big companies whose terms of service do not necessarily follow the law? Do we want companies to decide what should be restricted? When did we decide to abandon the Charter of Fundamental Rights and hand over the regulation of incitement to hatred to private companies, driven by commercial interests?

Separately, we have to consider basic principles of equality before the law. Do we expect big US companies, say Facebook, Google or Twitter, to treat all users equally? Are they under an obligation to do so? The business priorities of private companies simply do not allow for treating a celebrity or a politician in the same way as other citizens.

Nine months have passed since the adoption of the EU Code of Conduct against hate speech. We have not seen any other company join this code, and we do not see meaningful results from it. Is this the model we want to replicate in law?

In summary, the EU appears not to have the political will to adopt laws based on global human rights standards. The stop-gap solution is to adopt measures in law and codes of conduct which fail to respect basic rules on the quality of law. The danger of arbitrary enforcement is real. The danger of counterproductive effects is real.

The speech was delivered by Joe McNamee at an expert discussion on the proposal for an updated Audiovisual Media Services (AVMS) directive, organised by Hanse-Office, the joint representation of the Free and Hanseatic City of Hamburg and the State of Schleswig-Holstein to the EU, on 2 March 2017.

06 Mar 2017

Rights groups demand action on export controls

By EDRi

Nine civil society organisations, including EDRi and several EDRi members, have signed a letter to the participants of the Wassenaar Arrangement, a multilateral export control regime with 41 participating states. We joined Privacy International’s efforts in expressing concerns that “elements of the current control list of technologies and proposed new additions will have adverse effects on human rights and security”.

In the letter, we acknowledge the need for periodic, open and transparent revisions of export controls in view of technological changes and developments. Tracking and controlling exports is crucial for accountability and for minimising the threats posed by uncontrolled trade in advanced surveillance capabilities used for security, law enforcement, and espionage. However, disproportionate and burdensome controls on tools that enhance privacy and security are a threat to global stability, security, and the protection of human rights. Therefore, we encourage the participating states to strike the right balance between these legitimate goals and to define the technologies of concern narrowly.

The signatories of the letter support restrictions on the proliferation of surveillance technologies. At the same time, it is important that bona fide cybersecurity and security research is not undermined. In this regard, the current language on controls on intrusion software and encryption, as well as the proposed inclusion of forensic tools, needs careful consideration. We urge the participants of the Wassenaar Arrangement to remove encryption technology from the control list. If adequate language cannot be drafted to capture all of these considerations, this is likely to mean that intrusion software and forensic tools would have to be excluded from the control list as well.

We urge participant states to address our concerns prior to the final agreement and the plenary session taking place in December 2017.

Civil society organisations’ letter to the participants of the Wassenaar Arrangement (06.03.2017)
https://edri.org/files/exportcontrols/letter_wassenaar_controllist_20170223.pdf

Rights organisations urge export control body to change control list (06.03.2017)
https://medium.com/@privacyint/rights-organisations-urge-export-control-body-to-change-control-list-997c209c6aa4#.j0a90vqco

06 Mar 2017

Are net neutrality and privacy Europe’s brilliant way of trumping destructionism?

By Joe McNamee

For the online economy to work, trust and competition are needed. Trust to drive take-up of services and competition to drive down prices and drive up innovation.

Privacy

The 2016 Eurobarometer survey found that nearly 60% of individuals in the EU had avoided certain websites for privacy reasons, while 82% were in favour of restrictions on cookies. This shows how important clear privacy rules are for individuals and for trust in the online economy. The European Union has addressed this problem head-on, by proposing and adopting the General Data Protection Regulation (GDPR) and, more recently, by proposing the e-Privacy Regulation.

Clear rules, with effective enforcement, generate trust and provide a harmonised market for companies serving individuals in Europe.

The US national telecoms regulator, the Federal Communications Commission (FCC), also saw the danger posed by the “wild west” of personal data exploitation online. The danger was illustrated when the National Telecommunications and Information Administration carried out a survey in 2016. This study found that, in the previous 12 months, 19% of internet-using households had suffered an online security breach, while 45% had refrained from an online activity due to privacy and security fears. Faced with this compelling evidence of the damage caused by a lack of trust and security, the FCC acted in October 2016: it passed ground-breaking privacy rules (by three votes to two) to protect broadband users and improve trust. However, it was not possible to enshrine the rules in law, meaning that they remain contingent on the whims of the Commissioners. The appointment of a new FCC Chairman by the incoming president makes it almost certain that US citizens – and the US online economy – will be robbed of this essential protection… unless they use European services, of course.

Far from GDPR and e-Privacy being European protectionism, the US laissez-faire approach appears to be self-inflicted US destructionism.

Net neutrality

In 2013, the EU was faced with increasing evidence of internet access companies seeking to undermine innovation and competition online. It was faced with calls to legislate to protect discriminatory “specialised services”, which would have allowed big online companies to buy “fast lane” access to the customer base of big telecoms operators. Not only did the European Union not give in to this huge lobbying effort, it legislated in favour of rules that will prevent big telecoms operators from becoming gatekeepers that stop the full internet from being accessible to their customers. It legislated for openness and innovation with a binding EU-wide regulation.
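As a purely illustrative sketch of what a “fast lane” means in practice, consider a scheduler that always serves traffic from paying partners first. The names and the scheduling rule are invented; real traffic management is more sophisticated, but the discriminatory principle is the same: whoever does not pay, waits.

```python
# Hypothetical sketch of "fast lane" prioritisation: an access provider
# serving packets from paying services before everyone else's.
# Invented names; real traffic shaping is more complex, but the effect is similar.

from collections import deque

FAST_LANE_PARTNERS = {"bigvideo.example"}  # services that paid the operator

fast_queue, normal_queue = deque(), deque()

def enqueue(packet):
    """Sort packets into the paid queue or the ordinary queue."""
    (fast_queue if packet["service"] in FAST_LANE_PARTNERS else normal_queue).append(packet)

def dequeue():
    """Paid traffic always goes first; a startup's packets wait whenever a partner is sending."""
    if fast_queue:
        return fast_queue.popleft()
    return normal_queue.popleft() if normal_queue else None

for p in [{"service": "startup-video.example", "id": 1},
          {"service": "bigvideo.example", "id": 2},
          {"service": "startup-video.example", "id": 3}]:
    enqueue(p)

order = []
while (pkt := dequeue()) is not None:
    order.append((pkt["service"], pkt["id"]))
print(order)  # the paying partner's packet jumps the queue
```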

The Federal Communications Commission saw the same danger as the European Union. However, it was not possible to enshrine net neutrality in law. All the FCC could do was to adapt its own implementation of its own rules and powers to defend the online environment from big telecoms operators, in a market that was already less competitive than the one in Europe. As a result, those rules are contingent on the whims of the Commissioners. The appointment of a new FCC Chairman by the incoming president makes it almost certain that US citizens and online businesses will be robbed of this essential protection.

Europe has legislated for open, innovative, better value online services. If the US abandons net neutrality and privacy, it will be opting for self-inflicted destructionism.

Only the EU could have adopted positive, exemplary legislation on this scale to protect individuals and businesses. And it did.
