Privacy

The right to privacy is crucial to our personal security, to free speech and to democratic participation. It is a fundamental right in the primary law of the European Union and is recognised in numerous international legal instruments. Digital technologies have generated a new environment of potential benefits and threats to this fundamental right. As a result, defending our right to privacy is at the centre of EDRi's priorities.

29 Jan 2018

EDRi-gram – 15 years of digital rights news (and counting)

By EDRi

15 years ago today, on 29 January 2003, we published our very first EDRi-gram. To celebrate the occasion, we are looking back at the articles in that first newsletter.

If you are feeling nostalgic, you can read the original EDRi-gram Number 1 here:
http://history.edri.org/edrigram/number1

A lot has changed; a lot has stayed the same.


Copyright Directive

Implementing the European Copyright Directive
(Click the link to read the original article)

In 2003, we had just escaped one of the biggest threats to the internet in Europe, the so-called “web caching ban”. Copyright fundamentalists tried to ban the incidental copies made by networks, unless they were separately authorised.

In 2018, we are facing one of the biggest threats to the internet in Europe. Copyright fundamentalists are trying to force everything uploaded to the internet to be subject to prior authorisation and/or upload filtering by internet hosting services.


Data retention

Rally Members European Parliament against data retention
(Click the link to read the original article)

In 2003, we were at the start of a long campaign by certain EU Member States to impose mandatory data retention, using the proposed ePrivacy Directive as a tool to achieve this goal.

In 2018, and despite two European Court rulings rejecting mandatory data retention, we are faced with a campaign from certain EU Member States to impose mandatory data retention, using the proposed ePrivacy Regulation as a tool to achieve this goal.


Software patents

New patent law on software threatens innovation
(Click the link to read the original article)

In 2003, European activists were faced with a massive, lobby-driven, well-financed attempt to impose software patents in Europe. The proposal was ultimately rejected, in one of the most unlikely "David and Goliath" successes of European activists.


Entitlement cards

Update: United Kingdom
(Click the link to read the original article)

In 2003, the UK government was trying to impose national ID cards through the back door via a national public service “entitlement” card.

In 2018, the Irish government is trying to impose national ID cards via a (“mandatory but not obligatory”) national public service entitlement card.


German censorship

Action against governmental censorship in Germany
(Click the link to read the original article)

In 2003, the German authorities were pushing censorship through the demonstrably ineffective use of blocking by internet access providers.

In 2018, the German authorities are pushing censorship through the coercion of internet services to delete content more quickly.


Recommended reading

“The Human Rights Network in Moscow has just released a very useful online report about online privacy in Russia. According to the introduction fundamental human rights and freedoms – freedom of speech, freedom of information, privacy – are apparently unprotected on the Net. While Russian Internet is growing these rights and freedoms suffer from frequent and widespread invasion.”

In 2003, our recommended reading was a study about online restrictions in Russia:
https://web.archive.org/web/20030506121238/http://www.hro.org:80/docs/reps/privacy/2002/eng/index.htm

In 2018, the story continues:
https://www.hrw.org/news/2017/08/01/russia-new-legislation-attacks-internet-anonymity


Oh no! Did you miss the 363 previous editions of the EDRi-gram? No worries, you can read all of them here and here.

And it’s of course never too late to subscribe to our newsletter!

10 Jan 2018

Proposal to revoke data retention filed with the Czech Court

By Iuridicum Remedium

On 20 December 2017, EDRi member Iuridicum Remedium (IuRe) filed a request with the Constitutional Court of the Czech Republic to revoke the Czech data retention related legislation.

The filing of the request was achieved in close cooperation with the Czech Pirate Party, whose 22 deputies were elected to the Chamber of Deputies of the Czech Parliament for the first time in October 2017. Apart from the Czech Pirate Party, the proposal also won the support of Members of Parliament from five other parties represented in the Chamber of Deputies. Altogether, 58 signatures were gathered.

The proposal was also made possible by funding from the Digital Rights Fund. It builds on a similar, successful proposal filed by IuRe with the Constitutional Court of the Czech Republic in 2011. In 2012, a new data retention system was adopted to implement the EU Data Retention Directive that was in force at the time. The recent proposal aims at revoking this new law.

The proposal challenges, in particular, the Electronic Communication Act, the Police Act and the Criminal Procedure Act, as well as the implementing legislation which defines the range of data to be kept. Currently, traffic and location data on electronic communications are stored for six months. Apart from the police and other law enforcement bodies, intelligence agencies, as well as the Czech National Bank, may use the data. According to the Czech Telecommunication Office, for example, mobile phone data were requested in over 470 000 cases in 2016 alone.

The complaint to the court considers the principle of general and indiscriminate data collection a fundamental problem. It relies on two key decisions of the Court of Justice of the European Union (CJEU) – in the Digital Rights Ireland and Tele2/Watson cases – in both of which this measure was rejected. The proposal also explains that Czech and German statistical data demonstrate that the absence of data retention affected neither the level of criminality nor the number of criminal cases solved. The proposal further suggests revoking selected sections of the Police Act that allow data to be requested without court permission, as well as selected parts of the Code of Criminal Procedure, which do not sufficiently limit data requests to serious crimes.

Based on IuRe's experience from 2011, the decision of the Constitutional Court of the Czech Republic can be expected in approximately one year's time.

IuRe and Pirate party send complaint on general surveillance of citizens to the Constitutional Court (only in Czech, 20. 12. 2017)
http://www.iure.org/15/pirati-iure-podali-navrh-na-zruseni-plosneho-sledovani-obcanu-ustavnimu-soudu-cr

Czech Republic: Data retention – almost back in business (01.08.2012)
https://edri.org/edrigramnumber10-15czech-republic-new-data-retention-law/

Czech Constitutional Court rejects data retention legislation (06.04.2011)
https://edri.org/edrigramnumber9-7czech-data-retention-decision/

Czech Parliament – close in implementing data retention directive (04.06.2008)
https://edri.org/edrigramnumber6-11czech-data-retention/

European fund for digital rights launched (08.02.2017)
https://edri.org/european-fund-for-digital-rights-launched/

(Contribution by Jan Vobořil, EDRi member Iuridicum Remedium, Czech Republic)

29 Nov 2017

EU Member States plan to ignore EU Court data retention rulings

By IT-Pol

Documents made publicly available through EDRi member Statewatch reveal that EU Member States are exploring all possible options to keep, and in fact expand, their current data retention regimes. The general plan is based on a new concept of "restricted data retention", which is really blanket data retention with a new name, along with amendments to the draft e-Privacy Regulation to facilitate blanket data retention. Member States are considering whether these new elements should be introduced through an EU instrument or through national law in each Member State.

On 15 September 2017, the EU Counter-Terrorism Coordinator (EU CTC) submitted a new data retention proposal to Member States. The proposal was discussed at a meeting of the Working Party on Information Exchange and Data Protection (DAPIX) Friends of the Presidency (FoP) on 18 September 2017. A partial report of the discussions at the DAPIX FoP meeting can be found in Council document 13845/17.

The judgement of 21 December 2016 by the Court of Justice of the European Union (CJEU) in the Tele2 case (joined cases C-203/15 and C-698/15) concerned the national data retention laws that are still in place after the annulment of the Data Retention Directive in 2014. The EU CTC notes that data retention cannot be "general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communication" since this would violate paragraph 134 of the Tele2 judgement. In the Tele2 judgement (paragraphs 108-111), the CJEU outlines a targeted data retention regime which does not include every subscriber.

The EU CTC, considering input received from Member States, makes it clear that he is not at all interested in targeted data retention. Instead, the EU CTC proposes the concept of "restricted data retention" on the basis that it is necessary to fight terrorism and serious crime, including cyber attacks. This measure has to be limited to what is strictly necessary and be based on objective evidence. However, according to the EU CTC, the measure can cover the entire population, even though this is quite obviously blanket data retention.

The justification for this is claimed to be paragraph 106 of Tele2, which states that data retention must be restricted to (i) particular time periods and/or geographical areas and/or a group of persons likely to be involved, in one way or another, in a serious crime, or (ii) persons who could, for other reasons, contribute, through their data being retained, to fighting crime. In essence, the EU CTC argues that the entire population, perhaps with an opt-out for persons bound by a legal obligation of professional secrecy (such as lawyers, journalists and doctors), could fall under the second category, "persons who could, for other reasons, contribute, through their data being retained, to fighting crime".

While deliberately covering the entire population, the EU CTC emphasises that other aspects of the data retention measure must be limited to what is absolutely necessary. What this means is not clear from the proposal, but it could include some differentiation with respect to categories of data and service providers. Minor operators, such as WiFi access points at pizza restaurants, could be excluded since that data "may potentially not be indispensable for retention", as the EU CTC carefully notes. As far as the purpose limitation is concerned, there is nothing novel about the reinvention of restricted data retention. The annulled Data Retention Directive also limited data retention to the purpose of the investigation, detection and prosecution of serious crime.

The critical aspect of restricted data retention is obviously that the entire population is covered. The EU CTC argues that this can meet the necessity test. However, the CJEU has ruled twice that a data retention measure which covers all subscribers exceeds the limits of what is strictly necessary. Referring to the entire population as "persons who could, for other reasons, contribute, through their data being retained, to fighting crime" clearly fails to satisfy the requirement of objective criteria that establish a connection between the personal data to be retained and the objective pursued. The CJEU has referred to this principle several times, most recently in paragraph 191 of opinion 1/15 on the EU-Canada PNR agreement. Moreover, paragraph 110 of the Tele2 judgment specifically says that "conditions must be shown to be such as actually to circumscribe, in practice, the extent of that measure and, thus, the public affected."

The DAPIX FoP meeting report mentions that, while the CJEU rules out general data retention, it "does not solely permits" (sic) targeted data retention (which appears to mean that data retention that is not forbidden by the ruling may be permitted). Therefore, there are other legally possible regimes for non-general data retention. This is undoubtedly true, but largely irrelevant. Since the proposed unrestricted yet "restricted" data retention covers the entire population, it cannot possibly be classified as non-general data retention. The DAPIX FoP report refers to the proposed concept as "restricted data retention and targeted access", but the Tele2 judgment makes it very clear that safeguards and limitations at the access stage are not sufficient and cannot justify blanket (general) data retention.

The proposal from the EU CTC contains some general comments about the data categories (communication services) to be retained. It is claimed that approaches in some Member States show that a number of data categories are indeed not necessary (and, by implication, illegal).

The new focus on cyber attacks, where data retention is claimed to be key for attribution and investigation, could easily lead to more retention of internet traffic data, perhaps even internet connection records as in the UK Investigatory Powers Act (information about every internet packet, including all destination IP addresses). Moreover, Europol has recently complained about the unavailability of data from internet service providers that use Carrier Grade network address translation (CG-NAT), since a large number of subscribers may share the same IP address. Data retention requirements to address the technical limitations caused by CG-NAT would, in most cases, substantially increase the amount of data collected. The DAPIX FoP report describes a matrix with categories of data to be retained, for example content data, traffic data, location data, and subscriber data. Except for content data (where generalised data retention would, incidentally, not respect the essence of the fundamental rights), this is simply the list of data categories in the annulled Data Retention Directive and the current data retention laws in Member States. In summary, the proposal of the EU CTC could easily lead to more data being retained per subscriber, despite the claim that a "peeling off" approach is taken to limit the data categories.
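To see why CG-NAT pushes retention volumes up, consider how subscriber attribution works when many customers share one public IP address: the provider can only answer the question "who was behind this IP address at that moment?" if it has also retained the translated source port and precise timestamps for every session. The following sketch is purely illustrative and rests on our own assumptions (the record fields and function names are not taken from any Council document or Europol specification):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class NatSessionRecord:
    """One CG-NAT translation: which subscriber used which public IP/port, and when."""
    subscriber_id: str
    public_ip: str
    public_port: int
    start: datetime
    end: datetime

def attribute(records, public_ip, public_port, timestamp):
    """Find the subscriber behind a public IP address at a given moment.

    Without the source port and timestamp the lookup is ambiguous, because
    hundreds of subscribers may share the same public IP address at the same
    time. This is why attribution behind CG-NAT requires retaining per-session
    records rather than a single IP-to-subscriber mapping, which greatly
    increases the amount of data collected.
    """
    for r in records:
        if (r.public_ip == public_ip and r.public_port == public_port
                and r.start <= timestamp <= r.end):
            return r.subscriber_id
    return None

# Two subscribers sharing the same public IP address at overlapping times.
records = [
    NatSessionRecord("subscriber-A", "203.0.113.7", 40001,
                     datetime(2017, 1, 16, 10, 0), datetime(2017, 1, 16, 10, 5)),
    NatSessionRecord("subscriber-B", "203.0.113.7", 40002,
                     datetime(2017, 1, 16, 10, 1), datetime(2017, 1, 16, 10, 6)),
]
print(attribute(records, "203.0.113.7", 40002, datetime(2017, 1, 16, 10, 3)))  # -> subscriber-B
```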

Data retained for business purposes, such as billing data, will be complementary to the data covered by the mandatory data retention regime. The EU CTC foresees that the new mandatory data retention regime will also cover over-the-top (OTT) service providers like Google and Facebook, and it is noted in the proposal that OTT operators collect much more data for business purposes than traditional telecommunications operators. In this connection, the EU CTC fails to mention (or, possibly, understand) that the proposed e-Privacy Regulation seeks to create a level playing field by subjecting all electronic communications service providers, whether OTT or telecommunications providers, to the same privacy rules.

The proposal from the EU CTC respects the strict access conditions set out in the second part of the Tele2 ruling. Access to retained data must be solely for the purpose of fighting terrorism and serious crime and must be subject to a prior court review. With the exception of terrorism cases, access can only be granted to data of individuals suspected of involvement in serious crime (Tele2 paragraph 119). The EU CTC also mentions pseudonymisation and encryption, suggesting that this could facilitate searches of the retained encrypted data, with decryption only on the basis of a warrant. The purpose of this is not entirely clear, since the retained data, as a general rule, can only be accessed following a prior court review for a specific person. It could perhaps mean that searches of encrypted or pseudonymised data are not intended to count as access to the retained data, and that such searches can be used to find persons of interest who can then, under certain substantive conditions, be depseudonymised subject to a court review. If data on specific persons could only be accessed after a prior court review, there would not really be a need for encrypted searches. Encryption is, of course, a useful security measure for the stored data, but that is an entirely different issue.
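One possible reading of the pseudonymisation idea (a speculative sketch only; the proposal does not spell out a mechanism) is keyed pseudonymisation: identifiers in the retained data are replaced by keyed hashes, analysts can match and correlate pseudonyms without seeing identities, and reversing a pseudonym requires access to the key, which could be gated behind a court review. The names and flow below are our own illustration, not anything described in the Council documents:

```python
import hashlib
import hmac

SECRET_KEY = b"held-by-a-separate-key-authority"  # assumption: key kept out of the analysts' reach

def pseudonymise(identifier: str) -> str:
    """Replace a phone number or IP address with a stable keyed pseudonym (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# The retained dataset stores only pseudonyms next to the traffic metadata.
retained = [
    {"caller": pseudonymise("+4520123456"),
     "callee": pseudonymise("+4520654321"),
     "timestamp": "2017-09-15T12:00:00Z"},
]

def search(pseudonym: str):
    """Analysts can search and correlate pseudonyms without learning identities."""
    return [row for row in retained if pseudonym in (row["caller"], row["callee"])]

def depseudonymise(candidates, pseudonym, warrant_granted: bool):
    """Reversing a pseudonym is the controlled step: it requires the key and a court review.

    With an HMAC there is nothing to decrypt; reversal works by re-computing the
    pseudonym for candidate identifiers, so access to the key is the control point.
    """
    if not warrant_granted:
        raise PermissionError("prior court review required")
    return [c for c in candidates if pseudonymise(c) == pseudonym]
```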

In the final part of the proposal, the EU CTC considers the role of the draft e-Privacy Regulation in relation to restricted data retention. The EU CTC notes that the Tele2 judgment is stricter than the annulment of the Data Retention Directive, since Article 15(1) of the e-Privacy Directive makes data retention an exception to the main rule of erasure once the communication is completed. The EU CTC hypothesises that the draft e-Privacy Regulation could be amended to make blanket data retention easier. According to the EU CTC, consideration should be given to allowing storage of communications data under Article 7 of the draft e-Privacy Regulation where this is legally required in order to assist governments in fighting serious crime and terrorism. However, a provision of this type would still be a restriction on the fundamental rights to privacy and data protection of subscribers, and the restriction would have to satisfy the conditions of Article 52(1) of the Charter of Fundamental Rights. This would not necessarily be different from the current situation with Article 15(1) of the e-Privacy Directive or Article 11 of the draft e-Privacy Regulation.

Working document on contributions to the discussion on data retention, EU Counter-Terrorism Coordinator, WK 9699/2017 INIT, LIMITE (15.09.2017)
http://www.statewatch.org/news/2017/nov/eu-council-ctc-working-paper-data-retention-possibilities-wk-9699-17.pdf

Retention of communication data for the purpose of prevention and prosecution of crime, Council document 13845/17, LIMITE (30.10.2017)
http://www.statewatch.org/news/2017/nov/eu-council-data-retention-legal-aspects-13845-17.pdf

Carrier-Grade Network Address Translation (CGN) and the Going Dark Problem, Council document 5127/17, LIMITE (16.01.2017)
http://www.statewatch.org/news/2017/jan/eu-europol-cgn-tech-going-dark-data-retention-note-5127-17.pdf

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

29 Nov 2017

Eurojust: No progress on complying with CJEU data retention judgements

By IT-Pol

A recently published Eurojust report on data retention in Europe confirms that EU Member States have failed to make meaningful progress towards complying with fundamental rights standards, as clarified by the two Court of Justice of the European Union (CJEU) rulings banning blanket data retention.

The CJEU has delivered two rulings on the mandatory retention of traffic and location data (metadata) for electronic communications services. In the Digital Rights Ireland judgement of 8 April 2014 (joined cases C-293/12 and C-594/12), the Data Retention Directive 2006/24/EC was declared invalid. This was followed by the Tele2 judgement of 21 December 2016 (joined cases C-203/15 and C-698/15), where the CJEU ruled that Article 15(1) of the e-Privacy Directive, read in the light of the Charter of Fundamental Rights of the European Union, precludes national laws which require general and indiscriminate retention of metadata (blanket data retention). Only targeted data retention is allowed under EU law.

A month after the Tele2 ruling, the Council Legal Service sent an analysis of the judgement to Member States, where it concluded that "a general and indiscriminate retention obligation for crime prevention and other security reasons would no more be possible at national level than it is at EU level, since it would violate just as much the fundamental requirements as demonstrated by the Court's insistence in two judgements delivered in Grand Chamber". This was a clear message to Member States who had hitherto claimed that the annulment of the Data Retention Directive in April 2014 did not affect their national data retention laws. When the analysis of the Legal Service was released to the public on 27 March 2017 (Council document 5884/17), the paragraph containing this critical sentence was redacted.

Despite the clear judgement in the Tele2 case, blanket data retention laws are still in place in most Member States. EDRi member Privacy International surveyed 21 national data retention laws and examined their compliance with fundamental rights standards. None of the 21 laws are currently in compliance with these standards, as interpreted by the CJEU judgements in Digital Rights Ireland and Tele2.

This conclusion is confirmed by a recent Eurojust report, "Data retention regimes in Europe in light of the CJEU ruling of 21 December 2016 in Joined Cases C-203/15 and C-698/15" (Council document 10098/17, LIMITE), which was made publicly available by EDRi member Statewatch on 20 November 2017. The Eurojust report covers 25 EU Member States (as well as Norway and Switzerland), and is based on a detailed questionnaire sent to members of the European Judicial Cybercrime Network (EJCN) in March 2017.

According to the survey, five Member States (Austria, the Netherlands, Romania, Slovenia and Slovakia) do not currently have mandatory data retention, as their previous laws were invalidated by constitutional or high courts in accordance with the CJEU judgement on the Data Retention Directive. For the remaining Member States that responded to the survey, the Eurojust report concludes that “none of the countries have national legislation that obliges the targeted retention of data linked to specific persons or geographical locations”. In other words, their national data retention laws cover all subscribers, which is illegal under EU law.

Some respondents indicated that “they considered that their data retention regime is targeted by virtue of the limitations set with regard to retention periods and/or reason for the data retention”. However, this notion of “targeted” is rejected by the Eurojust report, as it is clearly not in line with the standards of the Tele2 judgement.

For access to the retained data, the majority of respondents state that a judicial review is required before access is granted. The replies also state that access is granted depending on the seriousness of the crime being investigated. The Eurojust questionnaire does not ask the respondents whether access to the retained data, as a general rule, can only be granted to “data of individuals suspected of planning, committing or having committed a serious crime or of being implicated in one way or another in such a crime”. This is a requirement in the Tele2 judgement (paragraph 119), except in terrorism cases.

Respondents of the Eurojust survey were also asked about the impact of the CJEU judgement in relation to the admissibility of evidence in court. Five countries reported on court rulings where the admissibility of evidence from data retention was evaluated by the court. So far the evidence has been deemed admissible by courts, although one of the five cases (in Ireland) is still pending on appeal. This part of the Eurojust report shows a clear concern that evidence obtained from illegal data retention could one day be ruled inadmissible by courts.

The legal uncertainty regarding the admissibility of evidence obtained from data retention is by no means surprising. Unless Member States quickly amend their data retention laws to bring them into compliance with the CJEU standards, it is reasonable to expect that there will be more challenges to the admissibility of the evidence. Even if national courts generally allow illegally obtained evidence in specific cases, the courts may eventually rule differently when prosecutors consistently submit evidence that is only available because of illegal data retention laws. The fundamental right to a fair trial may certainly be questioned if the state systematically relies on evidence that is obtained in violation of established human rights standards.

Finally, the Eurojust survey asks about initiatives at the national level to change data retention legislation. In ten Member States, a review or assessment of the legislation is ongoing, and three Member States are in the process of drafting amendments. The Eurojust report also outlines the substantive legal changes being planned or considered by Member States. Most of these seem to concern access to the retained data, such as limiting access to serious crime only. This would address a narrow reading of the 2014 Digital Rights Ireland ruling, whereby blanket data retention may be understood as theoretically possible if sufficient safeguards for access are put in place. With the 2016 Tele2 ruling, that interpretation was clearly rejected by the CJEU. Only one Member State (Austria) specifically mentions the introduction of targeted data retention and quick freeze.

Informal remarks of the respondents show a clear preference for blanket data retention with arguments that it is impossible to determine in advance the individuals who will commit crimes and thus the data that needs to be retained. There are also claims that storing data indiscriminately for all citizens is more acceptable since the alternative, targeting specific persons or particular geographical locations, could result in criminal investigations that are considered discriminatory. Some respondents also indicated that the necessary balance is already guaranteed by the limitations placed on access to the retained data.

The last argument is particularly odd since the CJEU has clearly ruled in Tele2 that restrictions on access to the retained data are not sufficient. The retention of data must also meet objective criteria that establish a connection between the data to be retained and the objective pursued. In particular, such conditions must ensure that data is not retained on everyone (Tele2 paragraph 110). However, this does not mean that “the individuals who will commit crimes must be determined in advance”. The CJEU rulings in Digital Rights Ireland and Tele2 only require objective evidence to identify a public whose data is likely to reveal a link, at least an indirect one, with serious criminal offences (Tele2 paragraph 111).

There is also the possibility of retaining data on specific persons or a group of persons at an early stage of an investigation, based on evidence or intelligence which does not yet meet the substantive requirements for access to metadata. If the police gather further evidence to substantiate the suspicion against the person of interest and can make a reasoned request for access to data, retained metadata from that person's past will become available to the police. However, it will not be possible to "look into the past" of every citizen, since this would require retention of data on everyone. The CJEU has ruled twice that this practice of mass surveillance is illegal.

Eurojust Report: Data retention regimes in Europe in light of the CJEU ruling of 21 December 2016 in Joined Cases C-203/15 and C-698/15
http://statewatch.org/news/2017/nov/eu-eurojust-data-retention-MS-report-10098-17.pdf

Information note from the Council Legal Service on the judgement of the Court in joined cases C-203/15 and C-698/15, Council document 5884/17, unredacted version (01.02.2017)
https://netzpolitik.org/wp-upload/2017/05/rat_eu_legal_service_vds_20170201.pdf

National Data Retention Laws since the CJEU’s Tele-2/Watson judgement, Privacy International (06.09.2017)
https://privacyinternational.org/node/1511

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

29 Nov 2017

Italy extends data retention to six years

By Hermes Center

On 8 November 2017, the Italian Parliament approved a Regulation on data retention that requires telecommunications operators to retain telephone and internet data for up to six years.

The Italian Coalition for Civil Liberties and Rights (CILD) and EDRi observer member Hermes Center for Transparency and Digital Human Rights published a statement criticising the lack of scrutiny and meaningful debate about the Regulation prior to its approval. They also stated that the measure is to the detriment of citizens' privacy and could have extremely serious consequences for all of us. The two organisations have been voicing concerns since July 2017, when the provision was inserted into a law transposing Directive 2014/33/EU on the "safety of lifts".

In particular, the Regulation is in unequivocal breach of the case law of the Court of Justice of the European Union and in clear conflict with current Italian privacy regulations, as pointed out by the President of the Italian Data Protection Authority, Antonello Soro, in October 2017.

Also, on 13 November, the European Data Protection Supervisor Giovanni Buttarelli commented that the newly approved Italian Regulation definitively fails to respect the European approach to data retention.

It seems inevitable that the law will be challenged in court.

Our phone and web data will be stored for 6 years: what about our rights? (12.11.2017)
https://cild.eu/en/2017/11/12/phone-web-data-will-stored-6-years-rights/

Court of Justice of the European Union: The Members States may not impose a general obligation to retain data on providers of electronic communications services (21.12.2016)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2016-12/cp160145en.pdf

European Data Protection Supervisor: “EU is the leader in data protection” (only in Italian, 13.11.2017)
http://www.lastampa.it/2017/11/13/esteri/garante-privacy-ue-sulla-protezione-dei-dati-leuropa-leader-edTINi7G4UzW0KvDtM6emL/pagina.html

‘6 years data retention, Court of Justice of EU may cancel it’. Interview with Prof. Filippo Benelli on likely CJEU action (only in Italian, 09.11.2017)
https://www.key4biz.it/data-retention-6-anni-corte-giustizia-ue-annullarla-intervista-filippo-benelli-universita-macerata/204732/

Metadata of phone and internet traffic: must be stored for 6 years (only in Italian, 08.11.2017)
http://www.repubblica.it/tecnologia/sicurezza/2017/11/08/news/dati_traffico_telefonico_e_telematico_dovranno_essere_conservati_per_6_anni-180604974/

(Contribution by Antonella Napolitano, the Italian Coalition for Civil Liberties and Rights CILD, and Fabio Pietrosanti, Hermes Center for Transparency and Digital Human Rights, Italy)

20 Sep 2017

Should video-sharing platforms be part of the AVMSD?

By Maryant Fernández Pérez

The Audiovisual Media Services Directive (AVMSD) is currently being reformed. After going through several legislative stages, the AVMSD is now being negotiated in trilogues, that is, informal, secret negotiations between the European Parliament (representing citizens) and the Council (representing EU Member States), facilitated by the European Commission (representing EU interests). As part of the negotiations, a key question will have to be addressed: should some or all video-sharing platforms be covered by the AVMSD and, if so, how?

On the one hand, there are demands for holding video-sharing platforms like YouTube responsible for content (including legal content) that is published on their sites or apps, because of the impact online content has on public debate and our democracies. On the other hand, these platforms are not producing or publishing content, but only hosting it. Extending the AVMSD to platforms that are so radically different from those the Directive was originally created to regulate – cross-border satellite TV services – would not make sense, as EDRi's position paper, published on 14 September 2017, argues.

Video-sharing platforms, and social media generally, are not traditional media. While their activities influence (and even manipulate) the population, regulating video-sharing platforms as traditional media is not the solution to undesired impacts on our societies. When two services – linear broadcasting of editorially controlled content and non-linear hosting of content produced by others – are significantly different, achieving a level playing field through a "one-size-fits-all" approach is not always possible. The consequences of getting it wrong can have a damaging effect on freedom of expression, competition, the fight against illegal material online and the protection of children in the online environment. At the Council meeting, seven Member States made unusually impassioned pleas to reject the proposed approach, mainly on grounds of freedom of expression. For these reasons, the deletion of the provisions that extend the scope of the AVMSD would be the most rational option, as EDRi's position paper suggests.

Failing deletion, EDRi recommends clarifying the definitions of what constitutes "video-sharing platforms" and "user-generated content". In addition, EDRi's position paper asks for more predictability when companies are asked to take action, in order to avoid abuses and defend freedom of expression. For instance, some proposals on the table in the trilogue negotiations ask video-sharing platforms to restrict incitement to hatred based on political opinions or "any other opinions". Asking platforms to delete hate speech based on "any other opinions" is likely to lead to arbitrary restrictions and affect how we express ourselves online. Another reason to be cautious is that certain provisions would ask these companies to have a "self-regulatory" role in the "moral" development of children. Do we really want companies to decide what is good for the "moral" development of our kids?

Fighting against illegal hate speech, terrorism and child abuse is very important. However, asking companies to decide what should or should not be acceptable in our society is worrisome. Numerous examples demonstrate that content is being restricted on video-sharing and social media platforms without accountability or real redress. Creating a situation where video-sharing platforms are forced to regulate more of our communications and give themselves more leeway to decide what content we can or cannot access, regardless of what the law deems to be illegal, will not be beneficial for the EU.

EDRi position on AVMSD trilogue negotiations (14.09.2017)
https://edri.org/files/AVMSD/edriposition_trilogues_20170914.pdf

ENDitorial: AVMSD – the “legislation without friends” Directive? (14.06.2017)
https://edri.org/avmsd-the-legislation-without-friends-directive/

Audiovisual Media Services Directive reform: Document pool
https://edri.org/avmsd-reform-document-pool/

(Contribution by Maryant Fernández Pérez, EDRi)

06 Sep 2017

Denmark: Targeted ANPR data retention turned into mass surveillance

By IT-Pol

Since mid-2016, Denmark has had a nationwide automatic number plate recognition (ANPR) system with stationary cameras at 24 locations and mobile cameras mounted on 48 police cars. The ANPR system is currently being integrated with POL-INTEL, the new Danish system for intelligence-led policing (predictive policing), which is supplied by Palantir Technologies. An expansion of the ANPR system with more cameras can be expected in the coming years.

Preparations for the ANPR system started in 2014. Besides the public tender and subsequent deployment of the ANPR equipment, a legal framework for using ANPR was also put in place. The Ministry of Justice decided in 2015 that it was sufficient to lay down rules for processing ANPR information in an administrative order. This meant that surveillance with ANPR was introduced in Denmark without ever being debated in the Parliament.

The legal framework for ANPR makes a distinction between hits and no-hits when a vehicle's number plate is scanned by the ANPR equipment. Hits are number plates on the police hotlist – that is, vehicles wanted by the police for reasons ranging from unpaid insurance and skipped mandatory inspections to theft reports and suspected involvement in criminal activities. Vehicles registered in the Schengen Information System (under Council Decision 2007/533/JHA) by other EU Member States for discreet checks (Article 36) or sought for the purposes of seizure (Article 38) can also be put on the hotlist. No-hits are number plates with no match on the hotlist.

The ANPR system is designed to serve a dual purpose. If a police car with mobile ANPR equipment encounters a vehicle on the hotlist, the police officers get a signal from the ANPR device, so that they can decide whether or not to pursue the vehicle. This part of the ANPR system is actively promoted by the Minister of Justice and the Danish National Police as a huge help for police officers on the road. The second purpose of the ANPR system, which is rarely mentioned in public by the same authorities, is the passive retention of number plates encountered by either the mobile ANPR equipment in police cars or the stationary ANPR cameras. The location, the timestamp and a picture of the vehicle, which may include the driver and passengers, are also stored in the central ANPR database.

Retention periods for ANPR hits range from three months to two years, depending on the reason for being on the hotlist. If a vehicle is on the hotlist because of unpaid insurance or skipped mandatory inspections, the mobile ANPR equipment can be used to stop the vehicle and confiscate the number plates. Retention of location information in cases like this is neither necessary nor proportionate since any further processing of the ANPR data will be totally unrelated to the reasons for putting the vehicle on the hotlist.

However, the main controversy has been around the retention of no-hits, that is vehicles that are not even wanted for minor offences such as driving without insurance. The original plan of the Danish National Police was to retain all no-hits for 30 days and use this information for backward-looking investigations, such as using data mining (profiling) to determine persons of interest based on their proximity to the time and place where a crime was committed. The Danish Data Protection Agency (DPA) objected to the proposal to retain all ANPR no-hits. In an Opinion of 17 March 2015, the DPA concluded that blanket retention of all no-hits was not legal, and that retention of no-hits could only be done under certain conditions, for example in connection with targeted surveillance at the border.

Due to the opinion of the Danish DPA, the ANPR administrative order of December 2015 provides that no-hits can be retained for up to 30 days only if the no-hit is registered in connection with a targeted police operation, which must be limited in time and geographic area. These conditions bear some resemblance to paragraph 59 of the judgment on the Data Retention Directive (joined cases C-293/12 and C-594/12) by the Court of Justice of the European Union (CJEU) in April 2014. Accordingly, only targeted data retention, and not blanket data retention, is allowed for the Danish ANPR system. Unfortunately, the administrative order does not give any guidance as to how a limited time period and a limited geographic area should be interpreted, except that this will be specified in internal guidelines by the Danish National Police.
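As a purely illustrative sketch of how the hit/no-hit classification and the retention rules described above fit together (the field names, hotlist entries and simplified retention periods are our own assumptions drawn from this article, not from the actual police system):

```python
from datetime import datetime, timedelta

# Hotlist: number plate -> reason. Retention periods for hits range from three
# months to two years depending on the reason (simplified to two buckets here).
HOTLIST = {"AB12345": "stolen", "CD67890": "unpaid_insurance"}
HIT_RETENTION = {"stolen": timedelta(days=730), "unpaid_insurance": timedelta(days=90)}
NO_HIT_RETENTION = timedelta(days=30)  # only if scanned as part of a "targeted" operation

def process_scan(plate, location, timestamp, targeted_operation):
    """Classify a scanned plate and decide whether, and for how long, to retain the record."""
    record = {"plate": plate, "location": location, "timestamp": timestamp}
    reason = HOTLIST.get(plate)
    if reason is not None:
        record.update(status="hit", reason=reason,
                      delete_after=timestamp + HIT_RETENTION[reason])
    elif targeted_operation:
        # The controversy: by declaring every stationary camera part of a
        # "targeted" operation, nearly all no-hits end up being retained anyway.
        record.update(status="no-hit", delete_after=timestamp + NO_HIT_RETENTION)
    else:
        return None  # no-hit outside a targeted operation: not retained
    return record

# A no-hit scanned by a stationary camera declared part of a targeted operation.
print(process_scan("EF11111", "stationary camera, major intersection",
                   datetime(2017, 7, 1, 8, 0), targeted_operation=True))
```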

During the summer of 2017, it was revealed through freedom of information (FOI) requests that most no-hits were actually retained in the ANPR system. Specifically, the Danish National Police decided in November 2016 that all 24 locations with stationary ANPR cameras are part of targeted police operations running until the end of 2017. This decision paved the way for retaining all no-hits from the stationary ANPR cameras for 30 days. No-hits from the mobile ANPR equipment are not covered by this decision, and hence not necessarily retained on a general basis for 30 days, but the mobile cameras account for less than 10% of the scanned number plates.

The FOI request further revealed that 830 000 no-hits are retained every day, and that the ratio between retained no-hits and hits is 90:1. The Danish National Police has repeatedly denied FOI requests for documents showing the location of the stationary ANPR cameras, but since the cameras are very visible in the landscape, their location has been mapped by activists. The unofficial map at the website www.anpg.dk shows that roughly half of the ANPR cameras are placed at border crossings (all intra-Schengen borders), whereas the other half covers major traffic intersections. The map indicates a strategic positioning of the stationary ANPR cameras in areas where lots of vehicles are encountered every day.

In essence, the ANPR system has become a tool for mass surveillance, since 99% of the number plates whose location is stored in the central database are of no interest to the police. The justification for storing no-hits is that the data may later prove useful to the police for as yet unspecified purposes. Moreover, both the opinion of the Danish DPA (that no-hits can only be processed in the ANPR system under certain conditions, rather than generally as the police initially wanted) and the targeted data retention regime prescribed by the ANPR administrative order have been completely subverted by the decision of the Danish National Police to include all stationary ANPR cameras, all the time, in "targeted" police operations where no-hits can be retained for 30 days.

After the story was reported in Danish news media, the police confirmed that all no-hits from the stationary ANPR cameras are retained. In a later interview with Dagbladet Information, the Danish National Police called the criticism misguided. The retention of no-hits is geographically limited to the locations where the police has decided to put up stationary ANPR cameras. Even though there are cameras throughout Denmark, as seen on the unofficial map, not every road in Denmark is covered by ANPR, and in that sense, only a limited geographic area is subject to surveillance. According to the police, the requirement of “a limited time period” is satisfied by putting an end date on the targeted police operation allowing no-hits to be retained. This end date can, however, be extended with a later decision by the police.

On 13 August 2017, EDRi member IT-Pol Denmark and Bitbureauet filed a complaint with the Danish DPA about the retention practices for ANPR no-hits. The complaint is currently being investigated by the DPA.

EDRi: New legal framework for predictive policing in Denmark (22.02.2017)
https://edri.org/new-legal-framework-for-predictive-policing-in-denmark/

EDRi: Denmark about to implement a nationwide ANPR system (02.07.2014)
https://edri.org/denmark-implement-nationwide-anpr-system/

Unofficial map with the location of Danish ANPR cameras
https://anpg.dk/

Danish car owners subject to extensive surveillance even though they are not suspected of anything, Dagbladet Information (only in Danish, 25.07.2017)
https://www.information.dk/indland/2017/07/danskere-bil-udsat-omfattende-overvaagning-politiet-mistaenkt

Complaint to the Danish Data Protection Agency about retention practices for ANPR no-hits (only in Danish, 13.08.2017)
https://itpol.dk/sites/itpol.dk/files/anpg-klage.pdf

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

01 Aug 2017

Italy plans to extend telecoms data retention and increase censorship powers

By Hermes Center

On 19 July 2017, the Chamber of Deputies of the Italian Parliament approved two amendments to existing laws. One of the amendments aims at extending telecommunications data retention to six years, while the other gives Agcom, the communications regulator, powers to order takedown and blocking of online content without judicial oversight.

Data retention in Italy is governed by Art. 132 of the Privacy Law – 24 months for phone communications metadata, 12 months for internet metadata, and 30 days for unanswered phone calls. The amendment extends the retention period for all of the above data categories to six years.

The Data Retention Amendment (Art. 12-Ter of DDL 4505-A) was written by Walter Verini (Democratic Party, PD) and Giuseppe Berretta (Democratic Party, PD) as an amendment to a law that regulates the safety of lifts, which led many Members of Parliament (MPs) to vote for it without even reading the amendment. Later, several MPs issued public statements of regret, admitting their mistake.

After the public criticism, one of the co-signatories of the Data Retention Amendment, Ms Mara Mucci (Mixed Group), acknowledged that she had not fully realised the sensitivity of the issue and that she is now willing to foster a wider debate with a view to changing the law in the Senate.

Antonello Soro, the President of the Italian Data Protection Authority, condemned Mr Verini's amendment, arguing that it does not clearly guarantee the principle of proportionality as defined by the EU regulatory framework and the rulings of the Court of Justice of the European Union (CJEU) – there have been two separate CJEU rulings against indiscriminate telecommunications data retention.

The Italian Association of Internet Providers (AIIP) also criticised the Data Retention Amendment as too broad and in contradiction with EU case law. It also criticised the fact that the amendment was introduced without consulting the key stakeholders impacted by the bill.

With regard to the "Takedown Power" amendment: currently, in Italy, only a court order can mandate Internet Service Providers (ISPs) to take down a website or restrict access to specific content by IP address or domain name. Under the new law, the administrative authority Agcom will be given the power to do so without any kind of judicial oversight.

The Takedown Power Amendment (Amendment n. 1022, published in Annex A of the parliamentary sitting of 19/07/2017) sparked an immediate reaction, again from AIIP, which sharply criticised it for the excessive powers it gives to Agcom.

The amendment further gives Agcom the mandate to issue a technical regulation defining the requirements for permanent blocking infrastructure to be implemented by ISPs, de facto requiring the deployment of Deep Packet Inspection systems.

These amendments have been approved by the Chamber of Deputies of the Italian Parliament in the context of the process to implement European Union legislation, “Disposizioni per l’adempimento degli obblighi derivanti dall’appartenenza dell’Italia all’Unione europea — Legge europea 2017 (DDL 4505-A)”.

The two amendments received less than two minutes of parliamentary debate in total (video).

The law, including the two amendments, still has to be approved by the Senate, which is likely to happen in early September.

Indiscriminate Retention of data for 6 years – keeping updated track on all resources on the topic (only in Italian 23.07.2017)
https://www.hermescenter.org/conservazione-indiscriminata-dei-dati-per-6-anni/

Data retention: President Soro, 6 years data retention term for telephony metadata are too much (only in Italian, 26.07.2017)
http://www.gpdp.it/web/guest/home/docweb/-/docweb-display/docweb/6651715

From elevators to Massive Surveillance (only in Italian, 22.07.2017)
https://popinga.it/dalla-sicurezza-degli-ascensori-alla-sorveglianza-di-massa-4eac2144c6d7

6 years as terms of data retention on all phone and internet data just approved in a directive on elevator’s safety (only in Italian, 21.07.2017)
http://fulviosarzana.nova100.ilsole24ore.com/2017/07/21/6-anni-e-il-termine-di-conservazione-dei-dati-telefonici-e-telematici-di-tutti-i-cittadini-appena-approvato-alla-camera-in-una-direttiva-sugli-ascensori/

Italian ISPs say new copyright amendment infringes human rights (28.07.2017)
https://torrentfreak.com/italian-isps-say-new-copyright-amendment-infringes-human-rights-170728/

The full text sent to the Senate
http://www.senato.it/service/PDF/PDFServer/BGT/01036793.pdf

(Contribution by Fabio Pietrosanti, EDRi observer Hermes Center, Italy)

26 Jul 2017

Oversight Board report: Illegal surveillance of Danish citizens

By IT-Pol

The annual report from the Danish Intelligence Oversight Board (TET) was published on 7 July 2017. Under Danish law, TET is tasked with overseeing the data collection and data processing practices of the Danish Security and Intelligence Service (PET) and the Danish Defence and Intelligence Service (DDIS). Both intelligence services operate mostly outside European Union (EU) law because of the national security exemption in the EU Treaties.

The previous annual reports for activities in 2014 and 2015 contained substantial criticism, especially of PET. In a large number of cases, PET retained personal data which was no longer necessary, and in the opinion of TET, further processing of that data was therefore unlawful. PET disagreed with this interpretation of the PET law, and the matter was referred to the Minister of Justice in May 2016. His solution was to propose an amendment of the PET law which essentially removed the requirement to erase personal data that was no longer necessary, and this amendment was swiftly adopted by the Danish Parliament in December 2016.

This year, the most interesting revelations are in the report covering the activities of DDIS, which is the foreign intelligence service. Under Danish law, DDIS can collect any information for essentially any foreign intelligence purpose, as long as the operations are abroad. DDIS can also process information about Danish citizens and foreign residents in Denmark (collectively referred to as “Danish persons”), if this occurs as incidental collection in connection with an operation that is directed against developments abroad. The only real legal restriction for DDIS is that targeted collection against Danish persons is not allowed.

An amendment of the DDIS law in 2015 introduced an exception to this rule: if a Danish person is believed to be travelling abroad and is suspected of involvement in terrorist activities against Denmark or Danish interests (which includes Danish allies), DDIS can obtain a court order for targeted collection against that person. The required level of suspicion is lower than in regular criminal investigations of terrorist cases by the Danish police. Association with “radicalised individuals” is mentioned in the comments of the law as sufficient grounds for DDIS to obtain a court order for targeted collection of intelligence information. This information can be shared with the Danish police and used as evidence in a criminal prosecution.

In summary, the DDIS law represents an extensive data collection regime with very few restrictions, and the restrictions that do exist pertain only to Danish persons. Nonetheless, TET found several cases of data protection violations by DDIS during its oversight activities in 2016.

First, TET criticised the fact that some mass collection activities contained a disproportionately large fraction of Danish persons. Mass collection, called "raw data", is allowed under the DDIS law as long as the mass surveillance is directed against developments abroad, and as long as DDIS does not actively search for ("target") Danish persons in the collected raw data. However, there is an upper limit on the allowed fraction of Danish persons in the collected raw data, presumably for compliance with the "directed against developments abroad" requirement. The TET report does not say anything about the type of collection, except that it is signals intelligence (SIGINT), which generally means electronic communications. A plausible example would be international telephone calls to or from Denmark, or internet traffic which terminates in Denmark rather than merely transiting through it.

Secondly, in a sample of searches of SIGINT raw data by DDIS analysts, TET found that 12 percent of the searches unlawfully targeted Danish persons. Specifically, in these cases, the DDIS analysts should have known beforehand that the search results would mainly contain information about Danish persons. Targeted collection against Danish persons is only allowed with a court order, which was not obtained for these searches. The total number of searches in SIGINT raw data by DDIS is not mentioned in the report, so the estimated number of Danish persons affected by these unlawful searches remains unknown.

Thirdly, TET also found irregularities in the targeted collection against Danish persons that was authorised with a court order. In 11% of the cases surveyed by TET, the targeted searches of raw data did not respect the time limitations of the court order. What this means is not entirely clear. It could simply refer to searches done before the court order was obtained or after it has expired. Alternatively, the court order for targeted collection could potentially impose time-related limits on the raw data that can be searched, for example a prohibition on searching SIGINT raw data collected before the date of the court order. In this way, the court order would only authorise future interception of the electronic communications of the target.

The unlawful searches of SIGINT raw data by DDIS highlight the massive privacy problems inherently associated with the mode of operation of defence intelligence services. Law enforcement authorities generally only intercept communications of specific persons subject to prior approval by an independent judicial authority, and the targeted interception (“collection”) is done by the electronic communications provider, typically a private company. Defence intelligence services, on the other hand, collect electronic communications of everyone on their own accord, often referred to as the “collect it all” principle. The privacy and data protection safeguards provided for by law are solely implemented as internal policy restrictions on how these massive databases of electronic communications can be searched and analysed. Independent oversight of compliance with these restrictions is difficult, at best, and the oversight relies on accurate access logging of all searches by analysts. The TET report also criticised the lack of access logging in several cases, again without providing specific details.

The public reaction in Denmark to the unlawful searches of raw data by DDIS in 2016 has been very limited so far. On the day the TET report was published, the head of DDIS gave a short interview to Danish media and explained that the unlawful searches were all done by mistake since there was no systematic pattern in the various searches. The chairwoman of TET seems to agree with this rather odd explanation, but she also told Danish media that TET would intensify the future oversight of DDIS after the discovery of the unlawful searches.

The political reaction has been even more limited than the media coverage, probably owing to the fact that most Danish politicians are on holiday in July. However, the Minister of Defence will be asked to appear before a parliamentary committee later in the year. In previous years, the reports from TET were published in May, while Parliament is still in session. It is not clear why the publication of the annual report was delayed until July in 2017. TET submitted the report to the Danish government on 16 May 2017. The government must then present the report to the intelligence committee of the Danish Parliament before the report is published. For unknown reasons, this process took almost two months in 2017, compared to two to three weeks in earlier years, pushing the publication of the TET report into July and the political holiday period.

Homepage of the Danish Intelligence Oversight Board, annual reports (only in Danish)
http://www.tet.dk/en/

EDRi: Denmark: Weakening the oversight of intelligence services (05.04.2017)
https://edri.org/denmark-weakening-the-oversight-of-intelligence-services/

EDRi: Danish anti-terror proposal expands surveillance (11.03.2015)
https://edri.org/danish-antiterror-proposal-expands-surveillance/

Spy service on illegal searches: it happened by mistake, DR Nyheder (only in Danish, 07.07.2017)
http://www.dr.dk/nyheder/indland/spiontjeneste-om-ulovlige-soegninger-der-er-tale-om-fejl

Watchdog intensifies oversight of intelligence service after repeated breaches of law, Jyllands-Posten (only in Danish, 14.07.2017)
http://jyllands-posten.dk/indland/ECE9725723/vagthund-intensiverer-kontrol-med-efterretningstjeneste-efter-gentagne-lovbrud/

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

28 Jun 2017

An end to copyright blackmail letters in Finland?

By Heini Järvinen

On 12 June 2017, the Finnish Market Court ruled in the case Copyright Management Services Ltd vs. DNA Oyj that Internet Service Providers (ISPs) are not obliged to hand over the personal data of their clients based only on a suspicion of limited use of peer-to-peer networks. Stronger proof of significant copyright infringement needs to be presented in order to obtain the data.

Law firms have been sending letters demanding payments as damages for the distribution of copyright-protected content, and threatening the people suspected of copyright infringement with legal proceedings. The ruling will put an end to this practice.

The Finnish Market Court has previously interpreted even the distribution of minor amounts of data in peer-to-peer networks as a “significant copyright infringement”. However, thanks to the case law of the Court of Justice of the European Union (CJEU), the court has now changed its interpretation. The CJEU has emphasised in its recent rulings that when evaluating the significance of the infringement, the concrete harm caused by the distribution done through a single IP address has to be taken into account.

The compensation claim brought to the court was based on approximately a thousand observations of cases in which films had been made available in the BitTorrent peer-to-peer network. The court did not consider these cases to constitute a "significant amount", because it was not possible to draw conclusions about their repetitiveness, their duration, the number of distributed works, or the concrete impact on other peer-to-peer users.

The seven judges unanimously decided not to oblige the ISP to hand over its clients' personal data. Another important aspect of the decision was that the burden of proof for a "significant copyright infringement" was placed on the plaintiff, not the defendant.

On the other hand, on 14 June 2017, the Market Court gave its decision in the case Copyright Management Services Ltd vs. Elisa Oyj, concerning another Finnish ISP. The court stated in its decision that the ISP is obliged to retain its clients' data for the purpose of releasing it later. The decision, however, emphasised that the purpose of retaining the data is not to grant the plaintiff access to it, but to avoid the loss of the data until a possible release. This requirement to store consumer data is hard to reconcile with two Court of Justice of the EU rulings prohibiting suspicionless retention of communications data (the Digital Rights Ireland case and the Tele2 ruling) and one explaining the requirement to have a specific law when imposing restrictions such as data retention (the Bonnier Audio case).

Finnish Parliament argued over the copyright initiative (21.05.2014)
https://edri.org/finnish-parliament-argued-over-the-copyright-initiative/

Finland: Common Sense in Copyright Law (24.04.2013)
https://edri.org/edrigramnumber11-8finland-copyright-blackout/

Finnish Big Brother Award goes to intrusive loyalty card programme (07.09.2017)
https://edri.org/finnish-big-brother-award-goes-intrusive-loyalty-card-programme/

Copyright letters facing headwinds – Market Court changed its line (only in Finnish, 12.06.2017)
https://www.turre.com/markkinaoikeus-muutti-linjaansa-tekijanoikeuskirjeista/

Farewell to the blackmail letters? Market Court decision makes it more difficult to claim compensation from peer to peer users (only in Finnish, 15.06.2017)
http://www.hs.fi/talous/art-2000005256360.html

Lawyers are sending blackmail letters to ask for compensation for downloading TV series and movies – “It’s useless to ask a lawyer about moral” (only in Finnish,19.01.2017)
http://www.hs.fi/talous/art-2000005052577.html

(Contribution by Heini Järvinen, EDRi)
