15 May 2017

Audiovisual Media Services Directive reform: Document pool

By Maryant Fernández Pérez

On 25 May 2016, the European Commission proposed to reform the Audiovisual Media Services Directive (the “AVMS Directive” or “AVMSD”). The current AVMS Directive (2010) is the European Union (EU) legal framework that regulates traditional TV broadcasters and on-demand services in the EU Member States. The AVMSD contains rules on audiovisual advertising, jurisdiction over providers, the promotion of European works, and providers’ obligations with regard to commercial communications, the protection of minors from potentially harmful content, and the fight against “incitement to hatred”, among other measures. The new proposal broadens the scope of the Directive to cover the regulation of video-sharing platforms and potentially even other social media companies.

European Digital Rights (EDRi)’s main concern is the lack of clarity and safeguards for respecting the rule of law and protecting fundamental rights. The AVMSD fails to clearly draw the line between different concepts and different services. A lack of clarity affects competition, freedom of expression and opinion, the fight against illegal material online and the protection of children online.

EDRi is working with the EU institutions throughout the different stages of the legislative process. In this document pool, you will find the relevant information, documents, and analyses on the AVMSD. We’ll be updating this document pool as the process advances. Last update: 23 May 2017.


Legislative documents

Find more information and documents in PRELEX (the EU Database on preparatory acts), OEIL (the European Parliament’s Legislative Observatory), Council’s document register, IPEX (the Interparliamentary Exchange Platform) and Statewatch news.


EDRi’s analyses and recommendations


EDRi’s blogposts and press releases


08 May 2017

Killing parody, killing memes, killing the internet?

By Joe McNamee

We love the internet because it creates fantastic opportunities to express ourselves and to innovate.

But do we love it enough to pass it on to future generations?

Nearly 20 years ago, politicians made decisions that gave us the internet we have today. Visionary policy-makers decided not to punish internet companies for the actions of their customers. This created legal certainty to allow companies to build their businesses, reduced incentives to censor communications and allowed the internet to flourish.

It is our duty to protect this freedom.

Crazily, a single article of the European Union’s draft Copyright Directive (PDF) threatens to destroy this positive model, both in the EU and globally.

Article 13, fewer than 250 words, is designed to provoke such legal uncertainty that internet companies will have no option other than to block, filter and monitor our communications, if they want to have any chance of staying in business. Ultimately, only the current internet giants, shedding crocodile tears at the prospect, will be able to survive. From global internet to “Googlebook”.

The 250 words and their explanatory notes contain five proposals that would destroy the fundamental building blocks of internet freedom.

1. Companies as the villains

The title of the key provision of the Copyright Directive includes the words “use of protected content by information society service providers”. In other words, it is the companies’ use of the content, not their users’. They are doing it. If any content violating copyright is found on their websites – if they do not censor enough information – they are guilty. This approach is reinforced by amendments tabled in the European Parliament that shift responsibility for the “making available” of content generated or uploaded by users onto the companies.

2. Moving the goalposts

Under EU law, companies that store files for their customers (“hosting” companies) are not responsible for illegal or unauthorised content about which they have no knowledge. The new Directive redefines this activity in such a way that virtually no hosting company would be covered by this protection.

3. Filtering and monitoring

Companies hosting “a large amount” of files would be obliged to “prevent the availability” of material that has been “identified” by rightsholders. This means monitoring EVERYTHING – video, text, images – and blocking ANYTHING that rightsholders want to block. Forget parody, forget quotation, forget memes.
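To see why such filtering is inherently blunt, here is a purely hypothetical sketch (in Python; the names and the hash-matching approach are our illustration, not any platform’s actual system) of the kind of filter the proposal would incentivise:

```python
# Hypothetical sketch of a naive upload filter. All names are illustrative.
import hashlib

# Fingerprints of works that rightsholders have "identified" for blocking.
RIGHTSHOLDER_FINGERPRINTS = {
    hashlib.sha256(b"bytes of a protected work").hexdigest(),
}

def upload_allowed(upload: bytes) -> bool:
    """Reject any upload whose fingerprint matches the rightsholders' list.

    The filter sees only bytes, never context: a quotation, a parody or a
    meme reusing the same material is blocked exactly like a pirated copy.
    """
    return hashlib.sha256(upload).hexdigest() not in RIGHTSHOLDER_FINGERPRINTS
```

Real content-recognition systems use fuzzier matching than a plain hash, but the structural flaw is the same: nothing in the code can recognise a copyright exception, so over-blocking is the default.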

4. Forgetting the people

Remember when policy-makers used to tell us our opinion mattered? Multistakeholderism, they called it. Ah, the good old days. Under the Copyright Directive, this mass filtering, monitoring and blocking of our speech, our memes and our parodies must be carried out based on cooperation between rightsholders and (whatever is left of) the internet companies – no mention of actual people.

5. Easter egg

Some “good news”, though: there is a “safeguard”. The new legislation will give you the right to complain to the companies that are following their legal obligation to delete whatever they are told to delete. Sort of like having the right to shout at the clouds to complain about the rain.

This is real. It is happening now. Internet freedom is being strangled by vested interests, incompetence, and indifference.

For people who write TL;DR more than fifteen times a week, find out more at: https://savethememe.net/en

Do you want to tell EU Parliamentarians what you think about this monstrosity? You can do so here:
https://act1.openmedia.org/savethelink


For people who like to read more:

The complete guide to the horrors of Article 13: https://edri.org/files/copyright/copyright_proposal_article13.pdf

ENDitorial: Transparency and law-making on EU copyright – mutually exclusive?
https://edri.org/transparency-and-law-making-mutually-exclusive/

Civil society urges EU institutions to stop the “censorship machine” in the copyright proposal
https://edri.org/civil-society-urges-eu-institutions-to-stop-the-censorship-machine-in-the-copyright-proposal/

Copyright Directive: Lead MEP partly deletes the “censorship machine”
https://edri.org/copyright-directive-lead-mep-partly-deletes-censorship-machine/

A positive step forward against the “censorship machine” in the Copyright Directive
https://edri.org/positive-step-forward-censorship-machine-copyright-directive/

ENDitorial: What do two copywrongs make? Definitely not a copyright
https://edri.org/enditorial-two-copywrongs-make-definitely-not-copyright/

Copyright reform: Document pool
https://edri.org/copyright-reform-document-pool/

Independent academic analysis: A Brief Exegesis of the Proposed Copyright Directive
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2875296

Read this article on our Medium:
https://medium.com/@edri/killing-parody-killing-memes-killing-the-internet-b864df222047

03 May 2017

EU data protection watchdogs support stronger ePrivacy legislation

By Guest author

On 10 January 2017, the European Commission (EC) published its long-awaited proposal for an e-Privacy Regulation (ePR) to replace the 2002 e-Privacy Directive (ePD). In April 2017, two Opinions were issued to provide comments and recommendations on how to better safeguard the right to privacy, the confidentiality of communications, and the protection of personal data in the proposed ePR: one by the Article 29 Data Protection Working Party (WP29), and another by the European Data Protection Supervisor (EDPS).


Both Opinions agree that the EC took the right decision in proposing this legislation. As mentioned by the WP29 and the EDPS, the proposal has several positive elements. However, the Council of the European Union and the European Parliament now need to focus on fixing the negative aspects that undermine the level of protection accorded by the General Data Protection Regulation (GDPR). The most sensitive issues among the improvements identified by both Opinions are:

Keep definitions in the Regulation: Both the EDPS and the WP29 share the opinion that the definitions under the ePR could become “moving targets” if they are imported from the still unfinished European Electronic Communications Code (EECC). The WP29 proposes alternatives, including additional clarifications in the ePR or a simultaneous adoption of both proposals. The EDPS asks for independent terms, as definitions created for the purposes of economic (market) regulation cannot be expected to be adequate for the protection of fundamental rights.

Privacy by default and by design are essential, not optional: The principle of “privacy by default”, as provided in the GDPR, has been replaced with “privacy by option” in the ePR. This implies that end-users would merely be given the “option” to determine through software settings whether they allow third parties to access or store information on their devices. Given the inconsistency of this provision with Article 25 of the GDPR, both authorities propose imposing an obligation on hardware and software providers to implement default settings that protect end-users’ devices against any unauthorised access to or storage of information on them. The EDPS goes a step further and argues for a provision under which users would be informed about privacy settings not only during installation or first use of the software, but also whenever they make significant changes to their devices or software.

Tearing down “tracking walls”: Tracking walls deny users access to the websites they are seeking to use because they do not consent to being tracked across other sites by large numbers of companies. Both Opinions advise against continuing to allow tracking walls, with some nuances. While the WP29 recommends a weaker solution, the EDPS asks for a complete and explicit ban on tracking walls. The EDPS argues that, according to the GDPR, giving consent has to be a genuinely free choice, and these digital walls cannot result in real consent.

Neither online nor offline tracking: The WP29 addresses the issue of offline tracking and argues that data controllers should only in a limited number of circumstances “be allowed to process the information emitted by the terminal equipment for the purposes of tracking their physical movements without consent of the individual concerned”. The WP29 Opinion also suggests that device tracking should only be permitted if the personal data collected is anonymised. Moreover, the EDPS recommends that the provisions allowing for device tracking be deleted and replaced by a simpler requirement of consent (by all end-users concerned).

Keep an eye on the restrictions: Under the current Directive and the proposed Regulation, non-targeted data retention measures are allowed. Both Opinions restate that national data retention regimes have to comply with the requirements of the European Union Charter of Fundamental Rights and with the case law of the Court of Justice of the European Union (CJEU), both of which require strict safeguards for the mass storage of data.

Give redress to both individuals and organisations: The EC’s proposal puzzlingly leaves the right to collective redress out of the ePR text. The EDPS took note of this omission and made it clear that an explicit provision for collective redress and effective remedies (or, more simply, a reference to Article 80 of the GDPR) is needed. Including such a provision is essential to ensure consistency with the GDPR, and to allow individuals to access collective redress through, for example, consumer groups.


WP29: Opinion 01/2017 on the Proposed Regulation for the ePrivacy Regulation (2002/58/EC) (04.04.2017)
http://ec.europa.eu/newsroom/document.cfm?doc_id=44103

EDPS: Opinion 6/2017 on the Proposal for a Regulation on Privacy and Electronic Communications (ePrivacy Regulation) (24.04.2017)
https://edps.europa.eu/sites/edp/files/publication/17-04-24_eprivacy_en.pdf

New e-Privacy rules need improvements to help build trust (09.03.2017)
https://edri.org/new-e-privacy-rules-need-improvements-help-build-trust/

e-Privacy Directive revision: Document pool
https://edri.org/eprivacy-directive-document-pool/

(Contribution by Romina Lupseneanu, EDRi intern)

03 May 2017

Hakuna Metadata – Let’s have some fun with Sid’s browsing history!

By Guest author

“But I am not interesting enough for someone to bother to look into my browsing history.”

The most common argument for people not to be more wary of the threats to their online privacy is that, simply, no one cares. Or at least not enough. But still, don’t we all like to delete our browsing history from time to time, at least to prevent details about some of our searches being exposed unintentionally? Maybe we would care more if we knew just how many surprising insights can be gleaned from our online activity.


We are aware that when surfing online we give out information about all of our searches and all the websites we visit. Furthermore, everything we do – clicking buttons, moving the mouse pointer, typing, scrolling up or down – can be tracked through little monsters called “cookies”. Together, this information composes your “browsing history”, which is the metadata of your browsing activity. As EDRi member SHARE Foundation showed by diving into one person’s browsing history, we can follow a person through the day and learn about his or her interests, passions and worries – almost as if seen through that person’s eyes.

This seems scary, perhaps even enough to change our online behaviour a little. However, we imagine that we have control over our browsing history through our computer. We also trust our browser not to abuse the information about our searches. But there are other interested parties. For example, your Internet Service Provider (ISP), which can access your metadata, has an almost complete view of your browsing history. Interested in what they can learn about you?

EDRi’s Ford-Mozilla Open Web Fellow Sid Rao built an open source browsing history visualisation tool, which can show you exactly what it is that you give away, and how.

So, let’s imagine Sid is connected to the internet through his Internet Service Provider (ISP) – let’s call it “Telekome”. What does Telekome know about Sid, relying on the metadata from his browsing (or simply his browsing history), without ever asking for his consent?

Like most of us, Sid is a creature of habit. That means it is quite easy to learn about his usual everyday routines from his browsing patterns. Sid uses the same laptop both for his work and personal activities, which is a common practice these days. However, the ways he uses the internet during his working and leisure hours are very different.

A simple “heatmap” of his browsing actions gives a snapshot of his lifestyle. In this heatmap, colours are assigned to his browsing history: the lightest shade to the times when he visited the largest number of unique websites, and darker shades as that number decreases. As the graph shows, his sleeping time has the darkest patches, meaning that during those hours he hasn’t been browsing much. His leisure time has light-coloured patches, showing that during those times he probably watches online videos, but does not systematically spend all his time online. Finally, the most cluttered part of the heatmap, with a lot of light-coloured patches, is his work hours, which he typically spends mainly online, visiting many websites.
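As an illustration, here is a minimal sketch of how such a heatmap could be computed, assuming the ISP’s logs are available as (timestamp, URL) pairs; Sid’s actual open source tool may work differently:

```python
# Minimal sketch: bucket browsing history into a weekday-by-hour "heatmap",
# counting unique websites per cell, as described above.
from collections import defaultdict
from datetime import datetime

def activity_heatmap(history):
    """history: iterable of (datetime, url) pairs.
    Returns {(weekday, hour): number of unique sites visited}."""
    buckets = defaultdict(set)
    for ts, url in history:
        domain = url.split("/")[2]  # crude host extraction from "scheme://host/..."
        buckets[(ts.weekday(), ts.hour)].add(domain)
    return {cell: len(domains) for cell, domains in buckets.items()}

# Two visits to the same site in the same hour count as one unique site.
sample = [
    (datetime(2017, 5, 1, 9, 5), "https://example.org/news"),
    (datetime(2017, 5, 1, 9, 40), "https://example.org/mail"),
    (datetime(2017, 5, 1, 23, 10), "https://video.example.net/clip"),
]
print(activity_heatmap(sample))  # {(0, 9): 1, (0, 23): 1}
```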

By adding other metadata, such as the suffixes of the domain names of the browsed websites, which generally correlate with a specific country, Telekome can easily learn that he travelled to a different time zone but continued working his usual hours.
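A sketch of that inference, under the same (timestamp, URL) assumption – the country mapping below is a toy subset, for illustration only:

```python
# Minimal sketch: infer likely countries from country-code top-level domains.
from collections import Counter
from urllib.parse import urlparse

CCTLD_COUNTRIES = {"fi": "Finland", "de": "Germany", "rs": "Serbia"}  # toy subset

def countries_seen(history):
    """Count visits to country-code domains in (timestamp, url) history."""
    counts = Counter()
    for _, url in history:
        suffix = urlparse(url).hostname.rsplit(".", 1)[-1]
        if suffix in CCTLD_COUNTRIES:
            counts[CCTLD_COUNTRIES[suffix]] += 1
    return counts

sample = [(None, "https://vr.fi/timetables"), (None, "https://news.example.de/")]
print(countries_seen(sample))  # Counter({'Finland': 1, 'Germany': 1})
```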

Anomalies in the pattern (in this case, strange patches of different shades of colour in irregular places) could mean different things: Has Sid’s workload increased? Is he planning a trip? Searching for a job? In this case, the browsing activity reveals the holidays Sid took. It shows that he planned his holidays by checking flights, confirming a hotel booking, and so on, and then took a break from work. He then returned home, where a sudden increase in activity is probably due to following up on work. Finally, he resumed his usual work pattern.

By now, Telekome knows about Sid’s schedule, but what about his interests? On the basis of keywords, metadata can reveal quite a lot about the people, organisations and locations that Sid is interested in. Sid is a security and privacy researcher, so vocabulary related to his work stands out, but so do other keywords related to his identity – from both his professional and personal life.
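A sketch of how such keywords can be pulled straight out of search queries embedded in URLs (the parameter names are common conventions, assumed here for illustration):

```python
# Minimal sketch: count search terms found in common URL query parameters.
from collections import Counter
from urllib.parse import urlparse, parse_qs

def search_keywords(history):
    """Tally words appearing in "q"/"query"/"search" URL parameters."""
    counts = Counter()
    for _, url in history:
        params = parse_qs(urlparse(url).query)
        for name in ("q", "query", "search"):
            for value in params.get(name, []):
                counts.update(value.lower().split())
    return counts

sample = [
    (None, "https://duckduckgo.com/?q=tls+vulnerability+disclosure"),
    (None, "https://duckduckgo.com/?q=flights+to+helsinki"),
]
print(search_keywords(sample).most_common(3))
```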

We see that Telekome already knows quite a lot about Sid. Is he a potential customer for insurance companies and travel agencies, or a candidate for management related jobs? What could his next travel destination be? Who are the people he is interested in? But hey, this information doesn’t seem to be very harmful after all! Why should he be worried?

He should be worried because the legislation does not adequately protect users’ metadata. It can easily be used by advertisers, data brokers, or even political campaigns. They can then target Sid according to this data, adding it to what they already knew about him. It might change his consumer behaviour and turn it into profit, or it might just as well change how he votes in the next elections!

Wait, there is more! Because of the nature of Sid’s work, some suspicious words, such as “attack” and “security”, turn up frequently. Combined with the fact that he often travels to various destinations, assumptions about his racial profile, and all the other information that can be gleaned from his browsing metadata, he might end up on a government agency’s watchlist. Sid’s browsing pattern suddenly changes, he is browsing more than usual, he goes to the airport, and out of nowhere, the authorities do not let him board a plane. Something that he planned as a relaxing holiday turns into a nightmare.

Metadata can easily be processed with the use of algorithms, which extract our behavioural patterns and profile us. However, metadata can never give the whole picture of who we are. Assumptions have to be made to compensate for the missing pieces of the puzzle – and they can be wrong.

What does your browsing history say about you? (22.02.2017)
https://edri.org/what-does-your-browsing-history-say-about-you/

SHARE Lab: Browsing Histories – Metadata Explorations
https://labs.rs/en/browsing-histories/

EDRi: Hakuna Metadata – Exploring the browsing history (22.03.2017)
https://edri.org/hakuna-metadata-exploring-the-browsing-history/

Hakuna Metadata (1) – Exploring the browsing history
http://www.privacypies.org/blog/metadata/2017/02/28/hakuna-metadata-1.html

Video: Metadata Explained
https://www.youtube.com/watch?v=xP_e56DsymA

(Contribution by Zarja Protner, EDRi intern, and Siddharth Rao, Ford-Mozilla Open Web Fellow, EDRi)

03 May 2017

Encryption – debunking the myths

By Guest author

How do you send a sensitive message while protecting it from prying eyes? Encrypt it. You think your message is not sensitive, or that no one is spying on you? Encrypt it anyway.

When you send your message encrypted, no one but the intended recipient can read it. Even if someone manages to intercept the message on its way to the recipient, they will not be able to read its contents – all they will see is something that looks like a random set of characters.
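For the technically curious, here is a minimal sketch of that idea using public-key encryption with the PyNaCl library (one library among many; chosen here purely for illustration):

```python
# Minimal sketch: only the intended recipient can decrypt the message.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only public keys are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"See you at 8.")

# An eavesdropper sees only opaque bytes; Bob decrypts with his private key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"See you at 8."
```

Messaging apps such as Signal and WhatsApp automate exactly this kind of key handling, so users never have to see it.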

Encryption is essential for the protection of our digital infrastructure and communications, but it is still burdened by some myths that keep on surfacing in discussions.

1. For spies and geeks only

It is not only spies, criminals and privacy geeks who use encryption. In fact, everyone benefits from it on a daily basis, even if not everyone is aware of it. Encryption not only guarantees the confidentiality of our communications, it also makes our lives easier and enables the digitalisation of society.

Electronic banking? Encryption is what makes our transactions safe and secure. The same goes for the online activities of businesses, which rely on it to protect themselves against fraud. Citizens submit digital tax returns, the intelligence community encrypts state secrets, the army sends orders securely in order to avoid compromising military operations, and civil servants negotiate trade deals by sending messages that only the addressee can read (or they should!). Journalists rely on it to protect their sources and information when investigating confidential or potentially dangerous issues of crime, corruption, or other highly sensitive topics, performing their role as democratic watchdogs. Without encryption ensuring the authenticity, integrity, and confidentiality of information, all of this could be compromised.

2. Who cares?

Encryption enables us to collect information and communicate with others without outside interference. It ensures the confidentiality of our communications, for example with our doctors, lawyers, or partners. It is an increasingly important building block for freedom of expression and respect for privacy. When you achieve privacy through the confidentiality of your communication, you are able to express yourself more freely. People prefer to use messaging apps like Signal and WhatsApp, which protect the privacy of their communications by employing end-to-end encryption. In a survey requested by the European Commission, nine out of ten respondents agreed that they should be able to encrypt their messages and calls so that they can only be read by the intended recipient. No matter whether you are making dinner plans, sharing an intimate message or dealing with state secrets, and whether you are a president, a pop star or an ordinary citizen, the right to control your private communication and protect it from hackers and government surveillance matters.


3. Criminals, terrorists, and the old “privacy versus security”

How do you make sure encryption is not used with bad intentions? Simple – you cannot. But this does not mean it makes sense for governments to weaken encryption in order to fight terrorism and cybercrime. That only opens Pandora’s box: in supposedly making sure that terrorists have no place to hide, we expose ourselves at the same time.

From a technical point of view, encryption cannot be weakened “just a little” without potentially introducing additional vulnerabilities, even if unintentionally. When there is a vulnerability, anyone can take advantage of it – not just the police investigators or intelligence services of a specific country, when necessary. Sooner or later, a secret vulnerability will be cracked by a malicious user, perhaps the very one it was meant to safeguard us from.

Therefore, weakening or banning encryption in order to monitor any person’s communications and activities is a bad idea. The possibilities for criminals to evade government-ordered restrictions on encryption are vast. Knowledge of encryption already exists, and its further development and use cannot be prevented. As a result, only innocent individuals, companies, and governments will suffer from weak encryption standards.


EDRi: Position paper on encryption (25.01.2016)
https://www.edri.org/files/20160125-edri-crypto-position-paper.pdf

EDRi paper: How the internet works, page 6: Encryption
https://edri.org/files/2012EDRiPapers/how_the_internet_works.pdf

Surveillance Self-Defense: What Is Encryption?
https://ssd.eff.org/en/module/what-encryption

Winning the debate on encryption — a 101 guide for politicians (21.04.2017)
https://medium.com/@privacyint/winning-the-debate-on-encryption-a-101-guide-for-politicians-4ff4353d427

(Contribution by Zarja Protner, EDRi intern)

03 May 2017

Member Spotlight: epicenter.works

By Guest author

This is the sixth article of the series “EDRi member in the Spotlight” in which our members have the opportunity to introduce themselves and their work in depth.

Today we introduce our Austrian member epicenter.works.

1. Who are you and what is your organisation’s goal and mission?

We are epicenter.works (formerly AKVorrat.at – Arbeitskreis Vorratsdaten Österreich, Working Group on Data Retention) from Austria, a non-profit organisation committed to the preservation of fundamental rights in the digital age and a pluralistic society.

2. How did it all begin, and how did your organisation develop its work?

In the course of taking action against the data retention law in Austria, a group of lawyers, technicians and concerned citizens founded our NGO. We organised one of the most successful citizens’ initiatives in Austria, collecting 106 067 signatures, and successfully contested the national implementation of the Data Retention Directive before the Austrian Constitutional Court (VfGH), supported by 11 167 plaintiffs. Combined with an Irish case, our complaint led to the complete annulment of this Directive by the Court of Justice of the European Union (CJEU) in April 2014. Because of the massive support from civil society in this case, AKVorrat decided to continue its work of defending civil rights in the digital age. For more than two years, we have been operating a back office with a small number of employees, but our work would not be possible without the many helping hands and volunteers who support us.

Photo: Arbeitskreis Vorratsdaten

3. The biggest opportunity created by advancements in information and communication technology (ICT) is…

… global access to information and the chance to build networks of like-minded people around the globe. Additionally, ICT has the potential to enhance citizen participation, transparency and the democratic accountability of policy and decision-making.

4. The biggest threat created by advancements in information and communication technology is…

… a whole new approach to mass surveillance, and the possibilities it offers for the repression of entire sections of society as well as of individual critical minds. History shows that surveillance technologies carry the potential to be misused not only by autocratic governments, but also by democratic states that generally abide by the rule of law.


5. What are the biggest victories/successes/achievements of your organisation?

Our most important success, benefitting all citizens of the European Union, was the annulment of the Data Retention Directive (2006/24/EC) by the CJEU in 2014, in a joined case with EDRi member Digital Rights Ireland. We are also proud of our leading involvement in the savetheinternet.eu campaign, fighting for strong net neutrality rules in Europe.

6. If your organisation could now change one thing in your country, what would that be?

We would introduce a mandatory impact assessment of new surveillance measures before their implementation in law. This assessment is necessary in order to safeguard our fundamental rights and to return security policy to what is factual and effective.

7. What is the biggest challenge your organisation is currently facing in your country?

In January 2017, the Austrian government published an updated working programme which introduces a whole new set of surveillance measures. These measures range from comprehensive, networked camera surveillance with real-time picture streaming, government malware to monitor encrypted communications, and a new attempt at introducing telecoms data retention, to mandatory registration of prepaid SIM cards and the monitoring of vehicle number plates. The laws implementing these measures are likely to be adopted by the end of 2017, and we are currently running a nationwide campaign to educate citizens on the proposed measures and to convince politicians to respect citizens’ fundamental rights.

8. How can one get in touch with you if they want to help as a volunteer, or donate to support your work?

You can reach us via e-mail, Twitter and Facebook, visit our website, and check out the website of our campaign to prevent a whole new bundle of surveillance measures proposed by the Austrian government.

epicenter.works
https://epicenter.works/

epicenter.works Facebook
https://www.facebook.com/epicenter.works/

epicenter.works Twitter
https://twitter.com/epicenter_works

Stoppt Das Überwachungspaket! (“Stop the Surveillance Package!”) – Campaign website against the surveillance measures proposed by the Austrian Government (only in German)
https://xn--berwachungspaket-izb.at/

(Contribution by EDRi member epicenter.works, Austria)

27 Apr 2017

AVMS Directive: It isn’t censorship if the content is mostly legal, right?

By EDRi

AVMSD – What is it?

The Audiovisual Media Services Directive (AVMSD) was originally designed for satellite TV, where broadcasters are a) in full editorial control and b) content is actively transmitted to viewers. It was subsequently extended to “on-demand” services, where providers a) make an active choice to decide what is made available, but b) where viewers choose what to watch. The plan is now to extend it to video-sharing and (some) social media platforms, where there is a) no editorial control and b) where viewers choose what to watch. In other words, there is almost no similarity between the original purpose and what is now being done. In many ways, this is like regulating a Porsche using legislation designed for regulating a donkey cart.

What about the E-Commerce Directive on service provider liability?

In both the Council of the European Union and the European Parliament, there has been a lot of discussion about whether the AVMSD undermines the E-Commerce Directive, adopted in 2000. That Directive protects freedom of expression by ensuring that internet companies are not unduly incentivised to delete content. It does so by limiting liability to situations where they fail to act diligently upon receipt of a notice of illegality of the content in question.

The Council and the Parliament want a wide variety of content to be regulated – anything that (based on the wisdom of the provider, in the first instance) might impact the physical, mental and moral development of minors. At the same time, video-sharing and (some) social media platforms are expected to restrict content that is an “incitement to violence or hatred” by reference, for example, to sex, racial or ethnic origin, disability, age, or sexual orientation.

The content that the providers will be required to regulate is not, or not necessarily, illegal. As a result, it is argued that this privatised regulation of freedom of expression does not breach the E-Commerce Directive, because the obligation is to regulate legal content. In short, restriction of legal content is not a breach of rules that cover illegal content.

So, how will video-sharing platforms do all of this?

One of the options is for states to regulate freedom of expression by regulating the terms of service of the social media companies and video-sharing platforms. This will allow content to be deleted without ever referring to the law. This fits with other EU instruments, such as the Europol Regulation, which allows police authorities to coerce companies into deleting online content. The Europol Regulation creates the task of “making of referrals of internet content, by which such forms of crime are facilitated, promoted or committed, to the online service providers concerned for their voluntary consideration of the compatibility of the referred internet content with their own terms and conditions.” It does not, however, fit so well with the Charter of Fundamental Rights and the European Convention on Human Rights, both of which require restrictions on fundamental rights to be provided for by clear, predictable law.

Craziest proposal – European Parliament

The craziest part of the Parliament’s proposal is probably importing, ironically from the Charter of Fundamental Rights of the European Union, the list of types of discrimination that the EU Member States are prohibited from imposing. These prohibited types of discrimination then become the list of types of “incitement to hatred” that social media companies should protect us from with their terms of service. So, video-sharing platforms would have to protect people from “incitement to hatred” as a result of “other opinions”. The list makes complete sense in the Charter of Fundamental Rights, and no sense at all in the Directive that regulates audiovisual media services.

Craziest proposal – Council of the European Union

Remarkably, the Council text proposes that video-sharing and social media platforms should regulate live-streamed video. The Council also proposes banning content that is already banned by the Terrorism Directive. The Council’s position before this week’s discussions was leaked by EDRi member Statewatch and is available here.

This is nuts! Are there no voices of sanity?

Yes, just not enough, so far. Seven European Union Member States have expressed serious concerns regarding the proposals to further extend the scope of the AVMSD. They did so in an unpublished joint “non-paper” sent to the EU Council Presidency. The UK has made its reservations known separately. Those seven Member States (Czech Republic, Denmark, Finland, Ireland, Luxembourg, the Netherlands, and Sweden) pointed out the obvious problems of requiring video-sharing platforms to “police” non-illegal content over which they do not have editorial control.

The “non-paper” diplomatically but meaningfully points to the absurdity of the proposal to expand the scope of the Directive to services that could not “reasonably be expected by an end-user to be regulated similarly to audiovisual media services”, such as animated GIFs.

Some of the political groups in the Parliament have been working astonishingly hard to try to achieve even small improvements in the text. Ironically, while the AVMS Directive represents much of what is worst in EU policy-making, the huge efforts made by some politicians behind the scenes on this file represent some of the finest, selfless, thankless work from EU parliamentarians.

AVMS Directive – censorship by coercive comedy confusion
https://edri.org/avms-directive-censorship-coercive-comedy-confusion/

Audiovisual Media Services Directive – is it good enough to be a law?
https://edri.org/audiovisual-media-services-directive-is-it-good-enough-to-be-law/

Revision of the Audiovisual Media Services Directive (AVMSD), 2016 proposal
http://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1464618463840&uri=COM:2016:287:FIN

Europol Regulation
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32016R0794

Council text of 24 April
http://www.statewatch.org/news/2017/apr/eu-council-ams-8242-17.pdf

25 Apr 2017

European Parliament Culture Committee takes strong position against upload filtering

By EDRi

Today, 25 April 2017, the European Parliament Committee on Culture and Education (CULT) voted on the draft Audiovisual Media Services Directive (AVMSD). In a surprise move, the Committee voted to prohibit filtering of uploads by video-sharing platforms. This position, adopted by a majority of 17 to 9, will be the position of the Parliament in its upcoming negotiations with the EU Council, which aim to finalise the text.

“A vote opposing upload filtering sends a strong signal, ahead of negotiations on the Copyright reform,” said Joe McNamee, Executive Director of European Digital Rights. The European Commission proposes mandatory upload filtering in its draft Copyright Directive. “Now that the CULT Committee has wisely taken a position against mandatory filtering, which is a dangerous tool in the fight against incitement to hatred and violence, it would be absurd if they supported upload filtering for copyright reasons,” McNamee continued.

The Committee clearly tried, in months of compromise negotiations, to find common ground between expanding the policing obligations of video-sharing and social media platforms and the protection of citizens’ fundamental rights. Unfortunately, the agreed text is far from perfect, so EDRi will keep working with the EU institutions in the next stages of the process, in order to maximise the protection of fundamental rights.

As amended by the CULT Committee, the AVMSD proposes that internet video-sharing platforms should take measures to protect children from (legal) content that could “impair their physical, mental or moral development”. This is extremely broad and dangerous. Internet video-sharing platforms would also be required to protect the general public from “incitement undermining human dignity”, incitement to terrorism, violence and hatred defined by reference to, among other traits and features, “political or any other opinion”. Requiring companies to, for example, restrict how we express ourselves online to protect society from “incitement to hatred” on the basis of “any other opinion” falls below minimum standards of legal predictability required by the EU Charter of Fundamental Rights.

It might sound like a good idea to protect people from bad things. However, nobody actually knows what the video-sharing platforms are meant to be protecting us from, whether such measures would be counterproductive or not, or the scale of the problems that they are supposed to be fixing. It is unclear why the AVMSD included these measures, as there are other legal instruments that deal with the same issues, such as the Child Exploitation Directive, the Terrorism Directive, and the Europol Regulation. It is also unclear whether the proposed measures would actually protect anyone. What is known is that such measures pose a threat to our freedom of expression, by encouraging video-sharing platforms and social media companies to delete perfectly legal material.

Read more:

AVMS Directive – censorship by coercive comedy confusion (19.04.2017)
https://edri.org/avms-directive-censorship-coercive-comedy-confusion/

German Social Media law – sharp criticism from leading legal expert (19.04.2017)
https://edri.org/german-social-media-law-sharp-criticism-from-leading-legal-expert/

EDRi position paper on the proposed revision of the Audio-Visual Media Services Directive
https://edri.org/files/AVMSD/edrianalysis_20160713.pdf

EDRi’s proposals for amendments to the AVMSD (13.07.2016)
https://edri.org/files/AVMSD/edriamendmentproposals_20160713.pdf

Twitter_tweet_and_follow_banner

close
19 Apr 2017

AVMS Directive – censorship by coercive comedy confusion

By Joe McNamee

On 25 April 2017, the European Parliament Committee on Culture and Education (CULT) will vote on its report on the European Commission’s proposal to revise the Audiovisual Media Services Directive (AVMSD).

To understand just how confused the proposal is, it is worth understanding its history. In 1989, the EU adopted the “Television without Frontiers” Directive, to regulate cross-border satellite TV, covering issues such as jurisdiction and protection of minors. This Directive was out of date very quickly, leading to a revision that was adopted in 1997. That, in turn, was quickly out of date and revised in 2007. Then, in 2010, the EU adopted its fourth revision, this time trying to fit video on demand (VOD) services, such as Netflix, HBO Go, Amazon Video and others, into this legislation. In 2016, the European Commission proposed yet another revision, this time trying to squeeze yet another type of service – video-sharing platforms – into regulation designed in the mid-eighties for satellite TV.


The current proposal, which places even more obligations on video-sharing platforms, is horribly contradictory and unclear. It does contain, however, a reasonable amount of comedy, which is an innovation for the EU institutions. For example, this legislation on “audiovisual” content covers, on the basis of Parliament compromise amendments, “a set of moving images”, which would include, for example, an animated GIF.

Furthermore, it doesn’t cover all online video-sharing. For example, it does not cover video sections of news sites that are “indissociably complementary” to the site (borrowing wording from a Court of Justice of the European Union (CJEU) ruling in the New Media Online case). This means that video content featured on a news website should only be regulated under the Directive if it is not complementary to the journalistic activity of that publisher and is independent of written press articles on the site.

In a further (failed) effort to add to legal certainty, the Parliament’s draft compromise text also seeks to clarify the notion of “user-generated content” by removing from the Commission’s proposal the notion that it has to be user-generated. If the compromise text is adopted, the new definition of “user-generated” video would be “a set of moving images with or without sound constituting an individual item that is uploaded to a video-sharing platform”. This means that to be a “user-generated video”, it would not need to be user-generated nor, indeed, would it need to be a video.

On a more serious note, the proposal requires badly defined video-sharing platforms to take measures to protect children from content that would harm their “physical, mental or moral development” (“moral” having been added by the Parliament to various new parts of the Directive). This involves measures to restrict (undefined) legal content. The European Commission also proposed that the companies should enforce the law on incitement to racism and xenophobia. The Parliament’s suggestion is to extend law enforcement to areas where there is no law – such as incitement to hatred of “a person or group of persons defined by reference to nationality, sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion”. The Parliament also proposes setting up dispute resolution systems to review decisions about which videos should stay online after accusations that they might lead to hatred of a person due to, for example, “any other opinion”. Video-sharing platforms will also need to make sure that video uploaders “declare” whether or not their videos contain advertisements, product placement or sponsored content.

It is clear that the broad restrictions of legal and illegal content that video-sharing platforms are meant to impose will lead to significant levels of removal of legal content, particularly due to the spectacularly unclear scope of their obligations. Restrictions on freedom of communication must, under the Charter of Fundamental Rights of the European Union, be “provided for by law”, be necessary, and genuinely meet objectives of general interest. The Commission’s text failed to achieve this minimum standard, while the draft compromise amendments to be voted on 25 April by the Parliament fall very far short of it. The only possible result of the legal chaos that this will create for video-sharing platforms is the deletion of a large amount of legal content, in order to minimise their exposure to possible state sanctions or other litigation.


Television broadcasting activities: “Television without Frontiers” (TVWF) Directive – Summaries of EU legislation
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=URISERV%3Al24101

Audiovisual Media Services Directive (2010/13/EU)
http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2010:095:0001:0024:en:PDF

Revision of the Audiovisual Media Services Directive (AVMSD), 2016 proposal
https://ec.europa.eu/digital-single-market/en/revision-audiovisual-media-services-directive-avmsd

(Contribution by Joe McNamee, EDRi)

19 Apr 2017

Dangerous myths peddled about data subject access rights

By Guest author

Now that the date on which the General Data Protection Regulation (GDPR) becomes enforceable is rapidly approaching, the European Data Protection Authorities (DPAs) are in the process of clarifying what their shared positions will be on various topics, including profiling. This is done through stakeholder consultation meetings.


During the latest meeting, one of the more contentious issues surrounding profiling turned out to be the transparency requirements regarding the algorithms used for automated decision making and profiling. While industry representatives in general provided constructive input on the various topics, this issue was more challenging. Several industry representatives pushed for a very narrow interpretation of the right of access regarding the logic involved in automated decision making.

The basic argument is that industry has a right to hide the precise details of the calculations used to make decisions that discriminate against individuals. Three points were made in support of claims that the right of information regarding the logic of processing should not extend to disclosing the actual algorithms used:

  1. they would be protected trade secrets;
  2. intellectual property rights would preclude such disclosure;
  3. it would create a moral hazard in case of applications of profiling in fraud prevention.

Regarding the protection of trade secrets, the situation is fairly simple. The Trade Secrets Directive (2016/943/EU), for all its flaws, mentions specifically in its recitals that it shall not affect, among other rights, the right of access for data subjects. Since this Directive has to be implemented by June 2018, there is only a window of a few weeks in which trade secrets protections in some Member States could, theoretically, prejudice data subject access to the logic used in automated decision making. So, for all practical intents and purposes, trade secret legislation cannot be invoked to prevent disclosure of such underlying algorithms.

As far as intellectual property rights are involved, this is even more of a non-issue. The only so-called intellectual property rights that bear relevance here are copyright law and patent law.

Software copyright law does not explicitly cover underlying algorithms, a view that was reiterated in the ruling in the SAS Institute Inc. v World Programming Ltd case (C‑406/10), in which the Court of Justice of the European Union (CJEU) ruled that the functionality of a computer program is not protected by copyright under the Computer Programs Directive (91/250/EEC).

As far as patent law is involved, the European Patent Convention states that “schemes, rules and methods for performing mental acts, playing games or doing business, and programs for computers” shall not be regarded as patentable inventions (Article 52(2)(c)). It would be difficult to argue that the logic for automated decision making in the profiling of personal data is not a method for doing business. A requirement for patent law protection is disclosure of the underlying technology, which makes it even less plausible that patent law could prejudice disclosure of the logic involved in automated decision making. Given that none of the other intellectual property rights even come close to covering the logic of algorithms, it follows that there are no barriers in intellectual property law to the disclosure of the logic used in automated decision making.

Even if there were intellectual property rights covering the underlying logic of software algorithms, it would still not necessarily be a given that these should override data protection legislation. The CJEU has repeatedly found competition law interests to outweigh intellectual property interests in cases where it had to balance the two.

The last argument, that of a moral hazard, may or may not come into play in the context of fraud detection and insurance risk assessment. First of all, the European legislator never made any exception for it in the GDPR; secondly, it can be addressed by disclosing the logic as applied to a specific data subject instead of the general logic as applied to all data subjects affected.

The logical conclusion for DPAs enforcing the GDPR in the future is to treat the aforementioned arguments from parts of industry with a great deal of cynicism. They simply have no basis in EU law and/or reality.

Rejections of data subject access requests to the underlying logic of automated decision making based on “trade secrets” or “intellectual property rights” should be treated by DPAs as violations of the GDPR and addressed accordingly.


The Trade Secrets Directive (2016/943/EU)
http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32016L0943

Ruling of the SAS Institute Inc. v World Programming Ltd case
http://curia.europa.eu/juris/document/document.jsf?text=&docid=122362&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=154228

European Patent Convention
http://www.epo.org/law-practice/legal-texts/html/epc/2016/e/index.html

Insurance: How a simple query could cost you a premium penalty (30.09.2013)
https://www.theguardian.com/money/2013/sep/30/insurance-query-higher-premiums

(Contribution by Walter van Holst, EDRi member Vrijschrift, the Netherlands)
