EU Censorship Machine: Legislation as propaganda?
The European Parliament’s Legal Affairs Committee will vote on 20 June on a proposal which will require internet companies to monitor and filter all uploads to web hosting services.
The provisions are so controversial that supporters in the European Parliament have resorted to including purely political – and legally meaningless – “safeguards” in the text as a way of getting the proposal adopted.
For example:
– “the measures referred to in paragraph 1 should not require the identification of individual users and the processing of their personal data.”
The proposal requires internet companies to provide an “effective and expeditious complaints and redress mechanism”. It is logically impossible to build a filtering system that neither identifies users nor processes their personal data yet still allows them to complain when their content is removed: if no record links a deleted upload to the specific person who uploaded it, there is nothing for that person to complain about.
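To make the contradiction concrete, here is a minimal sketch of what a complaints mechanism for filtered uploads would minimally have to store. It is purely illustrative: the names (Upload, Complaint, RedressSystem) are hypothetical and appear nowhere in the proposal. The point is that a complaint can only be verified if the service keeps a record identifying which user uploaded the removed content, i.e. by processing personal data.

```python
from dataclasses import dataclass


@dataclass
class Upload:
    content_id: str
    uploader_id: str  # personal data: identifies who uploaded the content


@dataclass
class Complaint:
    content_id: str
    complainant_id: str


class RedressSystem:
    """Hypothetical complaints mechanism for content removed by an upload filter."""

    def __init__(self) -> None:
        # The filter must keep a record of *who* uploaded each removed item,
        # otherwise no complaint can ever be matched to a removal.
        self.removed: dict[str, Upload] = {}

    def filter_out(self, upload: Upload) -> None:
        self.removed[upload.content_id] = upload

    def file_complaint(self, complaint: Complaint) -> bool:
        # Redress requires verifying that the complainant is the person whose
        # upload was removed, i.e. identifying a user and processing exactly
        # the personal data that paragraph 1 says will not be processed.
        upload = self.removed.get(complaint.content_id)
        return upload is not None and upload.uploader_id == complaint.complainant_id
```

Delete the uploader_id field and file_complaint has nothing left to verify: the promised redress mechanism and the promised absence of user identification cannot both hold.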
– “ensure the non-availability”
This is simply a more convoluted and opaque way of saying “upload filtering”.
– “1.b Members States shall ensure that the implementation of such measures shall be proportionate and strike a balance between the fundamental rights of users and rightholders”.
The Charter of Fundamental Rights binds governments and EU institutions such as the European Commission, not private companies. The “agreements” to block and filter content would be commercial decisions between private parties and therefore outside the reach of fundamental rights legislation.
The Parliament and Member States already agreed (in the recently concluded Audiovisual Media Services Directive) to reject proposals for specific laws to protect fundamental rights in this field.
– “and shall in accordance with Article 15 of Directive 2000/31/EC, where applicable not impose a general obligation on online content sharing service providers to monitor the information which they transmit or store”
Article 15 of Directive 2000/31/EC prohibits Member States from imposing a general obligation on internet companies to monitor the information that they store. This text introduces upload filters only indirectly, in order to circumvent the Charter and the EU courts. The reasoning behind it is that an obligation to enter into a “voluntary” commercial agreement between two private parties “to prevent the availability” of online content will formally respect EU legislation, even though the practices derived from its implementation can only lead to de facto general monitoring of uploads.
– “The definition on online content sharing service providers under this directive does not cover services acting in a non-commercial purpose capacity such as online encyclopaedia, and providers of online services where the content is uploaded with the authorisation of all concerned rightholders, such as educational or scientific repositories.”
The fact that they had to include this text shows how far-reaching the effects of Article 13 can be. The problem with this carve-out is in the details: what counts as “acting in a non-commercial purpose” for foundations that accept donations? Similarly, could such services monetise future uses without losing their “non-commercial” status? Furthermore, these carve-outs (allegedly targeting individual organisations like Wikipedia and GitHub) are written so vaguely that they may not leave sufficient room for those organisations – depending on how each court in each Member State interprets the text – or for future similar services.
The vote is on 20 June. If you want to have your say and tell Parliamentarians what you think about this, go to www.saveyourinternet.eu to find out how.