Social media as censors of the public sphere: YouTube vs. the Ombudsman
In early September 2016, a popular Norwegian writer was suspended from Facebook for posting the iconic “Napalm girl” photograph, which the platform flagged as “child nudity”. The matter escalated into an international incident involving Norway’s largest printed newspaper and the country’s prime minister. Finally, the writer’s Facebook status was restored, the suspension was lifted, and Facebook promised to do better.
One month before the “Norwegian incident”, a similar problem involving social networks’ automated services occurred in Serbia. The Ombudsman, Saša Janković, reported on Twitter that his YouTube channel had been suspended. The notification suggested that the suspension was the result of other users’ reports. Janković clarified that, after his associates sent an appeal, an automated reply arrived on 12 August, stating that the account would remain suspended. Again, no explanation was given as to the grounds on which the content had been flagged and removed; the reply merely referred to the service’s Community Guidelines and Terms of Service.
Meanwhile, the email account used for uploading videos to YouTube was blocked as well, and the provider responded to the appeal in a similar way, pointing to general rules banning hate speech, threats and the like. The local Twitter community noticed that right after Janković posted the information about the YouTube suspension, trolls “knew” what it was about.
Saša Janković used the YouTube channel for reposting TV shows and news reports on cases the Ombudsman’s office was involved with. The channel was not regularly updated and the uploaded videos had fairly low numbers of views, meaning that the wider public had little or no knowledge of this means of communicating the Ombudsman’s activities. Furthermore, since its content consisted of TV broadcast segments produced under strict national laws, the channel could not have contained offensive or violent material.
The legal and policy director of SHARE Foundation, an EDRi observer member, contacted the EDRi Brussels office to facilitate communication with the Brussels policy team of Google, the owner of YouTube. Google’s feedback was that videos and channels are never removed from YouTube automatically, no matter how often they are flagged: reports are reviewed by a team that deals with each one individually, and users should receive notifications upon both suspension and reinstatement. The lack of the latter in the Ombudsman’s case was deemed a mistake, not standard practice.
Google offered the following clarification of their procedures regarding YouTube:
“Users whose accounts have been terminated are prohibited from accessing, possessing or creating any other YouTube accounts. When an account is terminated, the account owner will receive an email detailing the reason for the suspension. If you feel an account has been suspended in error, you may appeal the suspension by visiting the following form. We work quickly to remove channels in instances of abuse, or reinstate videos or channels that have been suspended in error, and users should receive notifications during both suspension and reinstatement. Flagged videos and channels are not automatically taken down by the flagging system.”
The YouTube Help Center pages on account terminations offer links to the full Community Guidelines and Terms of Service, and point out that appeals can be submitted following the described procedure. The flagging system is also explained in detail. According to the latest data published by Google, over 90 million people have flagged videos on YouTube in the last ten years, while in 2015 alone some 92 million videos were removed.
Several conclusions can be drawn from this story. First, the system on which suspensions and removals are based is not transparent: it is not clear how decisions are made or why an appeal is rejected. The limits of freedom of speech in the online sphere are set by powerful global platforms. This leads to an absurd situation in which the Ombudsman, responsible for protecting the right to freedom of expression of citizens in Serbia, cannot protect his own right to freely receive and impart information on the internet when confronted with YouTube’s human and algorithmic censors.
Meanwhile, Facebook removed yet another historical image, which suggests that Facebook is not doing “better” after its failure in the “Napalm girl” case.
Copyright, hate speech and child pornography provisions, translated into the language of algorithms or into general guidelines for untrained human moderators, have become tools for the abuse of rights through internet censorship. Securing freedom of speech and the free exchange of information requires much more than “terms of service”.
Although the internet is in fact a private space, it is particularly worrisome that the rights and freedoms of netizens depend on unclear rules and mechanisms, apparently implemented arbitrarily by private companies. Those rules all too often prevail over national and international regulations. Despite the efforts of IT companies to make their practices more transparent and accountable to users, such as transparency reports, many questions remain as to what can be posted on our Facebook or YouTube accounts. That is why it is important to support the initiatives of activists and advocates for digital rights in Europe and in the US, so that policies for the removal of user-generated content are brought into line with legal norms and human rights standards in the digital environment.
This article was originally published on http://www.shareconference.net/en/defense/social-media-editors-public-sphere-youtube-vs-ombudsman.
Open letter to Facebook (09.09.2016)
http://www.aftenposten.no/meninger/kommentar/Dear-Mark-I-am-writing-this-to-inform-you-that-I-shall-not-comply-with-your-requirement-to-remove-this-picture-604156b.html
Facebook had no right to edit history (09.09.2016)
https://www.theguardian.com/commentisfree/2016/sep/09/facebook-napalm-vietnamese-deleted-norway
EDRi: New documents reveal the truth behind the Hate Speech Code (07.09.2016)
https://edri.org/new-documents-reveal-truth-behind-hate-speech-code/
(Contribution by SHARE Foundation, Serbia)