A privately managed public space?
- Our “public spaces” online where we meet each other, organise, or speak about social issues, are often controlled and dominated by private companies (platforms like Facebook and YouTube).
- Pushing platforms to decide which opinions we may and may not express will not solve major problems in our society.
- The EU rules on online content moderation will soon be reviewed. To protect our right to freedom of expression, we need to make sure these updated rules do not encourage online platforms to over-remove content in order to avoid being taken to court.
Your video on YouTube got removed, without warning. Or the page you manage on Facebook was blocked because your posts breached the “community standards”. You’ve sent messages to the platform to sort this out, but there’s no reply, and you have no way of getting your content back online. Maybe you’ve experienced this? If not, you surely know someone who has.
The internet is a great place – a sort of “public space” where everyone has equal possibilities to share their ideas, creations, and knowledge. However, the websites and platforms where we most frequently hang out, share and communicate, like Facebook, Twitter, Instagram or YouTube, are not actually public spaces. They are spaces controlled by private businesses, with private business interests. That’s why your page got blocked, and your video removed.
Everyone should be free to express their opinions and views, even if not everyone likes those opinions, as long as they aren’t breaking any laws. The problem is that the private businesses dominating our “public spaces” online would rather delete anything that looks even remotely risky for them (a potential copyright infringement, for example). There are also financial interests: these businesses exist to make profit, and if certain content doesn’t please their advertising clients, they will likely limit its visibility on their platform. And they can easily do so, because they can use their arbitrary “terms of service” or “community standards” as cover, without having to justify their decisions to anyone. This is why it shouldn’t be left to online companies to decide what is illegal and what is not.
There’s an increasing trend of pushing online platforms to do more about “harmful” content and to take more responsibility. However, obliging platforms to remove content is not going to solve the problems of online hate speech, violence, or the polarisation of our societies. Rather than fiddling around trying to treat the symptoms, the focus should be on addressing the underlying societal problems.
Whenever content is taken down, there’s always a risk that our freedom to express our opinions is being limited in an unjustified way. It is, however, better that decisions about what you can and cannot say are made based on law than on the interests of a profit-seeking company.
There are rules in place that limit online companies’ legal responsibility for the content users post or upload on their platforms. One of them is the EU E-Commerce Directive. To update the rules on how online services should deal with illegal and “harmful” content, the new European Commission will likely soon review it and replace it with a new set of rules: the Digital Services Act (DSA). To ensure we keep our right to freedom of expression, we need to make sure these updated rules will not encourage online platforms to over-remove content.
When dealing with videos, texts, memes and other content online, we need a nuanced approach that treats different types of content differently. What do you think the future of freedom of expression online should look like?
E-Commerce review: Opening Pandora’s box? (20.06.2019)
Facebook and Google’s pervasive surveillance poses an unprecedented danger to human rights (21.11.2019)
LGBTQ YouTubers are suing YouTube over alleged discrimination (14.08.2019)
(Contribution by Heini Järvinen, EDRi)