A pattern is emerging. After blocking a controversial video, YouTube nonpologises for doing so and reinstates the video… only to block it again a few months later. The procedures around content moderation need to improve, but that's not all: more fundamental change is needed.
In June 2018, EDRi member Bits of Freedom reported that YouTube had already taken down the accounts of Women on Waves, a Dutch pro-choice NGO, three times in 2018, each time without proper justification. As if that wasn't ridiculous enough, the account was taken down a fourth time just as the organisation was being interviewed by the Dutch television programme Nieuwsuur about the previous takedowns, again without notice and without a satisfactory explanation. YouTube subsequently did what it has done many times before: it issued a nonpology and reinstated the account. Experience suggests it is a question of when, not if, the account gets removed again.
It's odd that an account can be wrongfully blocked several times over the course of just a few months. One would expect that, after an account has been wrongfully blocked once or, at worst, twice, a warning would be triggered so that an additional person reviews the decision as soon as the account is again recommended for blocking. At best, however, this would only prevent the most obvious mistakes. Even with a properly functioning process for blocking videos or accounts, there will always be controversies: the company will never be able to prevent the occasional moderation error.
YouTube holds a near-monopoly on uploading and watching videos, and it has a huge reach. Every decision YouTube makes about whether a video can be accessed through its platform can have an enormous impact. This becomes especially clear with videos that deal with controversial topics; Nieuwsuur gives a few examples: bodily integrity, sexual freedom, and cannabis. Of course, you'll always be able to find someone somewhere in the world who has a problem with these topics, which is probably why YouTube bans certain videos about them upfront and quickly removes others as soon as someone complains. Videos and accounts disappear if one or more viewers report them as offensive, or if YouTube's computers detect certain images or combinations of words.
This puts everyone in a tough position: the creator, the viewer, and the platform itself. Creators see their videos disappear from the internet from time to time and can't do anything about it. Viewers can't watch the videos they want to watch, regardless of their feelings about certain topics. Platforms will never be able to please everyone; opinions will continue to differ. Moreover, due to public and political pressure, a company can no longer decide for itself how to run its platform.
The only solution to all this lies in ensuring that everyone, whether uploader, viewer, or platform, has options to choose from. The only way to do that is to ensure that multiple platforms exist side by side, each with its own interests, considerations, and audience. That enables creators to choose the platform that fits them best. As a viewer, you can choose a platform that is as open-minded as you are. And the platform can go back to making its own decisions about what it deems acceptable and what it doesn't.
And the beauty of it all: in this scenario, the procedures for moderating content become less crucial. If a platform handles complaints sloppily, users can simply switch to a better-functioning alternative, because they aren't dependent on that particular platform.
YouTube puts uploaders, viewers and itself in a tough position (25.10.2018)
Women on Waves’ three YouTube suspensions this year show yet again that we can’t let internet companies police our speech (28.06.2018)
YouTube censors Dutch organizations’ videos (only in Dutch)
(Contribution by Rejo Zenger, EDRi member Bits of Freedom, the Netherlands)