
ENDitorial: Facebook can never get it right

In 2017, a man posted live footage on Facebook of a murder he was committing. The platform decides whether you get to see this shocking footage or not – an incredibly tricky decision to make. And not really the kind of decision we want Facebook to be in charge of at all.

By EDRi · November 21, 2018



I didn’t actually see the much-discussed footage of the murder – and I really don’t feel the need to see it. The footage is undoubtedly shocking, and watching it would leave me feeling very uncomfortable. When I close my eyes, I unfortunately have no trouble conjuring up the picture of Michael Brown after he had just been shot. Or the footage of the beheading of journalist James Foley. The thought of it is enough to make me sick.

Should these kinds of images be readily available? I certainly can’t think of a straightforward answer to this question. I would even argue that those who claim to know the answer are overlooking a lot of the finer nuances involved. The images will, of course, cause the bereaved family and friends immense pain. Every time one pops up somewhere, they will have to go through it all over again. You wouldn’t want people to stumble across such images by accident either: not everyone will be affected equally, but the images are distressing nonetheless. No one remains indifferent.

That said, I still have to admit that visuals are sometimes essential in getting a serious problem across. A while back I offered a journalist some information that we both agreed was newsworthy, and we also agreed it was important to bring it to people’s attention. Even so, his words were: “You’ve got a smoking gun, but where is the dead body?”. I didn’t realise then that this sometimes needs to be taken very literally. Sometimes, the photographs or footage of an awful event can act as a catalyst for change.

Without iconic images such as the one of Michael Brown’s body, discrimination by police in the United States might not have been given much attention. And we would probably never have seen reports on the countless mass demonstrations that ensued. The fact that we haven’t forgotten the Vietnam War has something to do with a single seminal photograph. Had we never seen these images, they could never have made such a lasting impact, and the terrible events that caused them would not be as fresh in our collective memory today.

I have no doubt that these images sometimes need to be accessible – the question is when. When is it okay to post something? Should images be shared straight away or not for a while? With or without context, blurred or in high definition? And perhaps most importantly: who gets to decide? Right now, an incredible amount of power lies with Facebook. The company controls the availability of news items for a huge group of users. That power comes with an immense responsibility. I wouldn’t like to be in Facebook’s shoes, as Facebook can never get it right. There is always going to be someone among those two billion users who will take offence, and for legitimate reasons.

But there, for me at least, lies part of the problem – and maybe also part of the solution. Facebook decides for its users what they get to see or not. Many of the questions floating around about Facebook’s policy would weigh less heavily on people’s minds if Facebook weren’t making important decisions on behalf of its users, and if instead users themselves were in control. The problem would be less worrisome if users actually were given a choice.

One way to make that possible is to go back to a system where you can choose between a large variety of providers of similar services. Not one Facebook, but dozens of Facebooks, each with its own profile. Compare it with the newspapers of the past. Some people were satisfied with a subscription to The New York Times while others felt more at home with The Sun. And where the editors of one newspaper would include certain images on its front page, the editors of another newspaper would make a different choice. As a reader, you could choose what you subscribed to.

But even without such fundamental changes to the way the internet is set up, users might be able to get more of a say – for instance if they can do more to manage their flood of incoming messages. Get rid of certain topics, or favour messages with a different kind of tone. Or prioritise messages from a specific source if they are the only ones writing about something. Users may not even have to make all those decisions by themselves if instead they can rely on trusted curators to make a selection for them. And even though that sounds quite straightforward, it really isn’t. That one interface has to accommodate those same two billion users, and shouldn’t create any new problems – like introducing a filter bubble.

So what we’re supposed to do about that shocking murder footage, I really don’t know. There is no straightforward and definite answer to that question. But one thing is very clear: it is not a good idea to leave those kinds of decisions to a major tech company that holds too much power and does not necessarily share our interests. One way out would be to give users more of a choice, and consequently more control, over what they see online.

Facebook can never get it right (20.11.2018)
https://www.bitsoffreedom.nl/2018/11/20/facebook-can-never-get-it-right/

Bits of Freedom
https://www.bitsoffreedom.nl/

(Contribution by Rejo Zenger, EDRi member Bits of Freedom, the Netherlands; translation from Dutch by Marleen Masselink)
