
Shedding light on the Facebook content moderation centre in Athens

By EDRi · December 4, 2019

Following months of efforts, in early September 2019, EDRi observer Homo Digitalis managed to shed light on a case that concerns each and every Facebook user: a content moderation centre in Athens, Greece, tasked with moderating Facebook ads. Like many other content moderation operations run by virtually unaccountable private companies, it can pose threats to our freedom of expression.

A person claiming to work as a “Facebook content moderator” in Athens contacted Homo Digitalis in February 2019. But how could this be? Facebook has various content moderation centres around the globe, but none of them was known to operate in Athens. It turned out that Tassos (name changed) was indeed one of the hundreds of people working in Athens as content moderators for Facebook. These moderators determine, on Facebook’s behalf, what is inappropriate or misleading and must be deleted from the platform. The particularity of the Athens centre is that it moderates advertisements exclusively, not content posted by individual Facebook users. For Facebook, however, “advertisements” does not only mean ads posted by transnational corporations or prominent newspapers. It includes every post by any professional page on Facebook, including the professional pages of individual lawyers, dentists, journalists, photographers, models, and professionals in general.

Facebook has been operating this centre at least since September 2018 through a subcontractor called Teleperformance, a France-based multinational specialised in providing customer services for other companies, or “outsourced omnichannel customer experience management” in business jargon.

But how did Tassos end up there? By responding to a brief, vague job post. No qualifications were required for the position, other than speaking one of the 25 working languages. The absence of any requirement for experience in content moderation, or even in technology, raises serious concerns about the quality of the selection process.

Tassos said that during a brief interview, he was informed that the job could involve exposure to violent imagery. He went through three weeks of training before starting work, consisting mostly of presentations of Facebook’s policies and practical examples of how to resolve a case. However, Facebook’s policies change frequently and rapidly, with no additional training provided on the new policies. According to Tassos, Facebook also typically took months to respond to questions and doubts that arose around new policies, or did not respond at all. This led to situations in which moderators had no adequate means to properly deal with an advertisement that might have violated a new or amended policy.

Although there was no formal daily quota the moderators had to meet, informally they were expected to screen a hundred posts per hour. Tassos said that as a shift drew to a close, some of his co-workers would simply approve or reject content without further deliberation, just to meet the target. Wrong decisions made under pressure could later introduce biases into the artificial intelligence (AI) system that will allegedly be used to perform the same task for Facebook in the coming years.

Facebook, as part of its business operations, manages the way ads run on its platform. It seems, however, to have chosen some questionable ways of doing so. The current operation of the Athens moderation centre threatens users’ rights, since it may endanger their freedom of expression and freedom of information. It is unacceptable that decisions which could profoundly affect the personal or professional life of Facebook users are made in seconds, by persons with no expertise in conducting a proper assessment.

In addition, the centre’s operations raise concerns about the well-being of moderators, who work without psychological support or supervision. Tassos spoke about the working conditions and the training the moderators received. According to his contract, which he shared with Homo Digitalis, he provided call support and “services primarily in the Facebook customer service department”. Tassos explained that his work did not actually include telephone calls. It did, however, occasionally include depictions of violence. If Tassos was not sure whether to accept or reject a post, he had to consult a long manual, which included the cruellest examples of what is not accepted under Facebook policies. “We would check this manual many times per day. I was dreaming of two dead dogs hung on a tree, which were shown as an example in this manual, every night for months.”

Employees could book a 30-minute session every two weeks with one of the three psychologists available on site. However, most moderators were unwilling to attend such sessions, fearing that their conversations might leak and lead to their dismissal. Tassos was prohibited from discussing his work with anyone outside the organisation and was not allowed to reveal the identity of his employer on social media.

Over the last 14 months, approximately 800 people have gone through the recruitment procedure and been hired by Teleperformance to moderate advertisements for Facebook in Athens. The employees cover a wide range of languages and nationalities (reportedly 25 languages and more than 70 countries). The languages include Greek, English, French, German, Spanish, Italian, Arabic, Turkish, Norwegian, Finnish, Hebrew, and Russian. The operation of this centre therefore concerns every Facebook user in countries where one of these languages is spoken.

Homo Digitalis brought the story to Kathimerini, a prominent newspaper in Greece. Kathimerini had been researching the issue for years, but had not managed to obtain a moderator’s testimony. The night before Kathimerini published a front-page article on the issue, Facebook made a formal statement for the first time, admitting that it does indeed operate a content moderation centre in Athens.

Homo Digitalis
https://www.homodigitalis.gr/en/

Homo Digitalis comments for Facebook’s content moderation center in Greece (20.10.2019)
https://www.homodigitalis.gr/en/posts/4493

Full Story at Kathimerini’s website: Inside Facebook’s moderation hub in Athens
http://www.ekathimerini.com/246279/gallery/ekathimerini/special-report/inside-facebooks-moderation-hub-in-athens

(Contribution by Konstantinos Kakavoulis, EDRi observer Homo Digitalis, Greece)