Contribution by: Leanna Garfield, Digital Engagement Lead, Access Now. The article was first published by EDRi member Access Now.
The documents paint a grim picture of how Facebook has repeatedly and knowingly put profit ahead of people’s safety. From Facebook’s role in the U.S. Capitol insurrection, to its system for sorting the world into “tiers,” to its disregard for clear evidence its platform was causing harm, here is just a snapshot of what we know from the Facebook Papers so far.
Facebook sorts the world into “tiers.”
Internal documents show how Facebook decides which countries will receive higher levels of monitoring and moderation, and where investment in people’s safety doesn’t make it into the budget, The Verge reports. They also reveal where the company has been slow to respond to controversial content when it believes it may be politically costly to do so. Only eight countries made the priority list. The rest of the world didn’t make the cut.
Facebook increasingly suppresses political movements it deems dangerous.
Facebook employees have also complained about the company’s ad hoc approach to content moderation. According to The Wall Street Journal, Facebook is essentially “playing whack-a-mole with movements it deems dangerous,” straying from its public commitments to preparedness and neutrality.
Facebook lets politicians “bend rules.”
Facebook’s senior executives stepped in to allow U.S. politicians and celebrities to post whatever they wanted on the platform, even though employees stressed that they were breaking Facebook’s own rules, The Financial Times reports. Much of that content helped spread misinformation and other harmful material. There have also been past reports of this happening outside the U.S., although Facebook said it ended the practice in June 2021.
Facebook’s language gaps allow hate speech to flourish.
Internal documents show that Facebook still doesn’t have enough moderators who speak languages other than English and understand cultural contexts outside of the U.S., says The Associated Press. In Afghanistan and Myanmar, for example, these gaps have allowed hate speech toward marginalised groups to thrive on the platform. In Syria and Palestine, meanwhile, Facebook suppresses ordinary speech, imposing overreaching bans on common words, as well as on posts from activists facing violence.
Facebook knew it was being used to incite violence in Ethiopia.
Facebook employees repeatedly criticised the company for failing to limit the spread of posts inciting violence in Ethiopia, where a war has continued for the past year. CNN reports that they warned managers about how “problematic actors” were using Facebook to spread content that seeded calls for violence in the region. The company did close to nothing. A similar story is playing out in India.
Bureaucracy and technical glitches slowed Facebook’s U.S. Capitol riot response.
When supporters of Donald Trump tried to stop Joe Biden from being declared president, Facebook employees say the company’s response was inadequate and too slow. Internally, critics complained that Facebook didn’t have a real game plan for addressing harmful election disinformation, according to The Washington Post. And some plans, like a change that would have prevented Groups from changing their names to things like “Stop the Steal,” faced technical problems.
Facebook CEO Mark Zuckerberg repeatedly chose growth over safety.
According to a series of complaints, Zuckerberg made countless decisions and statements that ultimately put people at risk. The Washington Post notes: “The company’s use of ‘growth-hacking’ tactics, such as tagging people in photos and buying lists of email addresses, was key to achieving its remarkable size.” In some countries with repressive regimes, Zuckerberg has also agreed to increase censorship of anti-government posts.
On Monday, Zuckerberg began Facebook’s quarterly earnings call by addressing the wave of news coverage surrounding the Facebook Papers.
“Good faith criticism helps us get better, but my view is that we are seeing a coordinated effort to selectively use leaked documents to paint a false picture of our company,” he said. “The reality is that we have an open culture that encourages discussion and research on our work so we can make progress on many complex issues that are not specific to just us.”
In many places around the world, Facebook’s actions (and inaction) can have deadly consequences. The Facebook Papers reveal that it’s time to regulate Facebook and other Big Tech companies. That starts with passing a comprehensive U.S. federal data protection law. Not only would that help protect our right to privacy, it would prevent companies from harvesting the data that is weaponised against us, a practice fuelled by profit-at-any-cost business models.