Uncovered moderation guidelines show Facebook in quagmire

Gelo Gonzales


The Guardian shows the complexity Facebook faces in moderating sensitive content

MANILA, Philippines – On Monday, May 22, The Guardian published Facebook’s secret content moderation guidelines covering sensitive material such as revenge porn, violence, child abuse, animal abuse, and terrorism, among others.

The publication said it saw “more than 100 internal training manuals, spreadsheets and flowcharts” that throw the doors wide open on how Facebook moderates the content found on its platform, illustrating the complexity of the moderation task on its hands.

At the end of March 2017, the social network had 1.94 billion monthly active users – just a few hundred million short of hosting a third of the world’s population. For reports relating to potentially fake accounts alone, one document says that Facebook reviews more than 6.5 million. The 4,500 moderators – a small portion of them in-house, most of them subcontractors – often have “just 10 seconds” to make a decision on each report, one source told The Guardian.

The short time moderators have to judge content may be contributing to the controversies Facebook sometimes finds itself in, whether for censoring content or for letting material stay on the platform that some deem unfit for a general audience.

In 2016, Facebook received criticism for removing an iconic war photo because it showed a naked girl, and, in another case of censorship, for taking down a cancer awareness video featuring animated women with circle-shaped breasts. (Read: How Facebook algorithms impact democracy)

In 2017, Facebook was in the news again, this time for content that some say should be removed quickly: online videos of killings. (Read: Online videos of killings pose tricky problem for Facebook)

To its credit, Facebook has either publicly acknowledged these issues as real problems it has to fix or apologized for them.

Regulation demands, free speech concerns

Facebook has corralled a significant portion of the world’s population onto its platform, and it’s not a reach to say it is now the biggest town square the world has ever seen.

Before the rise of social media, the internet felt like the days of the Wild West in the United States – unmarshalled, unregulated, wide open, and just messy. The same traits can be observed in Facebook now – except there is a sheriff in town, citizens know him as Mark Zuckerberg, and he currently appears to have a big headache, thanks in no small part to the loud demands to keep things in order.

The uncovered documents show a portion of the measures Zuckerberg and company are employing to keep things in order. What the leak also showed is that moderators are overwhelmed by the influx of content they have to review, and that, given Facebook’s recurring appearances in the news for content it removed or allowed to stay, the company has not yet completely figured out where to draw the line between what to censor and what to keep.

Because of the latter, Facebook tends to ruffle feathers on both ends of the spectrum: those who say it should be regulated like other mainstream broadcasters and publishers, or that it should sweep away all sensitive content; and, on the other end, free speech advocates.

The business of moderating content – whether on Facebook and its pages, Twitter, news sites, or even Google, where fake news websites attempt to game the search engine – is tricky. The sheer number of Facebook users, multiplied by their diverse upbringings, cultural backgrounds, and the types of content they can publish, makes things even trickier.

One of The Guardian’s sources summed up the problem: “Facebook cannot keep control of its content,” the source said. “It has grown too big, too quickly.”

Facebook’s head of global policy management, Monika Bickert, offered another perspective on the problem in remarks quoted by the same UK news outfit: the difficulty of reaching consensus given the community’s diversity.

She said, “We have a really diverse global community and people are going to have very different ideas about what is OK to share. No matter where you draw the line there are always going to be some grey areas. For instance, the line between satire and humor and inappropriate content is sometimes very grey. It is very difficult to decide whether some things belong on the site or not.” – Rappler.com

Comment below or email technology@rappler.com for your thoughts. 



Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.