Leaked Facebook files: Crucial numbers to know

Gelo Gonzales


These figures show you the enormity of the task that the house that Zuckerberg built now faces

MANILA, Philippines – The recently leaked Facebook moderation guidelines and documents exposed weaknesses in Facebook’s content moderation procedures.

First, the sheer volume of content reports and the breadth of issues moderators have to deal with are drowning moderating teams. Second, the complexity of the issues has resulted in occasional flip-flopping from Facebook, leaving inconsistencies and holes in the guidelines. These are two big reasons why many have expressed great dissatisfaction with Facebook’s current practices.

One of them is Claire Lilley, head of child safety online at the UK-based National Society for the Prevention of Cruelty to Children (NSPCC), who questioned Facebook’s guidelines relating to child abuse: “[Facebook] shouldn’t get to decide what’s in the best interests of children or the public. If something needs to be investigated or prosecuted and the perpetrators of that crime brought to light, that’s not for Facebook to make the call.”

Facebook has acknowledged responsibility and has said they are “proactively keeping the site safe.” But unlike with their internal guidelines, the judgment will not come solely from themselves: it’s the governments, the users, and the relevant organizations that will decide if they’ve indeed kept the site safe.

The numbers below illustrate the current state of Facebook content moderation:

1.94 billion – the number of users on Facebook at the end of March 2017, making it a melting pot of cultures, upbringings, and sensibilities. The sheer number of people on the platform highlights the grave importance of the guidelines governing it – a significant portion of the world’s population is being shaped by what Facebook allows and what it doesn’t. 

4,500 – the number of moderators deciding which content to delete and which to keep on Facebook. Doing the math, that’s 431,111 users per moderator. Moderators have said they are overwhelmed by the task.

6.5 million – the number of reports per week relating to potentially fake accounts alone. These are the accounts that people report as fake, which the moderators then review. Along with fake account reports, moderators review content relating to child abuse, terrorism, revenge pornography, animal abuse, bullying, sexual activity, and threats of violence, among others. 

54,000 – the number of potential cases of revenge pornography and “sextortion” moderators had to review in one month. Facebook defines revenge pornography as the use of explicit images to shame, humiliate, or gain revenge against an individual. Sextortion is the use of such images to extort money, more intimate images, or other demands.

14,000 – the number of accounts disabled in January 2017 relating to revenge pornography and sextortion. Thirty-three of the cases reviewed involved children.

3,000 – the number of moderators Facebook is planning to add over the year. Facebook chief Mark Zuckerberg made the announcement after a man broadcast himself killing his 11-month-old daughter in April 2017. The planned addition will bring the number of moderators to 7,500, meaning the ratio becomes one moderator per 258,667 users.
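For readers who want to check the two ratios above, here is a minimal sketch in Python that reproduces the arithmetic from the article’s own rounded figures:

    # Reproduce the moderator-to-user ratios cited above.
    # All figures are the article's own (rounded).
    users = 1_940_000_000                # 1.94 billion users, end of March 2017
    moderators_current = 4_500           # moderators today
    moderators_planned = moderators_current + 3_000  # after the announced hires

    print(round(users / moderators_current))  # 431111 users per moderator
    print(round(users / moderators_planned))  # 258667 users per moderator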

2 weeks – the amount of training time new moderators undergo. The manuals for moderating are devised by Facebook executives at the company headquarters in Menlo Park, California.

10 seconds – the amount of time moderators typically have to decide on each piece of reported content.

100,000 – the number of followers a user must have to be considered a public figure. Facebook makes the distinction between a private person and public figure; moderation rules for the two vary. 

Famous people and controversial individuals count as public figures, a category that may include politicians, journalists, or people whose names or titles appear in the “title or subtitle of five or more news articles or media pieces within the last two years.” That is in addition to people who meet the 100,000-follower criterion.

One example mentioned in the documents is singer Rihanna, who Facebook says “can be mocked for her singing but not for being a victim of domestic violence,” as that may be considered a “cruelty topic” given her past experiences. 

Another document states that “certain people who are famous or controversial in their own right” don’t deserve Facebook’s protection. These, as enumerated by The Guardian, include “Jesus, the mass murderer Charles Manson, Osama bin Laden, rapists and domestic abusers, any political and religious leaders before 1900 and people who violate hate speech rules.”

4,531 – the number of reports of self-harm moderators had to escalate to managers in one two-week span in 2016. Sixty-three were escalated to Facebook’s law enforcement response team, which coordinates help efforts with police and other authorities.  

Self-harm reports are higher in 2017: two separate two-week periods logged 5,106 and 5,431 reports, respectively. But Facebook also grew from 1.65 billion users to 1.94 billion year-on-year.

Facebook’s secret moderation guidelines were revealed to the public on Monday, May 22, Philippine time, by The Guardian. Facebook had already been under pressure from governments in the US and the UK, where some parties are demanding that they be regulated like a traditional media outfit and fined for allowing content generally seen as negative or damaging to stay on the platform.

The publication of the guidelines has put Facebook on the back foot as they face added pressure to reassess their processes. – Rappler.com




Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.