14 facts: Facebook moderation on threats, bullying, and suicide

Gelo Gonzales


Some Facebook statements from the leaked guidelines have been a source of shock



MANILA, Philippines – The breadth of issues that Facebook tackles in its leaked moderation files is immense enough that observers wonder: is the social network equipped to decide on these matters largely on its own?

Facebook said it hired experts to help shape its moderation policies, but the dissatisfaction arising from the leaks is enough to make one believe that the company doesn’t have things completely under control. (READ: Leaked Facebook files: Crucial numbers to know)

Much of that dissatisfaction and disapproval stems from Facebook’s positions on specific issues. For example, one document advised moderators to ignore images mocking people with disabilities – a stance Facebook told The Guardian it reversed only “in recent months.”

Statements telling moderators to ignore content that matters deeply to certain groups have been a source of shock for those who have read the leaks. Listed below are the key statements from the leaks that show where the social network stands on violent threats, bullying, and suicide.

We’ve also included some general facts on Facebook moderation here to shed more light on the platform’s moderation processes.  

1) Moderators have 3 choices when they receive a reported post: ignore, escalate, or delete

If a moderator can’t decide whether to ignore or delete a reported post, they escalate it to a more senior manager, who makes the call. 

In some instances, a post can be escalated to support agencies and charities, which a team inside Facebook liaises with. This is especially important for posts relating to self-harm or potential suicides, The Guardian said, as the social network can then coordinate smoothly with the proper organization. (READ: Social media ‘death groups’ encouraging teen suicides prompt panic in Russia)

2) Moderators make use of the “single review tool” 

The Facebook-designed tool, a special page meant specifically for moderators, features “a menu of options to help them filter content into silos,” The Guardian revealed. It is designed to speed up the moderation process. The tool has helped, but moderators have reportedly still felt overwhelmed by the volume of content to be reviewed. 

3) Moderators can also ask people posting cruel comments to consider taking their comments down

If the user continues to post cruel comments, Facebook can act by temporarily closing the offending account. 

4) Facebook has a law enforcement response team

The team is responsible for liaising with police and other agencies and authorities that request Facebook’s help. 

5) Moderation manuals are occasionally updated

New versions of the manuals are regularly sent to moderators. Smaller changes that do not require immediate updates to the manual are disseminated to “subject matter experts” who then tell the moderators of the tweaks. The subject matter experts are sort of like team leaders – they oversee the moderators’ work and review their performance regularly. (READ: Uncovered moderation guidelines show Facebook in quagmire)

Regarding moderator turnover, no specific figures are available at the moment, but some of The Guardian‘s sources said the turnover rate is high and that moderators suffer from anxiety or post-traumatic stress disorder to varying degrees.

“A lot of the content is upsetting,” said Facebook. They also said they want to make sure their moderators are mentally and emotionally healthy, and are confident enough to make review decisions.  

6) Facebook will not delete comments or posts mocking people with disabilities – unless a photo accompanies the posts and the posts have been reported.

Facebook allows photo-less mocking remarks about people with disabilities so that others may “engage with or challenge them.” Anonymous profiles that make insensitive remarks or jokes are forced to publish their name next to the remark; otherwise, Facebook unpublishes the page. 

The documents showed that Facebook used to let photos of a person being mocked for having a serious disease or disability stay on the platform. Moderators were told not to delete these. As examples, the files included photos of people with Down’s syndrome being mocked in the captions, which Facebook said did not have to be deleted. The policy was changed “in recent months,” a company spokesman told The Guardian.  

7) Anyone found to have 100,000 followers or fans on any of their social media accounts is considered a “public figure” 

Moderation rules vary for public and private figures. Generally, Facebook sees private individuals as more susceptible to bullying, based on the revealed documents.

Facebook also considers as public figures politicians, journalists, and people who appear in the headline or subhead of “five or more news articles or media pieces within the last two years.”

Monika Bickert, head of global policy management at Facebook, speaking to The Guardian, explained their policy on public figures: “We allow more robust speech around public figures, but we still remove speech about public figures that crosses the line into hate speech, threats, or harassment. There are a number of criteria we use to determine who we consider a public figure.”

8) Some people are excluded from protection if they are “famous or controversial in their own right.” 

The Guardian enumerates some of them: Jesus, the mass murderer Charles Manson, Osama bin Laden, rapists and domestic abusers, political and religious leaders from before 1900, and people who violate hate speech rules.

However, famous people can still be protected in some cases. The documents cite singer Rihanna, who Facebook says “can be mocked for her singing but not for being a victim of domestic violence,” as that may be considered a “cruelty topic” given her past experiences. 

If a reported post says “Rihanna, why are you working with Chris Brown again? Beats me” and it has a photo of Rihanna accompanying the post, then Facebook deletes it. 

Facebook is particular with accompanying photos, at least when it comes to their guidelines on bullying. 

9) Facebook allows sharing videos of physical bullying so long as no other comments are made on the post

Moderators are also told to ignore images of physical bullying, or those that depict non-sexual physical abuse of children under 7, even if the images come with negative comments. But if there is a sadistic or celebratory element to the photos, that’s when Facebook deletes or takes action on them. 

Facebook has stiffer guidelines on child abuse of a sexual nature. 

10) Facebook used to let users livestream attempts at self-harm

Facebook said, in the documents, that they don’t want to “censor or punish people in distress who are attempting suicide” but that they will remove the footage once any and all opportunities to help the person have passed. They have an exception for the rule: they’ll keep the footage if it’s newsworthy. 

Facebook has since revised their policy on this. Now, moderators are asked to take down all videos showing suicide, even when these videos are shared “by someone other than the victim to raise awareness.” The only exception that remains is the newsworthiness of a suicide video. 

On a side note, newsworthiness became an important consideration for Facebook moderation after the company controversially took down an iconic Vietnam War photo that featured a naked girl.

Facebook’s decision not to take down videos of people attempting suicide or other forms of self-harm was guided by the advice of experts, according to the documents.

11) Facebook picks which suicide threats to take action on

Moderators are told to ignore a threat when the intention “is only expressed through hashtags or emoticons,” when the stated method is unlikely to succeed, or when the person says they will kill themselves more than five days in the future. 

12) Facebook takes action on threats of violence only if there is “specificity of language”

Facebook believes that, in general, violent language is “not credible” until it includes enough specific detail to read as a plot to harm a target. Without that “specificity of language,” Facebook treats written violent threats as “not credible” – merely a “violent expression of dislike and frustration.” With this stance, Facebook believes that expressions such as “I’m going to kill you” or “Fuck off and die” are not credible and therefore do not merit action.

13) Facebook believes videos of violent deaths, while disturbing, create awareness.

The documents say that “minors need protection” while “adults need a choice.” 

Videos depicting the violent deaths of humans are marked “disturbing.” They are not deleted automatically but should be “hidden from minors.” 

Facebook’s reasoning is awareness: videos of this sort may help create awareness of “self-harm afflictions and mental illness or war crimes and other important issues,” the documents said. 

14) Animal abuse content may stay on the platform for awareness reasons, but Facebook removes “content that celebrates cruelty against animals.” 

Extremely disturbing imagery of animal abuse – including but not limited to mutilation, torture, and the kicking or beating of an animal – may be marked as disturbing, according to the documents. 

Raising awareness is, once again, Facebook’s reasoning behind the policy. The company said it condemns the abuse, adding that some types of animal abuse content are removed – content that “celebrates cruelty.” – Rappler.com

Gelo Gonzales

Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.