Facebook disputes claims it retains extreme content for money

In a new documentary by Channel 4 in the UK, former Facebook investor Roger McNamee says it's the extreme, dangerous content that's the most engaging – something he says Facebook stands to profit from

MANILA, Philippines – Facebook’s head of public policy, Richard Allan, contested claims that the company stood to benefit from allowing extreme, disturbing, and highly questionable content to stay on Facebook instead of having it removed. 

“Shocking content does not make us more money, that’s just a misunderstanding of how the system works. People come to Facebook for a safe, secure experience to share content with their family and friends,” Allan said in a new documentary by the long-running investigative show, Dispatches, from the United Kingdom (UK). 

“There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material. But I just don’t agree that that is the experience that most people want and that’s not the experience we’re trying to deliver,” he said.

The statements are a response to several claims made in the same documentary, which is anchored on the findings of an undercover reporter embedded in March at CPL Resources, Facebook’s largest content moderation center in the UK. (READ: Uncovered moderation guidelines show Facebook in quagmire)

As far as third-party content moderation is concerned, the findings and statements collected from CPL Resources employees are damning, and run counter to Allan’s stance. They imply either that somewhere in the chain from company to subcontractor to moderator, moderation rules are not being communicated clearly enough, or that Facebook truly wants its moderation this way, for financial gain, as one critic would have it. 

That critic is Roger McNamee – a former Facebook investor who has also been described as a mentor to Mark Zuckerberg – who was quoted in the same show comparing extreme content on Facebook to a highly addictive stimulant:

“From Facebook’s point of view this is…the crack cocaine of their product. It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook has learned that the people on the extremes are the really valuable ones because one person on either extreme can often provoke 50 or 100 other people and so they want as much extreme content as they can get.

“[It] understood that it was desirable to have people spend more time on site if you’re going to have an advertising-based business,” McNamee also said. (READ: 14 facts: Facebook moderation on threats, bullying, and suicide)

McNamee is also among the founders of the Center For Humane Technology – an alliance of Silicon Valley veterans, including a former Facebook operations manager and the creator of Facebook’s ‘Like’ button, that seeks to challenge some practices of Big Tech. 

Moderator statements

One CPL Resources employee was quoted telling the undercover Dispatches reporter that some violent content is left on Facebook because “if you start censoring too much then people lose interest in the platform. It’s all about making money at the end of the day.”

Another moderator said that one page with a large following couldn’t be deleted right away even after it had broken a Facebook rule requiring that a page be deleted once it has posted 5 or more pieces of content violating Facebook’s codes. The reason, said the moderator, is that “they have a lot of followers so they’re generating a lot of revenue for Facebook.”

The page in question is Britain First, described as a far-right political group with two million followers. Moderators said it violated Facebook content policies at least 5 times, but wasn’t taken down because it was “shielded.” 

“It just had too many fans so it was shielded. Generally shielded, it’s like you just need to be more careful so if a page has a lot of followers you don’t want to willy-nilly just take it down,” the moderator said.

Being “shielded” means a page is under Facebook’s “shielded review,” according to the TV report. A page under “shielded review” can’t be taken down even after it has made 5 or more content violations, as determined by the third-party moderator. Instead, the case is escalated to Facebook itself, which then decides whether or not to take the page down. 

Tommy Robinson, another page with known far-right leanings, was also identified as being shielded. It currently has around 900,000 followers and remains active as of writing. “Don’t worry too much about deleting their stuff because those pages are shielded so if you delete a video or whatever, you haven’t deleted Tommy Robinson’s video. It just goes straight to shielded review,” said another CPL moderator.

Other pages that go under shielded review include those of governments and news organizations. 

“I want to be clear this is not a discussion about money. This is a discussion about political speech. People are debating very sensitive issues on Facebook, including issues like immigration. And that political debate can be entirely legitimate. I do think having extra reviewers on that when the debate is taking place absolutely makes sense and I think people would expect us to be careful and cautious before we take down their political speech,” Allan also said in the Dispatches documentary. 

Despite the statements from Facebook, other questionable content remains on the platform, as shown in the documentary, including a video of a boy being beaten by a grown man; a meme showing a girl whose head is dipped underwater with the caption “when your daughter’s first crush is a little negro boy”; and a comment aimed at Muslim immigrants, saying, “f**k off back to your own countries”.

Allan, in media statements responding to the documentary, admitted that the company has made mistakes, and said it is “providing additional training” and that it will “remove content from Facebook no matter who posts it.”  – Rappler.com