Tagging fake news on Facebook has minimal effect – study

Gelo Gonzales

Some may even become more likely to misconstrue fake news as true, the Yale University study says

MANILA, Philippines – A method employed by Facebook to control the spread of fake news is barely working, according to a Yale University study.

The study, first reported by Politico, said that the scheme of tagging false posts as “disputed by third party fact-checkers” only improved people’s ability to correctly identify headlines as false by 3.7%.

What’s worse is that some groups, such as people under 26 and Trump supporters, may end up believing fake news even more. The researchers noted that because of the sheer volume of fake news coming in, it’s impossible for the fact-checking partner groups – Snopes, PolitiFact, and FactCheck.org – to analyze all of it.

The result is that some fake news gets tagged and some doesn’t. Stories that slip past the fact-checkers without a disputed tag then appear to be true relative to the disputed stories. “All of these effects are tiny. Even to the extent it’s doing anything, it’s a small effect. It’s not nearly enough to solve this problem,” the study’s author, psychologist David Rand, said.

The study is co-authored by Gordon Pennycook, also a psychologist. Posted on the online research network SSRN, the study has not yet been peer-reviewed.

Facebook was critical of the study’s methodology, saying that it was done through an online survey and not on Facebook’s platform, where the researchers could have ensured that every participant was a Facebook user. “This is an opt-in study of people being paid to respond to survey questions; it is not real data from people using Facebook,” a Facebook spokesperson told Politico.

Facebook also said that it has done other things to fight fake news, including removing the ability of pages that share fake news to advertise on Facebook, deleting fake pages when an election nears, and publishing a detailed report on how information operations on Facebook work.

But if there’s one thing that fact-checkers hope Facebook would do, it’s this: share more data and information pertinent to the war against fake news. Alexios Mantzarlis, the director of the Poynter Institute’s International Fact-Checking Network, told Politico, “I’m hoping Facebook will see this study and determine that it is even more appropriate for them to share data as to how this is actually going.”

Study findings

The study involved 7,500 people, who were asked to rate the accuracy of a number of headlines from 2016 and 2017, some true and some false. Participants correctly judged real news stories as accurate 59.2% of the time, while they believed false stories to be true 18.5% of the time.

The study then showed another set of headlines to another group, but this time, the researchers flagged some of the false headlines with the “disputed” tag. The change was negligible: people identified real and fake news only slightly more accurately, nothing significant given the size of the problem. What’s worse is that after the addition of the “disputed” tags, some groups, such as Trump supporters and those aged 18 to 25, became more likely to misconstrue fake stories as true.

While the improvements are minimal, Poynter Institute’s Mantzarlis is optimistic: “What I stress to people who are looking at whether they want to do this type of journalism is that any percent is a good percent, any correction is worth the work. Would it have been better if this found that it had a larger effect for fact checking? Yeah, it would have been a more encouraging sign.”

“Because this is the first research on the topic, I’m not yet ready to say this is not worth the time,” he told Politico.

Counterpoint

Buzzfeed provides a counterpoint to the argument, saying that the presence of fact-checkers and the disputed tag has an effect beyond shaping people’s perception of what’s true and what’s not.

Mantzarlis told the website: “The [disputed label] is almost more valuable in terms of reduced reach than in terms of consequences of users understanding of the individual item.” Items slapped with the disputed tag immediately have their reach cut down; they don’t spread on Facebook as freely as they would have without the tag.

Along with that, all the data coming in from fact-checkers labelling the items goes into Facebook’s huge database of false news, said Buzzfeed. That data is valuable because it can shape future algorithms for Facebook and its News Feed.

Simply put, the more Facebook knows about what fake news looks like, the better it can shape its algorithm to rid the platform of fake news – ideally. The fact that there are expert third-party checkers doing the job should give users confidence that Facebook will one day solve the problem of fake news on its platform. In effect, the fact-checkers serve as mentors to the automated smarts behind Facebook.

Users who tag news stories as false also provide data to Facebook’s database.

Beyond individual news stories, all the data that fact-checking generates may also help Facebook identify the websites that share low-quality content. Catching entire sites that peddle low-quality content and having them down-ranked on the News Feed would expedite the fake news filtering process for Facebook.

Last year, Facebook shut down the team of human curators for its Trending section, and the section went on to promote false stories to users. This year, the social network appears to have learned its lesson: it now uses both human-led analysis and algorithms to craft the News Feed. – Rappler.com


Gelo Gonzales

Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.