WATCH: 'Fake news' and the dilemma it has created
MANILA, Philippines – The term "fake news" has been thrown around in recent years, especially in politically charged debates on social media.
But what really is "fake news"? How can we spot it? What can we do to address it?
To answer these questions, Rappler invited Clarissa David, professor at the University of the Philippines (UP) College of Mass Communication, to discuss the matter.
"Fake news," said David, is an issue that has been a growing concern over the last few years. "It has clearly escalated into a scale that is difficult to manage."
While it's not a new issue, David said that we are now dealing with an information ecosystem "that allows for faster spread of information and many barriers to any corrective measures."
David said that "fake news" has become a catch-all term for many kinds of objectionable content. "But many of these are not 'fake news.' It is counter-productive, in fact, to use the term."
For instance, parodies and satirical posts "can easily be mistaken for news, but it is not at all news." Bad or sloppy reporting "is not really fake," added David, "and is often unintentional in its effect to mislead."
Posts with intentionally deceptive messages and hyper-partisan opinion pieces are highly problematic, but the ones that really qualify as "fake news" are those built on manufactured or fabricated information. This, said David, is the core of the problem, but not the only one.
"To call all of these 'fake news' and then to seek to regulate it is a dangerous proposition, the proverbial slippery slope," she added. (WATCH: Senate hearing on fake news online)
'Eyes of the beholder'
David then argued that the "fakeness" of news is in the eyes of the beholder.
"The facts may be the same, but the interpretations are vastly different. Both sides will call each other 'fake news'," said David.
There are also efforts to undermine legitimate news organizations, she added. "While this is a global phenomenon, our market in the Philippines has characteristics that make it especially vulnerable."
Facebook has contributed to the problem, she continued. "It has facilitated the blurring of the lines between news and anything else." (WATCH Rappler Talk: Clarissa David on fake news and disinformation)
She also observed that sharing of disinformation on Facebook "is easy and has no consequences for the person sharing it."
There is also the issue, said David, of public officials or news sources themselves lying. "It is most dangerous when the lie is quoted without question in the headline, when we all know that the vast majority of people will not likely read the text to find out if these are corroborated with facts."
News outfits, she said, should come up with ways to handle these scenarios, like pointing out the lie in the headline.
Next, David spoke of the intent behind problematic content, distinguishing between misinformation and disinformation, with the latter being the bigger problem and the one that should be met with aggressive resistance.
"Disinformation is designed to mislead; it is purposely spread in a strategic way to persuade. Misinformation, on the other hand, is fueled by ignorance, not malice."
But David argued that both are present in recent online posts. "At the source and initial spread, it is disinformation, but the wider it spreads, the sharing is no longer intentional. It is people genuinely believing they are spreading truth."
Making, spreading content
David then spoke of the two processes at play in this issue: content generation and dissemination.
The Senate inquiry into "fake news", said David, has focused on content producers like bloggers and influencers. "What we haven't been talking about is the second half of this problem, the distribution side or the social media platform."
She described the phenomenon of "echo chambers" in online engagement.
"Facebook learns what you agree with, based on your engagement metrics," she added. "Eventually, you only see content that agrees with you."
Problematic content spreads more easily on Facebook because "these are shared by trusted friends, the news feed is agnostic to brand, it makes no distinction between news and opinion, and the whole ecosystem is rigged to encourage liking, commenting, and re-posting," said David.
Adding to the problem, said David, are appeals to extreme negative emotions that lead to polarized opinions.
If these problems are not fixed, David argued, trust in the enterprise of journalism will erode. "If that goes, and we need real information, who can we then rely on?" she asked.
What to do
In the end, David listed down concrete steps to address these problems.
First, efforts on media literacy in schools should be increased. (READ: Schools 'responsible' for making sure graduates can detect fake news)
Web or mobile apps that carry only legitimate news outlets and vetted media institutions, as well as plug-ins that flag suspicious content, could also be used.
"Strategic promotion of legitimate news content needs to catch up with the strategy that's being applied from the other side. And we have to enlist the support of key influential champions in the media industry to address the problem head-on and lobby the [online] platforms to intervene," David added.
The news industry's reliance on Facebook to deliver content could also be reduced.
David then urged the rejection of limitations on free speech and freedom of the press. "If the solution that they're going to propose to us is legislation that would regulate our speech, then that cannot be acceptable."
"Each new wave of technological innovation in communication brings a flood of darkness, then the public will eventually sniff out the bad stuff and demand the good stuff," David said.
"This is the key: creating a demand for real, truthful, factual reporting." – Rappler.com