Misinformation in chatrooms

Facebook policy gaps leave Messenger users vulnerable to false information

Vernise Tantuco


Despite efforts by chat apps to curb disinformation, rumors continue to circulate unchecked on these platforms amid the COVID-19 pandemic, potentially contributing to real-world harm.
At a glance:
  • WhatsApp is the most popular chat app in the world, but, in the Philippines, Facebook Messenger is the chat app of choice. WhatsApp ranks only 4th in the country by number of active users.
  • Messenger has been slow to roll out features to curb the spread of misinformation and disinformation, and the rollouts are staggered. These features are available in only 8 countries, excluding the Philippines and 68 other countries where Messenger outstrips WhatsApp in popularity.
  • Unscrupulous users are exploiting these gaps in implementation to get around Facebook’s policies on mis- and disinformation.

Facebook personality Lynn Agno, aka “Lynn Channel,” has been speaking about COVID-19 in videos live-streamed through her Facebook page since late March 2020. She usually discusses conspiracy theories or unverified claims. 

Her content has been flagged by Facebook’s 3rd-party fact checkers at least 4 times, which means Facebook has put warning labels on her misleading posts and has notified those who interacted with them. As a repeat offender, her page has faced restrictions, including having its distribution reduced and its ability to monetize or advertise removed.  

Agno is aware of Facebook’s mis- and disinformation policies. In one post, she said she was sure she would get fact-checked again but she shared the content nonetheless. 

Misinformation is misleading content shared with no intent to harm, while disinformation is shared with a deliberate intent to deceive or harm.

On Agno’s personal wall, followers suggest ways to get around Facebook’s fact-checking policies and restrictions. On a post that was flagged by the United States-based fact-checking website Lead Stories, one Facebook user tells her to “create a group chat instead.” Other followers urge her to “click and save to phone instead, then send through messenger.”

‘MAKE A GROUP CHAT.’ A netizen tells social media personality Lynn Agno, aka Lynn Channel, to start a group chat to share information that has been debunked by Facebook’s 3rd-party fact checkers.
‘SAVE TO PHONE.’ A netizen advises others to share on Messenger information that has been debunked by Facebook’s 3rd-party fact checkers.

This conversation exposes one critical gap in Facebook’s misinformation and disinformation policy, which is clearly known to users: little is being done to curb the spread of misleading information on Messenger, the messaging app linked to the main Facebook app. 

This is important because apart from being linked to the main Facebook platform, Messenger is also the second most popular messaging app globally. In the Philippines and in 73 other countries, it is the most popular messaging app.

Growing exponentially

Use of Messenger, like that of WhatsApp, the other messaging service owned by Facebook, has grown several times over since 2014. (See the graph below.)

Since WhatsApp was acquired by Facebook in 2014, its users worldwide have quadrupled. Globally, WhatsApp had two billion active users as of 2021, according to Hootsuite and We Are Social’s annual digital reports. Up until 2017, the number of active WhatsApp and Facebook users differed only by a few hundred million.

Also in 2014, Facebook separated the chat function from its main app, redirecting users to download a separate Messenger app. Since then, global use of the Messenger app has also tripled.

In the Philippines, Messenger has been the chat app of choice for the past 6 years, and its use in the country has also grown exponentially. The chart below shows growth in the use of specific chat apps among internet users in the Philippines since 2015. Messenger use has quadrupled since 2015. 

Comparatively, use of other messaging apps among Philippine internet users has not grown significantly.

Data from SimilarWeb as of Tuesday, February 2, also show that Messenger is the number one app in the Philippines on the Google Play Store based on Usage Rank. Usage Rank, according to SimilarWeb, is an algorithm that is based on current installs and active users.

Given the growing number of messaging app users and the way some users exploit gaps to get around Facebook’s mis- and disinformation policies, the impact of current fact-checking efforts could be significantly undermined if no urgent action is taken.

Slow action, staggered rollouts

Below is a timeline of when 3 chat apps – WhatsApp, Messenger, and Viber – began putting in place features to minimize the spread of misinformation and disinformation on their platforms.

Action against misinformation and disinformation has been more aggressive on WhatsApp since 2018. That same year, WhatsApp introduced labels to indicate when a message has been forwarded and limited the number of times users can forward a message to multiple chats at once. 

In August 2019, they introduced a label to show that a message has been forwarded through a chain of 5 or more people. In April 2020, they announced that these kinds of messages could be forwarded to only one chat at a time.

These same policies have been introduced at a much slower pace in Facebook Messenger.

Messenger began to label forwarded messages in April 2019, almost a year after WhatsApp first began to act. These labels only say that the message has been forwarded and do not identify messages that have been forwarded through a chain of 5 or more people.

“We’re committed to ensuring everyone has access to accurate information, and removing harmful content related to COVID-19 across our family of apps,” a Facebook representative said in an email to Rappler in September 2020.

“We recently introduced a forwarding limit on Messenger, so messages can only be forwarded to five people or groups at a time. Limiting forwarding is an effective way to slow the spread of viral misinformation and harmful content that has the potential to cause real world harm,” the representative said. 

As of November 2, 2020, however, this feature remained unavailable in the Philippines. 

When Facebook announced the feature’s release on September 3, it did not mention that the rollout covered only 8 countries. A Facebook representative specified in an email to Rappler that the forward limit had been made available in the US, New Zealand, Croatia, Sri Lanka, Chile, Tunisia, Australia, and Myanmar since August 27, 2020. A majority of these countries had elections in 2020.

Limits to fact-checking

Facebook has 3rd-party fact-checking partners in most countries where Messenger is popular. These include Canada, Australia, New Zealand, Iraq, Libya, Egypt, Norway, and Greece. In other countries like Afghanistan and South Sudan, they do not have partners.

Unfortunately, even though information spreads differently on public social media channels than on chat apps, Facebook counts its efforts against misinformation and disinformation on its main platform – such as 3rd-party fact checking and removing content that may lead to real-world harm – as part of its efforts on Messenger.

Fact-checking posts on Facebook, however, does not stop misleading and potentially harmful messages from spreading on Messenger.

In an email to Rappler, a Facebook spokesperson explained that the Messenger and WhatsApp platforms have different user bases and features, making it challenging to standardize approaches to misinformation and disinformation across the board. 

Facebook said it had run separate tests on forward limits – the number of chats a message can be forwarded to at once – to assess their impact on Messenger, while exploring other options to limit the spread of misinformation and disinformation.

Facebook’s fact-checking program labels posts as false or misleading, reduces the distribution of those posts, and sanctions repeat-offender pages by limiting their ability to monetize.

Thus, while Facebook takes action against the screenshots of false messages shared publicly on the platform, it stops there. On Messenger, the messages can continue to spread unchecked. Users are still free to take screenshots of photos, download videos, or forward existing content to their friends and family.

Real world harm

Slower action and staggered rollouts might cause harm in countries where Messenger is the preferred chat app. Hootsuite and We Are Social’s data show that Facebook Messenger is preferred in 74 countries. Their data is based on the average daily Android app users in each country in December 2019. 

Among the countries that prefer Messenger over WhatsApp is the US, where Facebook said it was cracking down on mis- and disinformation in the run-up to the November 2020 presidential elections.

Myanmar, where Facebook was used to incite hate speech against Rohingya Muslims in 2018, is another country that prefers Messenger. It held a general election on November 8, 2020. At the time of writing, the country had experienced a coup d’état that ousted the elected civilian government, and the new military government had blocked access to Facebook.

Messenger implemented a forward limit in both of these countries.

Even before the COVID-19 pandemic, Rappler had spotted false claims on the platform.

Amid measles outbreaks in 2018, 3 different people emailed Rappler about a forwarded message warning others not to receive tetanus vaccines at health centers. The hoax claimed that members of the Islamic State (ISIS) were spreading AIDS and killing people through these shots. 

Rappler fact-checked the claim on December 21, 2018, or 10 days after Congress extended martial law in Mindanao, which was originally declared after ISIS-affiliated extremists clashed with government troops in October 2017. 

Amid the COVID-19 pandemic, Rappler fact-checked a number of forwarded messages that caused panic.

In January 2020, at least 5 rumors circulated claiming there were COVID-19-positive patients in hospitals or an office building. In March, a list of hotels and malls to avoid circulated – these were supposedly places that 11 COVID-19-positive patients had frequented.

Also circulating through Messenger were supposed cures for the virus, which ranged from ingesting aspirin dissolved in lemon juice boiled with honey to eating boiled garlic and drinking the water it was boiled in.

In June, a conspiracy theory about COVID-19 spread on Messenger, telling recipients not to get vaccinated against the disease once a vaccine became available in the Philippines. The vaccine, the message claimed, would be a way to forcibly inject microchips into citizens.

As in the case of the Dengvaxia scare, this could affect how people respond to vaccination when the COVID-19 vaccines are finally available.

All of these illustrate the need for Facebook to act immediately, to prevent real-world harm. – with Gemma Bagayaua-Mendoza/Rappler.com


Vernise Tantuco

Vernise Tantuco is on Rappler's Research Team, fact checking suspicious claims, wrangling data, and telling stories that need to be heard.