
Child porn thriving in WhatsApp due to insufficient moderation

Rappler.com

With 1.5 billion users and just 300 employees, WhatsApp has seen child porn slip through its content moderation systems

MANILA, Philippines – WhatsApp has a child porn problem, as reported by TechCrunch Friday, December 21. 

The messaging app owned by Facebook has been discovered to be home to a considerable number of chat groups that circulate child porn among their members.

The discovery was made by two Israeli non-governmental organizations (NGOs), Screen Savers and Netivei Reshe. They became aware of the issue in July 2018, when a man reported having seen hardcore porn on the app.

Later, in October, the NGOs found more than 10 child pornography groups, whose content they monitored, as well as the apps being used to find those groups.

Another organization, AntiToxin Technologies, was cited by TechCrunch as having identified more than 1,300 videos and photos of minors involved in sexual acts in WhatsApp groups.

Chat groups on WhatsApp can be found through an invite link, usually sent to someone by a friend. The number of members is also limited to 256 because, according to WhatsApp, limiting group size discourages abuse.

The NGOs discovered that people were able to find the groups through third-party apps that list chat groups by category. One of the categories in these apps was “Adult,” which contained links to both legal pornography and illegal child pornography.

One such app, “Group Links For Whats by Lisa Studio,” has since been removed from the Google Play Store and Apple’s App Store, TechCrunch reports. Some of the groups listed on the now-removed app included “child porn only no adv,” “child porn xvideos,” and “videos cp,” with “cp” being an abbreviation for child pornography.

Lacking moderation

These group-discovery apps, along with two other factors, help explain why child porn is able to spread on WhatsApp.

First, the app offers end-to-end encryption, a valuable privacy feature that means only the sender and the intended recipient or recipients can read a message; it cannot be deciphered as it travels between them. However, that also locks out WhatsApp itself, which means its systems cannot scan message content for child pornography.
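
To illustrate the idea in general terms, here is a minimal sketch of public-key, end-to-end encryption using the PyNaCl library. This is only a conceptual illustration, not WhatsApp's own implementation (which is built on the Signal protocol); the point is that a server relaying the message only ever sees ciphertext and has no key to decrypt it.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only the public halves are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"hello, bob")

# A relaying server sees only `ciphertext`. Without either private key,
# it cannot decrypt the message, so it cannot scan its content.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello, bob"
```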

WhatsApp can check a user’s profile photo, a group’s profile photo, and group information, and see if these elements involve child pornography. But it cannot look inside chat threads to see what is being shared there.

Second, TechCrunch reports that WhatsApp’s manual moderation is severely lacking. Manual moderation could have identified the offending groups and put a stop to them. Unfortunately, WhatsApp has 1.5 billion users and just 300 employees, only a portion of whom handle content moderation. Facebook, which has 20,000 content moderators, does not help with WhatsApp moderation, TechCrunch says. More human moderators could help address the problem, but only if the parent company allocates resources to the messaging app.

WhatsApp told TechCrunch it is investigating the groups mentioned in the reports, which have also been reported to Israeli police. WhatsApp also said it banned 130,000 accounts in a recent 10-day period for violating its policies against child exploitation. – Rappler.com
