
Facebook updates tools for protecting teens from suspicious adults

Gelo Gonzales

FACEBOOK. A smartphone with Facebook's logo is seen with the company's new rebrand logo, Meta, in this illustration taken October 28, 2021

Dado Ruvic/Reuters

Young users under 16 years old, or under 18 in other countries, will be prevented from messaging 'suspicious adults' they are not connected to

MANILA, Philippines – Facebook parent company Meta on Monday, November 21, US time, announced that it has updated its privacy and safety tools to protect teens from “potentially suspicious adults.”

In 2021, adults were restricted from messaging teens they aren’t connected to (users 16 years old and below in some countries, and 18 and below in others), and from seeing teens in their “People You May Know” recommendations.

The update expands the feature so that teens themselves will not be able to message suspicious adults they are not connected to, and will no longer see those accounts in their recommended connections.

Meta defines a “suspicious” account as “one that belongs to an adult that may have recently been blocked or reported by a young person.” Meta also said that it is testing removing the message button altogether on teens’ Instagram accounts when viewed by suspicious adults. The company will also add new automatic notifications and reporting options to make it easier for targeted young people to take appropriate action.

Teens will also be given more private settings by default, and will be shown messages encouraging them to keep their settings private, including who can see their friends list, who can see what they follow, who can comment on their posts, and who can see posts they are tagged in.

The company also announced that it was working with the National Center for Missing and Exploited Children (NCMEC) to “build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent,” similar to what it has built for adults. The platform appears to be open to the wider industry, as other tech companies will be able to use it.

The company said it was also working with Thorn, an anti-human trafficking nonprofit, and its “NoFiltr” brand to create educational materials meant to lessen the shame and stigma around intimate images, and to empower teens to seek help and regain control if their images have been shared or if they are experiencing sextortion.

In March 2022, Wired published an article in which a researcher, while searching for numbers such as 11, 12, and 13 as part of their disinformation research, found Facebook groups seeking out children as young as 11 years old. One of the groups had about 9,000 members. The researcher initially reported the group through the platform’s tools, but it was not taken down because Facebook said it had found no community standards violations. The group was removed only after the researcher personally reached out to a contact at Facebook.

Researchers also found posts in Filipino labeled “hanap jowa” (looking for a partner), which led to an article about Filipino Reddit users’ efforts to get “hanap jowa” groups removed.

Wired also cited recent research saying that about 25% of 9- to 12-year-olds report being sexually solicited online, with 1 in 8 having been asked to send a nude image or video, and 1 in 10 having been asked to join a sexually explicit livestream. – Rappler.com

Gelo Gonzales is Rappler’s technology editor. He covers consumer electronics, social media, emerging tech, and video games.