MANILA, Philippines – Child pornography concerns led Apple to pull messaging apps Telegram and Telegram X last week.
The report from 9to5Mac cites an email from Phil Schiller, Apple’s Senior Vice President of Worldwide Marketing, who said they removed Telegram due to “illegal content, specifically child pornography, in the apps.”
Schiller explained, “After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”
9to5Mac added that the child pornography was likely being served through a third-party plug-in used by Telegram.
The App Store team then worked with Telegram’s development team to remove the content from its apps, in addition to banning the users who posted the content.
Apple reinstated the apps on its storefront only after the content and offending users were removed, and after additional controls were put in place to prevent similar incidents from recurring.
Said Schiller, “We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.” – Rappler.com