TikTok is refreshing the rules and standards that govern everything and everyone on the app amid threats of a nationwide ban in the US.
The short-form video giant said it is sharing the changes it has made to its community principles to better inform the public about the internal decisions that build trust and keep the app safe. The principles, which guide how content on the app is moderated, are based on the company’s commitment to uphold human rights and align with international legal frameworks.
TikTok notes it worked with more than 100 organizations around the world on the new guidelines, including the International Association for Suicide Prevention and TikTok’s Southeast Asia Safety Advisory Council, as well as members of the community.
As part of the refreshed guidelines, content creators are now required to clearly disclose when a piece of media that depicts a realistic scene has been altered or manipulated by artificial intelligence (AI) tools. They can do so by indicating “synthetic” or “fake” in the caption or adding a sticker.
“We balance the expressive value that synthetic media has against the risk of harms to individuals,” reads the refreshed guidelines. “We do not allow synthetic media that contains the likeness of any real private figure.”
TikTok additionally said it’s adding “tribe” as a protected attribute in its hate speech and hateful behavior policies.
It also shared more details on how it plans to protect civic and election integrity, including its new approach on political accounts.
“We do not allow paid political promotion, political advertising, or fundraising by politicians and political parties (for themselves or others),” TikTok said. The company added that it also won’t allow misinformation about civic and electoral processes, regardless of intent.
The refreshed guidelines take effect on April 21, and TikTok plans to train moderators to enforce the new rules and standards as they roll out over the coming months.
Apart from this, TikTok laid out the four pillars that it claims guide its approach to content moderation. The first is to remove violative content. The second is to age-restrict mature content so that only users 18 years or older can view it. The third is to maintain the recommendation algorithm’s eligibility criteria so that the content it pushes is appropriate for a broad audience. Lastly, TikTok wants to empower the community with tools and resources to control their own experience on the app.
“We’re proud to be sharing these refreshed Community Guidelines offering our community much more transparency about our rules and how we enforce them,” said TikTok in a press briefing.
The changes in TikTok’s community guidelines come ahead of CEO Shou Zi Chew’s testimony in front of the US House of Representatives Energy and Commerce Committee on Thursday, March 23. The tech executive is expected to address US lawmakers’ security concerns, including how the company handles the data of US users.
TikTok is facing the threat of a nationwide ban in the US over its ownership. The app, which now has over 150 million active users in the US, is owned by Beijing-based tech giant ByteDance, a connection which has led lawmakers to accuse it of feeding data to the Chinese government.