TikTok feeds can sometimes feel like the Wild West: scroll for long enough and you are likely to stumble onto strange, occasionally dangerous content. Now the company is taking the next step to rein things in. In a blog post on Wednesday, TikTok announced a new content rating system called “Content Levels,” an early version of which it plans to roll out “in the coming weeks.” TikTok hinted in February that it was considering age-based feed restrictions; Content Levels provides the first details on what that might entail.
Users will also get more control over their feeds, including the ability to selectively mute hashtags. Although TikTok said the system is comparable to the rating schemes used in the film, television, and gaming industries, it won’t start displaying ratings on clips right away; the sorting and filtering will happen in the background instead. “A maturity score will be assigned to the video to help prevent those under 18 from viewing it across the TikTok experience,” the company stated.
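TikTok hasn’t published technical details, but conceptually this works like a background filter: each video carries a score, and feeds served to under-18 accounts silently drop anything above a maturity threshold. Here is a minimal illustrative sketch of that idea; the class, function, threshold, and scoring scale are all hypothetical, not anything TikTok has confirmed.

```python
from dataclasses import dataclass

# Hypothetical threshold: scores at or above this are hidden from minors.
# TikTok has not disclosed how scores are computed or what scale it uses.
MATURE_THRESHOLD = 0.7

@dataclass
class Video:
    video_id: str
    maturity_score: float  # assigned in the background, never shown in the UI

def filter_feed(videos: list[Video], viewer_age: int) -> list[Video]:
    """Drop videos scored as mature when the viewer is under 18.

    The rating itself never surfaces to the user; it only shapes
    which videos the feed returns.
    """
    if viewer_age >= 18:
        return videos
    return [v for v in videos if v.maturity_score < MATURE_THRESHOLD]

# Example: a 16-year-old's feed silently omits the high-scoring clip.
feed = [Video("a1", 0.2), Video("b2", 0.9), Video("c3", 0.5)]
print([v.video_id for v in filter_feed(feed, viewer_age=16)])  # ['a1', 'c3']
```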
That score comes into play “when we detect that a video contains mature or complex themes, for example, fictional scenes that may be too frightening or intense for younger audiences.” The company said it has prioritized improving safety for teens first and intends to add new functionality in the coming months to give all users more thorough content filtering options.
TikTok stressed in its announcement that the incoming moderation system is still in its early stages. “We also recognize that what we’re attempting to accomplish is complex, and we may make mistakes,” the company wrote. In the meantime, while comprehensive, top-down, age-based filtering is still on the way, users can set restrictions of their own: hashtags and words can now be muted in the “For You” and “Following” feeds, making scrolling somewhat more curated than before. The platform said these changes, along with additional efforts to diversify recommended videos, would arrive in the coming weeks.
The platform already has content policies in place: it prohibits certain types of videos, enforced through user reports and reviews by staff tasked with screening posts. In March, two former TikTok moderators sued the company, claiming that the work of removing violent and other inappropriate videos traumatized them and that TikTok failed to adequately protect moderators or provide mental health services. That dispute could complicate the app’s planned expansion of moderation.
Parents who claim their children were harmed or even killed by TikTok content are also suing the company. In May, the mother of a 10-year-old girl filed a lawsuit alleging that her child died of asphyxiation while attempting the “Blackout Challenge,” which was popularized on the app. More parents brought similar lawsuits this month, and new legislation in California may open the door to parental lawsuits alleging social media addiction.