Mourning parents asked TikTok for age verification, got maturity ratings instead

TikTok’s safety features recently became the focus of a lawsuit filed by parents who claim that the app’s addictive design is responsible for the deaths of at least seven children, six of whom were too young to be on TikTok. Those parents urged the platform to protect young users by adding an age verification process that could restrict content or terminate the accounts of users under 13, the minimum age required to join TikTok.

That’s not the direction TikTok has decided to go, though. At least, not yet. Instead, TikTok announced on Wednesday that it is adding new safety measures for all users designed to limit exposure to harmful content and give users more control over what shows up in their feeds. That includes giving users the power to block content containing certain words, hashtags, or sounds.

Specifically focusing on improving safety measures for TikTok’s “teenage community members,” TikTok is also “working to build a new system to organize content based on thematic maturity,” essentially creating maturity ratings for TikTok videos, like the ratings you see on movies or video games.

“In the coming weeks, we’ll begin to introduce an early version to help prevent content with overtly mature themes from reaching audiences between ages 13-17,” TikTok’s Head of Trust and Safety Cormac Keenan wrote in the blog post.

Additionally, TikTok provided an update on the previously announced steps it is taking with its algorithm to protect users from endlessly scrolling through “potentially challenging or triggering viewing experiences.”

“Last year, we began testing ways to avoid recommending a series of similar content on topics that may be fine as a single video but potentially problematic if viewed repeatedly, such as topics related to dieting, extreme fitness, sadness, and other well-being topics,” Keenan wrote. “We’ve also been testing ways to recognize if our system may inadvertently be recommending a narrower range of content to a viewer.”

Likely because TikTok’s community standards limit accounts to users ages 13 and up, the company has so far only discussed the desired impact of the new safety features on adult and teen users. That means TikTok has yet to address concerns that parents have about the “hundreds of thousands of children as young as 6 years old” who, the lawsuit alleges, TikTok knows “are currently using its social media product” without safety features designed just for them.

TikTok did not immediately respond to a request for comment on the new safety features or on how maturity ratings will work. It’s unclear whether any future safety features, like age verification, are planned to address growing concerns about children under 13 reportedly being harmed while using the app.
