The new model aimed at creating a safer environment for younger users

According to TikTok, the updated model is more accurate and efficient at identifying content that may not be appropriate for all viewers. This is especially important given the platform’s popularity among teenagers. By identifying and removing this type of content more effectively, TikTok aims to create a safer, more family-friendly environment for its younger audience. “We’re making progress to reduce the prevalence of borderline or suggestive content recommendations overall and are now launching the next iteration of our borderline suggestive model which we expect to improve detection of such content, therefore creating a more appropriate and comfortable experience for teen account holders,” TikTok said in a blog post.

Challenges in detecting and removing borderline content

While other platforms like Instagram have also tried to filter borderline content out of recommendations, automated systems have historically struggled to consistently detect content with “mature” themes that stops short of explicit nudity, because the line between “borderline suggestive” and acceptable content can be subjective. The company will need to ensure that the new model is fair and doesn’t mistakenly remove content that many users would find acceptable. TikTok did not say how much more accurate the new system is, but the company said it has “prevented teen accounts from viewing over 1 million overtly sexually suggestive videos” in the last 30 days.