
TikTok Allegedly Directed 13-Year-Old Users Toward Pornographic Content, Report Finds

Oct 6, 2025 | Technology

A new investigation by the UK-based watchdog Global Witness has found that TikTok may have steered young users toward sexually explicit content through its search suggestions. The report, published on October 3, raises concerns about the platform's handling of age-appropriate content and online safety.

Global Witness created seven TikTok accounts in the UK, posing as 13-year-olds—the minimum age required to use the platform—on factory-reset phones with no search history. The investigation found that TikTok’s suggested search terms were “highly sexualized,” even for users browsing in “restricted mode,” which is intended to limit exposure to content that may not be appropriate for all audiences.

The findings come amid increasing scrutiny over child safety online in both the UK and the US, as TikTok faces lawsuits alleging harm to young users’ mental health.

A TikTok spokesperson responded, stating, “As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies, and launch improvements to our search suggestion feature.” The company emphasized that it uses over 50 safety features and settings to protect teens and that nine in ten violative videos are removed before being viewed.

Despite these measures, the report indicated that sexualized search suggestions appeared the very first time users clicked the search bar for three of the test accounts, with pornographic content surfacing after just a few clicks. “Our point isn’t just that TikTok shows pornographic content to minors. It is that TikTok’s search algorithms actively push minors towards pornographic content,” the report stated.

TikTok’s community guidelines prohibit nudity, sexual activity, sexual services, and sexually suggestive content involving minors. According to TikTok’s transparency report covering January through March 2025, about 30% of removed content was flagged due to sensitive or mature themes.

The investigation follows the UK’s Online Safety Act 2023, which went into effect in late July and imposes rules requiring platforms to implement safeguards such as age verification to prevent children from accessing harmful content. Media lawyer Mark Stephens described the findings as “a clear breach” of the act.

TikTok stated it approaches compliance with the Online Safety Act through a “robust set of safeguards” and has been regulated by UK communications regulator Ofcom since 2020. The company has also introduced safety measures for teens, including guided meditation features to limit screen time and the disabling of late-night notifications.

TikTok joins other tech giants facing pressure to protect children online. YouTube recently implemented AI-based age estimation for content restrictions, while Instagram now defaults teen accounts to private to enhance safety.
