WSJ: Communist China’s TikTok Suggests Videos of Drug Use and Sex to Minors

A teenager presents a smartphone with the logo of Chinese social network Tik Tok, on January 21, 2021 in Nantes, western France. (Photo by LOIC VENANCE / AFP) (Photo by LOIC VENANCE/AFP via Getty Images)

In a recent article, the Wall Street Journal describes how the algorithm of the Chinese-owned social media app TikTok suggests videos of illegal drug use and sexually explicit content to minors. Children and underage teens make up a considerable percentage of TikTok’s American userbase.


The WSJ notes the story of a 13-year-old TikTok user who searched the app for “OnlyFans,” the name of the subscription website that is primarily used to host pornographic content. The underage user then watched multiple videos, including two advertising access to pornography.

Upon returning to TikTok’s “For You” feed, which displays content to users based on their interests and previously watched videos, the 13-year-old user was served a number of videos related to the tag “sex.” These included role-playing videos in which people pretended to be in relationships with caregivers. The WSJ notes that in one video, a man’s voice instructs a woman wearing a latex leotard: “Feel free to cry. You know it’s daddy’s favorite.”

The Journal writes:

As the user scrolled through the videos appearing in the feed, lingering on the more sexually oriented ones while moving more quickly past others, the For You feed was soon almost entirely dominated by TikToks involving sexual power dynamics and violence. The app’s algorithm had pushed the user into a rabbit hole that many users call “Kinktok,” featuring whips, chains and torture devices. Some of the content is banned by the platform.

The Wall Street Journal notes that the 13-year-old user served these videos by TikTok doesn’t actually exist; the account was one of dozens of automated accounts created by the Journal to examine TikTok’s algorithm. The WSJ writes:

TikTok served one account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal’s other minor accounts.

TikTok also showed the Journal’s teenage users more than 100 videos from accounts recommending paid pornography sites and sex shops. Thousands of others were from creators who labeled their content as for adults only.

Read more at the Wall Street Journal here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address

