Research finds TikTok shows new users harmful content quickly

Dec. 15 (UPI) — The popular social media app TikTok feeds harmful content to its mostly teenage audience within the first half hour of use, a new report finds.

The Center for Countering Digital Hate (CCDH) conducted the study by setting up eight new accounts posing as 13-year-old users and recording which videos played during the first 30 minutes of use. Researchers scrolled TikTok’s “For You” page, a feed that tailors itself to each user over time based on how they engage with content.

CCDH found TikTok played videos related to body image and mental health every 39 seconds. New accounts were recommended videos discussing eating disorders and self-harm “within minutes.” Content directly addressing suicide played every 2.6 minutes, and some eating disorder videos have more than 13 billion views.

The report is not peer-reviewed, and eight accounts is a small sample from which to draw scientifically reliable conclusions. The researchers also interacted with every video featuring body image, mental health or eating disorders by pausing on it and liking it, behavior that signals the recommendation algorithm to surface more such content.

Four of the accounts were given generic female usernames and set their locations to the United States, United Kingdom, Australia and Canada. The other four were given usernames containing “loseweight” to test whether TikTok would show them different content. CCDH treated these as “vulnerable” teen accounts, which it contends TikTok deliberately targets with more harmful content.

“Rather than entertainment and safety, our findings reveal a toxic environment for TikTok’s youngest users, intensified for its most vulnerable,” the report states.

The “loseweight” accounts were shown three times as many harmful videos and 12 times as many self-harm videos as the standard teen accounts.

CCDH ends its report by calling on social media platforms like TikTok to take responsibility for the potential harm they may cause.

“Financial penalties and the risk of litigation will help to ensure that companies like TikTok are embedding safety by design and doing all that they can to ensure that their platforms are safe and harmful content is not amplified to young and vulnerable users,” the report said.

A TikTok spokesperson said the researchers’ behavior does not accurately “reflect genuine behavior or viewing experiences of real people,” CNET reports.

TikTok features short videos ranging from a few seconds to a couple of minutes. Its community guidelines prohibit videos that “perpetuate the abuse, harm, endangerment, or exploitation of minors,” including animated videos. Videos that violate the guidelines are removed.
