TikTok Mental Health: Risks & Dangers | Movieguide

Published: December 16, 2025

New analysis of TikTok reveals that the video app pushes mental health content to users at a higher rate than almost any other topic.

“TikTok’s algorithm favors mental health content over many other topics, including politics, cats and Taylor Swift,” said an analysis by The Washington Post, adding that mental health content is more “catchy” than other videos.

This “sticky quality” means the algorithm more readily surfaces additional mental health-related content in your feed, and that the content is harder to get rid of, even if you stop watching videos on the topic.

Related: TikTok Employees Know Just How Dangerous the App Is

Kailey Stephen-Lane told the outlet that she had to stop using the app because of this; she has OCD, and TikTok bombarded her with videos about the condition, worsening her symptoms.

“The TikToks I’ve been receiving are not helpful for my recovery,” she said. “They lead me down a lot of spirals, and just clicking ‘not interested’ doesn’t seem to work anymore.”

Complicating matters, the videos are not always accurate, giving viewers a distorted picture of conditions like depression and autism.

“The algorithm says, ‘Well, you like this video about ADHD, even though it’s misleading, let’s give you another video,’” said Anthony Yeung, a psychiatrist and researcher at the University of British Columbia. “And it becomes this very vicious feedback loop of misinformation.”

This practice of pushing mental health content can have darker side effects. New research from Amnesty International alleges that the app promotes content to young users that pushes them towards depression, suicidal ideation and self-harm.

“Our technical research shows how quickly teens who express interest in mental health-related content can be drawn into toxic black holes. In just three to four hours of interacting with TikTok’s ‘For You’ feed, teen test accounts were exposed to videos that romanticized suicide or showed young people expressing intentions to take their own lives, including information about suicide methods,” said Lisa Dittmer, Children and Youth Digital Rights Researcher at Amnesty International.

TikTok is currently being sued by 14 attorneys general who allege that the app’s For You page algorithm has falsely advertised that it is not addictive.

In newly declassified documents about TikTok’s own internal investigation, it was revealed that the app was aware that “users only need to watch 260 videos before they can become addicted to the app,” and that “compulsive use is correlated with a number of negative mental health effects such as loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.”

The documents also acknowledged that the algorithm had “better engagement” with young people.

While TikTok’s mental health content might seem helpful as people try to navigate their own diagnoses, it’s clear that the app is pushing the videos to its users, no matter how harmful they may be.

Read Next: Is TikTok’s Algorithm Really as Dangerous as We Think?

