TikTok's recommendation algorithm has been found to rapidly expose young users, especially teenagers, to potentially harmful mental health-related content, including content that normalizes or promotes suicide. Research shows that within 5 hours of use, nearly half the videos on the "For You" feed relate to mental health struggles such as sadness, depression, and suicidal thoughts. This concentration of such content builds even faster when users interact with these videos, in some cases within 20 minutes, creating a "rabbit hole" effect.
Amnesty International and other groups observed that accounts mimicking 13-year-olds were shown increasing numbers of depressive and suicidal videos, with some accounts encountering videos expressing suicidal ideation within 45 minutes. The algorithm's engagement-driven model exacerbates these risks rather than protecting vulnerable users.
These findings raise concerns about TikTok's compliance with regulations designed to safeguard children and have prompted calls for stronger measures to protect users' mental health on the platform.
