TikTok’s Algorithm Under Fire for Mental Health Risks
BUCHAREST — May 8, 2024 —
Amnesty International is raising concerns about TikTok's algorithm and its potential impact on the mental well-being of younger users. The human rights organization's research suggests that TikTok's algorithm can funnel vulnerable individuals toward content that normalizes, or even encourages, self-harm and suicidal ideation. This has prompted criticism and calls for greater accountability from the platform. Read on for details.
TikTok, the popular short-form video platform, faces renewed scrutiny over its handling of content related to mental health, particularly concerning its younger users. Amnesty International has criticized TikTok for failing to adequately address the risks its algorithm poses to the mental well-being of adolescents. The organization's concerns stem from research indicating that children are susceptible to being funneled toward content that can lead to depression and even suicidal ideation.
The Experiment: A Deep Dive into TikTok's "For You" Page
Amnesty International conducted an experiment using accounts designed to mimic 13-year-old children. The results were alarming:
- Within 20 minutes of creating a new account and expressing interest in mental health topics, over half of the videos on the "For You" page were related to mental health issues.
- Within an hour, the recommended clips began to romanticize, normalize, or even encourage suicide.
This experiment, part of a larger research project published in November 2023, underscores the potential dangers of TikTok's algorithm.
Did you know? TikTok's algorithm is designed to learn your interests based on your interactions with the app, including the videos you watch, like, and share. This data is then used to curate your "For You" page.
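The feedback loop described above can be illustrated with a deliberately simplified sketch. This is not TikTok's actual system (which is proprietary and far more complex); it is a hypothetical toy model in which every watch or like raises a topic's score, and the feed is then filled from the highest-scoring topics:

```python
from collections import Counter

# Hypothetical toy model of engagement-driven recommendation.
# Assumption: a watch adds 1 to a topic's score, a like adds 2;
# real platforms use far richer signals, but the loop is the same shape.

def update_interests(interests: Counter, topic: str, liked: bool) -> None:
    """Record one interaction: watching counts once, liking counts extra."""
    interests[topic] += 2 if liked else 1

def recommend(interests: Counter, n: int = 3) -> list[str]:
    """Fill the feed from the n topics the user has engaged with most."""
    return [topic for topic, _ in interests.most_common(n)]

# Simulated session: two liked mental-health videos outweigh everything else.
interests: Counter = Counter()
for topic, liked in [("comedy", False), ("mental_health", True),
                     ("mental_health", True), ("pets", False)]:
    update_interests(interests, topic, liked)

print(recommend(interests))
```

In this sketch, two liked videos are enough to make one topic dominate the recommendations, mirroring how quickly the researchers saw the "For You" page fill with mental health content.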
TikTok's Response: Familiar Measures, Unaddressed Concerns
Ahead of the 2025 Mental Health Awareness Week, Amnesty International questioned TikTok about the changes implemented since the initial research findings. TikTok responded with a list of measures already in place during the 2023 study. These included:
- Proactive enforcement of its rules.
- Maintaining content standards.
- Applying "dispersal techniques" to the "For You" page.
- Offering filtering options.
- Providing a refresh function to reset the feed.
Amnesty International noted that TikTok failed to acknowledge the core issue: the ease with which users are drawn into psychologically harmful content, despite these existing measures. No new, specific solutions were presented.
TikTok Acknowledges Potential Harm
TikTok has admitted that certain types of concentrated content, even when it does not violate the platform's rules, can unintentionally amplify negative personal experiences for some viewers.
The platform acknowledges the potential impact on mental health, especially for younger users, related to content about:
- Harsh dietary restrictions.
- Body image issues.
Pro tip: If you're concerned about the content you're seeing on TikTok, you can use the "Not interested" button to tell the algorithm to show you less of that type of content. You can also report videos that violate TikTok's Community Guidelines.
The Algorithm Under the Microscope
Amnesty International's research highlights TikTok's business model, which tracks user activity to predict interests and emotional states. The "For You" page algorithm can amplify negative emotions by feeding users increasing amounts of content related to depression and suicide, regardless of the potential harm. The organization stated that the "For You" feed quickly picks up on a user's psychological state once they engage with large amounts of depression-related and even suicide-related content, and then uses that signal to recommend more of it.
TikTok disputes this characterization, stating that Amnesty International's suggestion that TikTok is somehow aware of a user's emotional state, and uses that awareness to recommend content, is an inaccurate description of how the platform works.
The company maintains that it has developed a screen time management dashboard to provide users with a clearer understanding of their platform usage.