Amnesty: TikTok continues to expose young people to suicide-related content

TikTok denies the validity of the human rights group's report.

Some 90 percent of Finnish youths are on TikTok, sometimes for hours a day. Image: Matt Cardy / Getty

Amnesty alleges TikTok is still steering vulnerable children and young people towards depressive and suicidal content — an issue previously flagged by Yle in a report which documented how the app's algorithm offers teens self-destructive content.

The human rights group this week said its research showed how the social media app pushes French children and young people who engage with mental health content into a cycle of depression, self-harm and suicide content.

The research, titled 'Dragged into the Rabbit Hole', alleges that TikTok has continued to fail to address systemic design risks affecting children and young people.

While the group's research focused on France, the takeaways also apply to Finland, where TikTok is one of the most popular apps, according to Amnesty spokesperson Mikaela Remes.

Some 90 percent of Finnish youths use the app, sometimes for hours a day.

"There is a lot of discussion now on how TikTok is affecting young people in Finland," Remes told Yle News. "TikTok is failing to comply with the EU's Digital Services Act (DSA), which requires platforms to protect children from algorithmic risks."

The European Commission has launched formal proceedings to determine whether TikTok has violated the Digital Services Act (DSA) in areas concerning the protection of minors, transparency in advertising, researchers' access to data, and the management of risks linked to addictive design and harmful content.

Down the rabbit hole?

Under the DSA, TikTok is required to identify and mitigate systemic risks to children's rights. Amnesty, however, said it found the app recommending content on 'suicide challenges'.

"Within just three to four hours of engaging with TikTok's 'For You' feed, teenage test accounts were exposed to videos that romanticized suicide or showed young people expressing intentions to end their lives, including information on suicide methods," Lisa Dittmer, Amnesty International's researcher on Children and Young People's Digital Rights, said in a press release.

Amnesty International researchers created three teenage TikTok accounts — two female and one male — registered as 13-year-olds based in France, to manually study how the platform's algorithm amplifies content in its 'For You' feed. Within five minutes of scrolling, and before expressing any preferences, the accounts were shown videos depicting sadness and disillusionment.

Amnesty researchers reported finding two videos promoting the so-called 'lip balm challenge' in the feed of a manually operated test account in the summer of 2025. The trend initially appeared as a harmless game in which users guessed the scent of another person's lip balm. Over time, however, it morphed into a darker version that encouraged participants to scrape away part of their lip balm whenever they felt sad — and to self-harm or attempt suicide once the stick was gone.

Responding to Yle News regarding Amnesty's report via Finnish communications firm Manifesto, the company said: "With more than 50 features and settings designed specifically to support the safety and well-being of teens, and 9 in 10 violative videos removed before they're ever viewed, we proactively provide safe and age-appropriate teen experiences."

The statement was also critical of the methodology behind Amnesty's study: "With no regard for how real people use TikTok, this 'experiment' was designed to reach a predetermined result," the Chinese company said, adding that "the authors admit that the vast majority (95%) of content shown to their pre-programmed bots was in fact not related to self-harm at all."