Seven French families, united under the collective Algos Victima and represented by lawyer Laure Boutron-Marmion, filed a civil lawsuit against TikTok in the Créteil judicial court on Monday. The families allege that the Chinese-owned social media platform exposes children to harmful videos promoting suicide, self-harm, and eating disorders.
The Algos Victima collective wants the courts to recognize TikTok’s responsibility for allowing harmful content to circulate on its platform, arguing that the social network has contributed to the deterioration of these children’s mental and physical health.
Among the seven teenage girls involved, two took their own lives at the age of 15. One of them, Marie, shared a TikTok video in the weeks before her death in September 2021 in Cassis, Bouches-du-Rhône, expressing distress over harassment about her weight; her parents have since filed a criminal complaint. Four of the seven girls attempted suicide, and one struggled with anorexia. The families seek to compel the social network to improve its content moderation so that minors are not inundated with videos that could promote suicide, especially at vulnerable moments.
In 2023, the French Senate launched a commission of inquiry into TikTok’s operations, scrutinizing its data usage, influence strategies, and potential interference from China. Senate reports also raised concerns about TikTok’s impact on public health, particularly the psychological effects on young users. Although TikTok officially prohibits children under 13 from using the app, around 45% of 11- to 12-year-olds reportedly have accounts. The Senate emphasized the need for an age verification system run by an independent third party, aligning with a proposed French bill aimed at establishing a legal age for social media use and combating online hate. According to Senate findings, minors in France currently spend an average of 1 hour and 47 minutes per day on the app.
The plaintiffs cite TikTok’s own platform guidelines, which explicitly ban content promoting suicide, self-harm, or disordered eating, and which restrict content depicting potentially risky behavior to users aged 18 and over. The families, however, all report a similar experience. The mother of one plaintiff, Charlize, says her daughter, who was struggling with harassment, sought refuge on TikTok. The girl grew addicted to the platform, searching for content that mirrored her distress, which drew her into a downward spiral. “The algorithm picked up on her search style and suggested other content that went from bad to worse, about depression and cutting,” she said.

The case also draws on a British precedent in which a coroner determined that Molly Russell, a 14-year-old girl who took her own life after spending time on Pinterest and Instagram, had been systematically exposed to graphic content depicting self-harm and suicide. Amnesty International has likewise documented what it describes as the “rabbit-hole effect” of TikTok’s algorithm: research indicates that users who engage with mental health-related videos are rapidly steered toward content about depressive thoughts, self-harm, and suicide, a pattern of content steering that appears designed to maintain user attention.