Lawsuit alleges harmful content on TikTok contributed to deaths of two teens

Families claim TikTok's algorithm promoted harmful content

TechSpot


What just happened? The impact that social media platforms have on young users' mental health is once again under scrutiny after the families of seven French teenage girls filed a lawsuit against TikTok. They allege that the platform exposed their children to harmful content that led two of them to take their own lives at age 15.

Filed in the Créteil judicial court, the lawsuit claims that TikTok's algorithm suggested videos to the teens that promoted suicide, self-harm, and eating disorders.

"The parents want TikTok's legal liability to be recognised in court," lawyer Laure Boutron-Marmion told broadcaster franceinfo. "This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product's shortcomings."

In September 2023, the family of 15-year-old Marie filed criminal charges against TikTok after her death, accusing the platform of "inciting suicide," "failure to assist a person in danger," and "promoting and advertising methods for self-harm," writes Politico. TikTok's algorithm allegedly trapped Marie in a bubble of toxic content linked to bullying she experienced because of her weight.

TikTok is facing numerous lawsuits in the US over claims that it is harmful to young people's mental health. In 2022, the families of several children who died while trying to participate in a dangerous TikTok challenge sued the company and its parent, ByteDance, after the app allegedly recommended videos of the 'blackout' strangulation challenge to the minors, all of whom were ten years old or under.

Last month, a group of 14 state attorneys general filed lawsuits against TikTok, accusing it of harming children's mental health and violating consumer protection laws. It's alleged that TikTok uses manipulative features to keep young users on the platform for longer. These include endless scrolling, autoplay videos, and frequent push notifications.

It's not just TikTok that remains under the spotlight over the alleged harms it can cause young people; all social media platforms face the same scrutiny. In October last year, the attorneys general of over 40 US states sued Meta, the parent company of Facebook and Instagram, for harming children's mental health.


In a Senate online child safety hearing in January, Meta CEO Mark Zuckerberg apologized to parents in the audience who said Instagram contributed to their children's suicides or exploitation.

The impact of social media on the mental health of not just children but also adults led the US Surgeon General to call on Congress to require cigarette-style warning labels on these sites and apps, alerting users to the potential harms they cause.

Social media companies usually hide behind Section 230 of the 1996 Communications Decency Act, which shields them from liability for user-posted content.

TikTok still faces a potential ban in the US. Due to national security concerns over its Chinese ownership, President Joe Biden signed legislation in April requiring ByteDance to divest its US operations by January 19, 2025, or face a nationwide ban.

Masthead: Olivier Bergeron