TikTok Sued After 15-Year-Old’s Suicide Linked to Harmful Content

In France, seven families have filed a lawsuit against TikTok, alleging that the app has failed to moderate inappropriate or harmful content.

Since the tragic suicide of her 15-year-old daughter in 2021, Stephanie Mistre has been vocal about her concerns with TikTok. As reported by the New York Post, she alleges that the popular video platform recommended videos to her daughter that appeared to promote suicide.

She described it as psychological manipulation: the videos normalized, even glamorized, deep sadness and self-harm, creating a warped sense of camaraderie among vulnerable viewers.

Mistre, together with six other families, is now taking legal action against TikTok France. The families argue that the platform is falling short in its responsibility to moderate harmful content and protect children from dangerous material.

Seven families file lawsuit against TikTok

According to Stephanie Mistre, TikTok uses sad or depressing content to draw children onto the platform and keep them coming back.

According to the report, TikTok says its platform prohibits videos that encourage suicide and that it employs around 40,000 staff dedicated to combating content of this kind.

Lawyer Laure Boutron-Marmion, who represents the seven families involved in the legal action, stated that their case is built on a substantial amount of proof. According to her, the company can no longer evade accountability by claiming they aren’t responsible because they don’t produce the content.

The attorney also pointed to Douyin, ByteDance's sister app for the Chinese market, which features a "Young User Mode" that is mandatory for all users under the age of 14.

“It proves they can moderate content when they choose to,” she said.

TikTok has faced lawsuits over harmful content before. In October, a group of 13 U.S. states sued the company, arguing that features such as beauty filters and the automatic "For You" feed are detrimental to the wellbeing of children using the platform.

The lawsuit alleges that the app was designed to exploit children's propensity for prolonged use by automatically serving them fresh videos for as long as they stay in the application.

2025-01-24 02:33