TikTok Accused of Pushing Suicidal Videos to Teen, Leading to Tragic Death

A family claims that TikTok used their teenage son’s location data to serve him railroad-related suicide videos, which they believe played a role in his tragic death.

For quite some time now, I’ve been following the developments surrounding TikTok, and it’s become increasingly concerning to see how this platform has found itself embroiled in legal battles. Families have filed numerous lawsuits, claiming that certain videos on the app indirectly contributed to the tragic loss of their loved ones.

At the beginning of February, the parents of four teenagers filed a lawsuit against TikTok over the popular ‘Blackout Challenge’, a trend that predates the app by several years.

According to recent legal documents, TikTok allegedly used the app’s location tracking to deliver railroad-focused suicide videos to 16-year-old Chase Nasca before he took his own life by stepping in front of a moving train.

Parents say teen was fed videos of suicide

Chase Nasca died in 2022, and his parents have been vocal critics of TikTok ever since, claiming that the platform’s addictive algorithm flooded his feed with videos about suicide and contributed to his death.

The lawsuit claims that TikTok accessed location data from Chase’s phone, placing him near a railroad in Long Island, New York, and used it to push “railroad-themed suicide” videos onto his For You Page both before and after his death. The family has said they live close to that railroad, raising concerns that the content was targeted.

According to the February 5 filing, Chase searched the platform for uplifting and motivational content, but the app instead delivered thousands of suicide-related videos to his personal feed.

The court records state that TikTok had a responsibility to protect Chase Nasca from harm that could reasonably occur when its platform is used as intended.

Last January, seven French families took legal action against TikTok, claiming the company wasn’t doing enough to filter harmful content from its platform.

One mother said these platforms hook children with sad or depressing content to keep them engaged, treating them as repeat customers.

In October, over a dozen state Attorneys General filed a lawsuit against TikTok, arguing that features like beauty filters and autoplay pose risks to children who use the platform.
