Saltz Mongeluzzi Bendesky filed suit Thursday against TikTok and its parent company ByteDance
over the death of a 10-year-old girl who asphyxiated after apparently attempting a “blackout
challenge” that had allegedly circulated on the video-sharing platform.

The lawsuit, filed in Philadelphia federal court, is the first of its kind, according to lawyers from Saltz
Mongeluzzi.

The plaintiff contends that TikTok was responsible for the child’s death because the company is not
a passive purveyor of third-party content, but rather the producer of an algorithm that promotes
dangerous material and creates addiction in its user base.

Saltz Mongeluzzi’s Jeffrey Goodman said the suit is the first to his knowledge to target TikTok, or
indeed any social media platform, based on this type of products liability claim. He said TikTok is
unique among social media platforms for its prominent recommendation algorithm, which tailors
suggested videos to each user.

The complaint notes that the plaintiff does not seek to hold the company liable as a speaker or
publisher of third-party content, thereby seeking to sidestep the protections afforded to such entities
under Section 230(c) of the Communications Decency Act. Instead, the plaintiff “intends to hold the
TikTok Defendants responsible for their own independent conduct as the designers, programmers,
manufacturers, sellers, and/or distributors of their dangerously defective social media products and
for their own independent acts of negligence.”

Goodman said the case is not about the idea that potentially dangerous videos exist or are
accessible, but rather that, according to the complaint, TikTok promotes them directly to users.

According to Goodman, while litigation against social media companies is currently widespread,
claims against them largely center on data privacy matters or issues involving content moderators.
Approaching a case against a social media company as a products liability suit, he said, appears to
be a new tactic that could prompt similar claims in the future.

“Big tech is the Big Tobacco of the modern age,” said Goodman. He said more types of litigation are
to come as more research is conducted examining the possible risks of social media.

The complaint cites reports of psychological harm posed by digital technology and claims that
“social media giants like the TikTok Defendants have seized the opportunity presented by the digital
wild west to manipulate and control the behavior of vulnerable children to maximize attention
dedicated to their social media platforms and thus maximize revenues and profits, all while shirking
any safety responsibilities whatsoever.”

The plaintiff is suing over the 2021 death of her daughter, who died from asphyxiation, allegedly
after attempting a “challenge” that she viewed in a video suggested to her on her “TikTok For You
Page.” The plaintiff claims that her daughter had been attempting to participate in a trend that
encouraged users to choke themselves until they lost consciousness.

“The TikTok Defendants’ algorithm determined that the deadly Blackout Challenge was well-tailored
and likely to be of interest to [a] 10-year-old, and she died as a result,” the complaint claims.

Attorneys with DiCello Levitt are representing the plaintiff alongside Saltz Mongeluzzi. As
of Friday afternoon, no attorneys had appeared for the defendants.

In an emailed statement, a spokesperson for TikTok said, “This disturbing ‘challenge,’ which people
seem to learn about from sources other than TikTok, long predates our platform and has never been
a TikTok trend. We remain vigilant in our commitment to user safety and would immediately remove
related content if found. Our deepest sympathies go out to the family for their tragic loss.”

The case is currently assigned to U.S. District Judge Paul Diamond of the Eastern District of
Pennsylvania.
