UPDATED 21:05 EDT / AUGUST 28 2024

POLICY

TikTok told it must face lawsuit over deadly viral challenge, despite Section 230 protections

The Chinese-owned social media giant TikTok could be found liable for content posted on its platform that led to the death of a girl, despite federal protections that typically shield online platforms in such cases.

A U.S. appeals court today revived a lawsuit filed by the mother of a 10-year-old girl who died in Pennsylvania after taking part in a viral challenge that dared people to choke themselves to the point of unconsciousness.

In most cases, when someone comes to harm from something they’ve seen online, the platform that hosted it is shielded from liability under Section 230, part of the Communications Decency Act. The law protects tech firms from being sued over content uploaded to their platforms, and it provides immunity from lawsuits over their content moderation decisions. For years there have been calls to reform the law.

Circuit Judge Paul Matey, one of the three judges on the panel, wrote that the law does provide “immunity from suit for hosting videos created and uploaded by third parties,” but he added that TikTok could still be held accountable for what he called its “knowing distribution and targeted recommendation of videos it knew could be harmful.”

Lawyers representing the girl’s mother, Tawainna Anderson, said the “blackout challenge,” which appears to have surfaced sometime in 2021, showed up on her daughter’s “For You” feed. TikTok’s algorithm, they argued, had effectively curated for the girl content that ended with her “unintentionally” hanging herself. Judge Patty Shwartz wrote in the opinion that there is a difference between being a “repository of third-party content” and “an affirmative promoter” of content.

“Nylah [daughter], still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her,” Matey wrote in a partial concurrence to the opinion. “But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her ‘For You Page.’”

In her original statement, Anderson wrote that TikTok “unquestionably knew that the deadly Blackout Challenge was spreading through its app and that its algorithm was specifically feeding the Blackout Challenge to children, including those who had died.” As many as 20 children are believed to have died attempting the challenge.

Jeffrey Goodman, one of the family’s lawyers, said it’s time Section 230 came under more scrutiny. And though the case is sure to elicit widespread sympathy for the family, the American Civil Liberties Union has warned that stripping tech firms of their liability protections would lead to censorship, with activists’ voices among those quieted as companies try to avoid lawsuits.

The case will now proceed, with the family’s lawyers saying the decision shows that Section 230 does not go so far as to protect firms from what they called “egregious and predatory conduct.” In a statement on the ruling, Anderson said that nothing will bring her child back, but holding TikTok accountable might help other families “avoid future, unimaginable suffering.”

Photo: Alexander Shatov/Unsplash
