Taiwanna Anderson’s life changed forever in December 2021, when she found her 10-year-old daughter Nylah unconscious, hanging from a purse strap in a bedroom closet.
Barely an adolescent, Nylah wasn’t suicidal. She had merely come across the “Blackout Challenge” in a feed of videos curated for her by TikTok’s algorithm. The challenge circulating on the video-sharing app encouraged users to choke themselves with household items until they blacked out. When they regained consciousness, they were supposed to upload their video results for others to replicate. After several days in a hospital’s intensive care unit, Nylah succumbed to her strangulation injuries. Anderson sued TikTok over product liability and negligence that she alleges led to Nylah’s death.
For years, when claimants tried to sue various internet platforms for harms experienced online, the platforms benefited from what amounted to a get-out-of-jail-free card: Section 230 of the Communications Decency Act, a 1996 statute that offers apps and websites broad immunity from liability for content posted to their sites by third-party users. In 2022, a federal district judge accepted TikTok’s Section 230 defense to dismiss a lawsuit filed by Anderson based on the assessment that TikTok didn’t create the blackout challenge video Nylah saw—a third-party user of TikTok did.
But on Tuesday, the federal Third Circuit Court of Appeals released an opinion reviving the mother’s lawsuit, allowing her case against TikTok to proceed to trial. TikTok may not have filmed the video that encouraged Nylah to hang herself, but the platform “makes choices about the content recommended and promoted to specific users,” Judge Patty Shwartz wrote in the appellate court’s opinion, “and by doing so, is engaged in its own first-party speech.”
“TikTok reads 230 of the Communications Decency Act to permit casual indifference to the death of a ten-year-old girl,” wrote Judge Paul Matey in a partially concurring opinion that sought to go even further than the other two judges on the panel.
Legal experts on tech liability say the panel’s overall decision could have immense ramifications for all kinds of online platforms that rely on algorithms similar to TikTok’s.
“My best guess is that every platform that uses a recommendation algorithm that could plausibly count as expressive activity or expressive speech woke up in their general counsel’s office and said, ‘Holy Moly,’” says Leah Plunkett, a faculty member at Harvard Law School and author of Sharenthood, a book about protecting kids online. “If folks did not wake up [Wednesday] thinking that, they should be.”
Advocates of Section 230 have long held that the broad liability shield is necessary for the internet to exist and evolve as a societal tool; if websites were responsible for monitoring the heaps of content that hundreds of millions of independent users create, they contend, lawsuits would devastate platforms’ coffers and overwhelm the judicial system.
“If you have fewer instances in which 230 applies, then platforms will be exposed to more liability, and that ultimately harms the Internet user,” says Sophia Cope, senior attorney with the Electronic Frontier Foundation, a free speech and innovation non-profit. A narrower interpretation of Section 230 immunity would make platforms “not want to host third party content, or severely limit what users can post,” Cope says, adding that the shift would amount to platforms engaging in “preemptive censorship” to protect their bottom lines.
But critics of Section 230’s current scope say the statute has been interpreted far too leniently and that companies should at least sometimes be responsible for dangerous content their online platforms disseminate. In its monumental ruling this week, the appeals court said that when platforms curate harmful content, they—not their third-party users—may be engaging in a form of “expressive activity” for which they can be sued.
“TikTok made the conscious decision to not rein in the challenge, but instead to serve it up to more and more kids, many of whom were likely under the influence of TikTok’s addictive algorithms,” says Carrie Goldberg, a lawyer who has been involved in several product liability suits against tech companies.
Tuesday’s decision is a departure from previous federal court rulings about the liability (or lack thereof) of online platforms. That’s because the appeals court had new case law to consider. In July, the Supreme Court issued a ruling favorable to tech platforms in a case brought by the trade group NetChoice, buttressing the ability of platforms to engage in expressive activity such as curating content or de-platforming politicians if they so choose. To interfere with that ability would implicate the First Amendment rights of platforms, the Supreme Court said in its narrow ruling in that case, Moody v. NetChoice.
Some experts believe the NetChoice decision has little to do with Section 230, which was originally passed by Congress with the intent of protecting fledgling tech platforms from being sued for moderating some content, but not doing a good enough job of it. “There is NO CONFLICT between moderation and ranking being (1) the platform’s speech and also (2) immunized by 230,” Daphne Keller of Stanford’s Cyber Policy Center wrote on Twitter. “The whole point of 230 was to encourage and immunize moderation.”
But for proponents of re-litigating 230, the fact that NetChoice’s earlier Supreme Court argument was used by the Third Circuit to reconsider how expansive Section 230 protections should be is “deliciously ironic,” says Goldberg, because “NetChoice is the most pro-230 lobbyist organization out there.”
Tuesday’s appellate court decision does not guarantee that TikTok will be held liable for showing Nylah the video that culminated in her death; it does mean, however, that similar cases in the Third Circuit’s jurisdiction shouldn’t be thrown out on Section 230 grounds before trial courts can consider their facts.
The ruling “should send a message” to platforms that use content curation algorithms “that the gravy train is over,” Goldberg tells Mother Jones. “These tech behemoths have been minting money for too long, comfortable with the notion that dead kids are the price of outrageous growth.”