The US Court of Appeals for the Third Circuit has revived a lawsuit against TikTok over the death of a 10-year-old Pennsylvania girl, Nylah Anderson, who died after attempting the viral “blackout challenge” that spread on the platform. The lawsuit, filed by Nylah’s mother, Tawainna Anderson, brings negligence and product liability claims against TikTok, alleging that the platform’s algorithm pushed harmful content that led to her daughter’s death.
The Tragic Incident
In December 2021, Nylah Anderson was found unconscious in her bedroom after attempting the “blackout challenge,” a dangerous online trend that encourages users to choke themselves until they pass out. Despite being rushed to the hospital, Nylah died a few days later. Her mother contends that TikTok’s algorithm is to blame, alleging that the platform’s “For You” page repeatedly served Nylah videos of the challenge.
The Legal Battle
Tawainna Anderson initially filed the lawsuit in May 2022 in federal court in Pennsylvania. That October, the district judge dismissed the case, citing Section 230 of the Communications Decency Act, which generally shields online platforms from liability for user-generated content.
The recent ruling by the Third Circuit Court of Appeals, however, has breathed new life into the lawsuit. The appeals court held that Anderson’s claims could potentially proceed, reasoning that TikTok’s algorithmic recommendations may amount to the platform’s own expressive activity rather than mere hosting of third-party content, and therefore may not be entirely shielded by Section 230. The case thus turns on whether TikTok’s algorithm played an active role in promoting harmful content to Nylah.
TikTok’s Defense
TikTok has vehemently denied any wrongdoing, maintaining that the “blackout challenge” predates its platform and that it actively works to remove such content. The company emphasizes its commitment to user safety and highlights its efforts to prevent the spread of harmful challenges.
Implications of the Case
This case could have far-reaching implications for social media platforms and their responsibility for user-generated content. If TikTok is found liable, it could set a precedent for holding platforms accountable for the algorithms they use to recommend content. This could force platforms to implement stricter content moderation policies and potentially reshape the landscape of online content regulation.
The Broader Context
Nylah’s death is not an isolated incident: several other children have reportedly died or suffered severe injuries after attempting the “blackout challenge.” The case highlights the dangers of viral online trends and raises questions about the role of social media platforms in protecting young users.
The lawsuit against TikTok is far from over. The case now returns to the district court, which must weigh the negligence and product liability claims in light of the appeals court’s conclusion that TikTok’s own recommendations are not automatically protected by Section 230. The outcome could significantly shape the future of social media regulation and content moderation.
The Human Cost
Beyond the legal complexities, this case underscores the devastating consequences of harmful online content. Nylah Anderson’s death is a stark reminder of the need for stronger online safety measures, especially for young and vulnerable users.
As technology continues to evolve, so too must our approach to online safety. Social media platforms, lawmakers, and parents all have a role to play in creating a safer online environment for everyone. The lawsuit against TikTok is a crucial step in this ongoing journey.