Families of four British children have filed a lawsuit against TikTok and its parent company, ByteDance, alleging that the platform’s algorithm promoted the deadly “blackout challenge,” leading to their children’s deaths in 2022. The suit, filed on February 6, 2025 by the U.S.-based Social Media Victims Law Center (SMVLC), is brought on behalf of the parents of Isaac Kenevan (13), Archie Battersbee (12), Julian “Jools” Sweeney (14), and Maia Walsh (13).
The “blackout challenge” encourages participants to choke themselves until they lose consciousness, a practice linked to multiple fatalities among young users. The lawsuit claims that TikTok’s algorithm deliberately pushed this hazardous content to the children to boost engagement and, in turn, advertising revenue.
Matthew Bergman, founding attorney of the SMVLC, stated, “TikTok’s algorithm purposely targeted these children with dangerous content to increase their engagement time on the platform and drive revenue. It was a clear and deliberate business decision by TikTok that cost these four children their lives.”
In response, TikTok has maintained that it prohibits dangerous challenges and removes such content promptly. The platform asserts that searches related to the “blackout challenge” have been blocked since 2020, directing users instead to its Safety Center.
This lawsuit comes amid broader scrutiny of social media platforms’ responsibility for safeguarding users, especially minors, from harmful content. In a related case, a U.S. appeals court recently revived a lawsuit against TikTok over the death of a 10-year-old girl who allegedly attempted the same challenge. The court ruled that TikTok could be held liable for its algorithm’s role in promoting harmful content, challenging the immunity typically afforded under Section 230 of the Communications Decency Act.
As legal proceedings continue, this case underscores the urgent need for social media companies to implement robust safety measures and for regulators to establish clear guidelines to protect vulnerable users from dangerous online trends.