By: John Baek
One of the laws protecting free speech on the Internet is Section 230 of the Communications Decency Act. It grants “provider[s]” of “interactive computer service[s]” protection from civil liability for content provided by third parties: a protection, if not an immunity, that applies to social media platforms such as Facebook, Instagram, YouTube, and TikTok. While the alternative of increased safeguards and content monitoring remains impractical for these tech giants, individual users are battling the Section 230 shield with lawsuits claiming injury from the algorithmic presentation of third-party content.
By October 2022, more than eighty similar suits against TikTok, Snapchat, and Meta had arisen from the Northern District of California alone. On October 25, the District Court for the Eastern District of Pennsylvania dismissed claims against TikTok alleging design defect, failure to warn, wrongful death, and survival. In that case, a ten-year-old girl died of strangulation by a purse strap while attempting the “Blackout Challenge,” which encourages users to choke themselves with household items and share the results on TikTok. Because TikTok did not create the “Blackout Challenge” and merely delivered the content to the victim via its algorithms, it fell within the Section 230 aegis and escaped liability.
Plaintiffs have attempted, largely unsuccessfully, to bypass the Section 230 shield by treating social media apps as products and pursuing product liability claims. The lone success is a Ninth Circuit ruling that favored plaintiffs in a suit against Snapchat’s maker for negligent design of a speed filter, effectively treating the app as a product. However, product liability law has long maintained a distinction between the “tangible world” and “ideas and expression,” drawn at least as early as the Ninth Circuit’s 1991 decision in Winter v. G.P. Putnam’s Sons, and that tenuous line separates the physical product from the social media app. The distinction significantly predates social media apps. Historically, product liability suits have reached the class-action level; but in 2022 the courts still seem to prefer treating apps as conduits for ideas and expression rather than as products.
One fact, however, remains unaltered: users of these platforms receive recommendations through algorithms, and some of those recommendations “challenge” users to attempt dangerous activities. The “Blackout Challenge” predates TikTok itself, circulating as early as 2008, but youth deaths and lawsuits against the corporation prompted reexamination of Section 230 in 2021. Of the social media apps in use today, TikTok is especially conducive to widespread sharing, with an attractive user interface and a reduced barrier to interaction between content creators and other users. As long as courts continue to deny plaintiffs the product-liability exception, Section 230 permits TikTok’s immense algorithmic power to create more victims as viral content on TikTok and other apps leads to more tragic accidents. Continued lawsuits may yet persuade the courts to find the chink in this decades-old armor by classifying social media apps as products. If TikTok were then to change how it adjudicates content, users everywhere, from individuals to businesses, would face change after years of the freedom allowed under Section 230.
 See Electronic Frontier Foundation, supra note 1; Communications Decency Act, 47 U.S.C. § 230(c)(1).
 See Ct. Order Grant. Def. Mot. Dis. at 4, Anderson v. TikTok, Inc., 2022 U.S. Dist. LEXIS 193841 (Oct. 25, 2022) (Doc. No. 12). See generally Isaiah Poritz, TikTok Beats Suit Saying “Blackout Challenge” Caused Child Death, Bloomberg Law (Oct. 26, 2022, 10:16 AM), https://news.bloomberglaw.com/tech-and-telecom-law/tiktok-beats-suit-saying-blackout-challenge-caused-child-death.
 See Poritz, supra note 3.
 See Poritz, supra note 3; see also Ct. Order Grant. Def. Mot. Dis., Anderson v. TikTok, Inc., 2022 U.S. Dist. LEXIS 193841 (Oct. 25, 2022) (Doc. No. 12).
 See Poritz, supra note 3.
 See Ct. Order Grant. Def. Mot. Dis. at 2, Anderson v. TikTok, Inc., 2022 U.S. Dist. LEXIS 193841 (Oct. 25, 2022) (Doc. No. 12) (describing how TikTok’s algorithm used the victim’s age, location, and prior use of TikTok).
 See Debra Cassens Weiss, Lawsuits against social media companies use product-liability theory to sidestep immunity law, ABA Journal (July 18, 2022, 9:41 AM), https://www.abajournal.com/news/article/lawsuits-against-social-media-companies-use-product-liability-theory-to-sidestep-immunity-law; see also Def. Mot. Dis., Case No. 3:22-CV-401-JD, June 24, 2022.
 See Debra Cassens Weiss, Maker of Snapchat can be sued for speed filter used by youths before fatal crash, 9th Circuit rules, ABA Journal (May 6, 2021, 10:46 AM), https://www.abajournal.com/news/article/snapchat-can-be-sued-for-speed-filter-used-by-youths-before-fatal-accident-9th-circuit-rules.
 See Electronic Frontier Foundation, supra note 1; see Winter v. G.P. Putnam’s Sons, 938 F.2d 1033, 1034 (9th Cir. 1991) (indicating that product liability law is not meant to apply, for example, to the words of Shakespeare themselves, but only to the materially printed book; cited by Meta in Def. Mot. Dis., Case No. 3:22-CV-401-JD, June 24, 2022).
 See Deborah D’Souza, TikTok: What It Is, How It Works, and Why It’s Popular, Investopedia (July 5, 2022), https://www.investopedia.com/what-is-tiktok-4588933 (launching in 2016); History.com Editors, Facebook Launches, History (Feb. 2, 2021), https://www.history.com/this-day-in-history/facebook-launches-mark-zuckerberg (launching in 2004); Dan Blystone, Instagram: What It Is, Its History, and How the Popular App Works, Investopedia (Oct. 22, 2022), https://www.investopedia.com/articles/investing/102615/story-instagram-rise-1-photo0sharing-app.asp (launching in 2010).
 See, e.g., The Investopedia Team, The 5 Largest U.S. Product Liability Cases, Investopedia (Apr. 26, 2021), https://www.investopedia.com/the-5-largest-u-s-product-liability-cases-4773418 (noting a suit of over $28 billion against Philip Morris and a class-action suit for 35 million customers of General Motors).
 See Brent Barnhart, Everything you need to know about social media algorithms, Sprout Social (Mar. 26, 2021), https://sproutsocial.com/insights/social-media-algorithms/; Sarah Felbin, Sabrina Talbert, and Addison Aloian, The “Blackout Challenge” Has Resurfaced On TikTok, And It’s Still Just As Dangerous As It Was 16 Years Ago, Women’s Health (Oct. 27, 2022), https://www.womenshealthmag.com/health/a38603617/blackout-challenge-tiktok-2021/; Ben Cost, Asia Grace, Marisa Dellatto, and Eric Hegedus, The 24 craziest TikTok challenges so far – and the ordeals they’ve caused, New York Post (Sept. 23, 2022), https://nypost.com/article/craziest-tiktok-challenges-so-far/.
 See Felbin, Talbert, and Aloian, supra note 13.
 Drew Harwell and Taylor Lorenz, Sorry You Went Viral, The Washington Post (Oct. 21, 2022), https://www.washingtonpost.com/technology/interactive/2022/tiktok-viral-fame-harassment/ (describing a user who gained 170,000 followers within a day of sharing his first post).
 See Weiss, supra note 8.
 See generally Def. Mot. Dis., Case No. 3:22-CV-401-JD, June 24, 2022.
 See, e.g., Rieva Lesonsky, The Emerging Impact of TikTok on Small Businesses, Score (June 14, 2022), https://www.score.org/blog/emerging-impact-tiktok-small-businesses (using TikTok as a marketing tool).