By: Pablo Urioste
Introduction:
The June 2025 Supreme Court decision in Free Speech Coalition v. Paxton[1] marks a potential pivot point for state laws regulating online content and requiring age identification (“ID”) for online engagement.[2]
Across the country, a trade group of social media companies called NetChoice is battling “ID-For-Speech” laws that require social media companies to verify the age of all users and obtain parental consent before allowing minors to open accounts.[3] ID-For-Speech laws are intended to deter minors from engaging with harmful content, such as depictions of substance abuse, illegal acts, and violence.[4] NetChoice has successfully challenged these laws in multiple states after the lower courts applied strict scrutiny and struck down the laws.[5]
Then, unexpectedly, the Supreme Court sided with Texas in Free Speech Coalition v. Paxton.[6] There, the conservative justices applied intermediate scrutiny instead and upheld a Texas law requiring websites that host more than one-third adult content to verify users’ ages before granting access.[7] Thus, Free Speech Coalition may provide a new framework that threatens NetChoice’s string of wins in the lower courts.
Shifting From Strict Scrutiny to Intermediate Scrutiny
In NetChoice v. Carr,[8] a Georgia district court determined that the Georgia ID-For-Speech law was content-based and accordingly applied strict scrutiny because the law limited access to expressive conduct.[9] Applying strict scrutiny, the court found the law unconstitutional.[10] The court reasoned that the law violated minors’ First Amendment rights to access social media because it was a blanket ban and the legislature did not pursue less restrictive means of shielding minors from harmful content.[11] The court also found that the law impermissibly chilled adults’ rights to access social media content by requiring identification and forcing adults to forgo anonymity.[12]
In contrast, the Supreme Court took a different approach in Free Speech Coalition. There, the Court held that while the Texas law was content-based, the appropriate framework was intermediate scrutiny because obscenity is an unprotected category of speech and the burden on adults’ access to expressive content was only incidental.[13] The Court explained that age verification is a routine part of life, analogous to alcohol or firearm purchases and driver’s licenses.[14] Applying intermediate scrutiny, the Court concluded that the law served an important government interest in shielding minors from harmful content and upheld the statute.[15]
Does the Free Speech Coalition Holding Extend to Social Media?
The immediate question for industry and legislators is whether intermediate scrutiny as applied in Free Speech Coalition also extends to social media ID-For-Speech laws. The key distinction is that the Texas statute challenged in Free Speech Coalition targeted adult content, a specific type of unprotected speech that has long been kept from minors.[16] In contrast, social media platforms host a range of protected speech, like politics, educational content, and social interactions.[17] And the Supreme Court has recognized as protected speech some of the specific types of harmful content that states seek to keep from minors.[18]
Nonetheless, the arguments Texas advanced in Free Speech Coalition regarding harms to minors’ health mirror the arguments states make about the harms of social media.[19] For states, Free Speech Coalition may demonstrate a growing willingness by the Court to uphold ID-verification statutes to protect minors.[20] However, for intermediate scrutiny to apply, states will have to demonstrate that they are advancing an important government interest in keeping children away from harmful social media content.[21] Additionally, they will have to show that they are exercising a traditional state power in curbing harmful content.[22]
In contrast, social media companies will need to show why ID-For-Speech laws regulate protected speech and why strict scrutiny applies.[23] First, unlike restricting minors’ access to alcohol or adult content, moderating content on social media serves no important government interest and has no long history or tradition.[24] Second, unlike adult content, the types of harmful content the states seek to regulate are not the kinds of speech the states have a historic power to suppress.[25] Rather, strict scrutiny should protect minors’ access to content related to religion, politics, and personal expression.
Chilling Adults’ Rights to Access Platforms
The Free Speech Coalition decision marks a significant departure in the jurisprudence regarding the chilling effects of age-verification requirements. In Carr, the district court noted that the Georgia law impermissibly chilled adults’ rights to access social media by requiring adults, too, to verify their age.[26] But in Free Speech Coalition, the Court held that merely providing identification is not a sufficient basis to find a chilling effect on adults’ First Amendment rights.[27] Indeed, the Court called age verification a “modest burden.”[28] And while the Carr court decried the loss of anonymity that ID-For-Speech laws impose, the Free Speech Coalition Court passed over that argument despite the dissent’s objection.[29] Instead, the Court ruled that age-verification laws only incidentally burden adults’ freedom of speech and that the goal of protecting minors from harmful content is more compelling.[30] Thus, given the Court’s different treatment of age-verification requirements in Free Speech Coalition, lower courts may have to re-evaluate the balance of burdens and benefits to determine whether ID-For-Speech laws pass constitutional muster.
The Impact on the Social Media Industry
The Free Speech Coalition decision has already altered the compliance landscape.[31] If intermediate scrutiny is the new standard for analyzing ID-verification laws, states will have a clear roadmap for defending additional ID-For-Speech statutes. The Texas law at issue in Free Speech Coalition may also provide a blueprint for new, more narrowly tailored ID-For-Speech laws that courts will uphold. Social media companies should prepare for a shifting compliance landscape post-Free Speech Coalition and plan accordingly.[32] Multi-jurisdictional compliance strategies should begin to reflect the possibility of a checkered legal landscape.[33] Businesses will likely need to account for the possibility of new data costs and lost business due to laws conditioning access on providing identification.[34] Similarly, information technology and legal departments should prepare to face new cybersecurity and liability threats as companies begin collecting and safeguarding sensitive ID data.[35]
[1] 606 U.S. 461 (2025).
[2] Id. at 466 (holding age verification is within a State’s authority to prevent children from accessing explicit content).
[3] Krista Chavez, Court Halts Georgia’s ID-for-Speech Law for NetChoice, NetChoice (June 26, 2025), https://netchoice.org/court-halts-georgias-id-for-speech-law-for-netchoice/ [https://perma.cc/2FMM-3FHD].
[4] See NetChoice, LLC v. Fitch, 787 F. Supp. 3d 262, 269 (S.D. Miss. 2025).
[5] See id.; NetChoice v. Carr, 789 F. Supp. 3d 1200, 1209 (N.D. Ga. 2025).
[6] See 606 U.S. 461 (2025); Kate Ruane & Aliya Bhatia, FSC v. Paxton Made Bad Law, But It’s Not Carte Blanche for Age Verification, Ctr. for Democracy & Tech. (Aug. 22, 2025), https://cdt.org/insights/fsc-v-paxton-made-bad-law-but-its-not-carte-blanche-for-age-verification/ [https://perma.cc/ZY6N-PXQU].
[7] Free Speech Coal., Inc. v. Paxton, 606 U.S. 461, 478 (2025).
[8] 789 F. Supp. 3d 1200 (N.D. Ga. 2025).
[9] Carr, 789 F. Supp. 3d at 1222-23.
[10] Id.
[11] Id. at 1223, 1230.
[12] Id. at 1225-26.
[13] Free Speech Coal., Inc. v. Paxton, 606 U.S. 461, 462 (2025).
[14] Id. at 462.
[15] Id.
[16] See id. at 492.
[17] See NetChoice v. Carr, 789 F. Supp. 3d 1200, 1211 (N.D. Ga. 2025).
[18] See Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 795-96 (2011) (extending First Amendment protection to violent media that reaches minors).
[19] See Carr, 789 F. Supp. 3d at 1211; Brief for Respondent at 6, Free Speech Coal., Inc. v. Paxton, 606 U.S. 461 (2025) (No. 23-1122) (arguing obscene materials foster violence and mental health issues in children).
[20] See Free Speech Coal., Inc. v. Paxton, 606 U.S. 461, 462 (2025) (expressing skepticism about applying strict scrutiny to all age verification laws).
[21] See id. at 495-96.
[22] Id. at 478.
[23] See Complaint at 5, NetChoice v. Carr, 789 F. Supp. 3d 1200 (N.D. Ga. 2025) (No. 1:25-cv-02422).
[24] See Free Speech Coal., 606 U.S. at 478 (holding intermediate scrutiny applies when a law falls within an exercise of a state’s traditional power).
[25] See id. at 471-72 (providing content is proscribable when the material appeals to prurient interests, depicts explicit conduct, and lacks serious literary, scientific, artistic, or political value); see also Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 793 (2011) (holding legislatures may not extend the definition of obscenity to shocking content; the definition may be adjusted only where the regulated core is “sexual conduct”).
[26] NetChoice v. Carr, 789 F. Supp. 3d 1200, 1225 (N.D. Ga. 2025) (qualifying age verification as a sweeping burden).
[27] Free Speech Coal., Inc. v. Paxton, 606 U.S. 461, 499 (2025).
[28] Id. at 463.
[29] See id. at 501-02 (Kagan, J., dissenting); Carr, 789 F. Supp. 3d at 1226.
[30] Id. at 495-96.
[31] See Whitney Cloud, Ellen Dew & Christopher Hooks, YouTube AI Age Estimation Tech Signals New Compliance Standards, BL (Sept. 10, 2025, 16:30 ET), https://news.bloomberglaw.com/legal-exchange-insights-and-commentary/youtube-ai-age-estimation-tech-signals-new-compliance-standards (on file with the American University Business Law Review).
[32] See id.; see also NetChoice v. Carr, 789 F. Supp. 3d 1200, 1224 (N.D. Ga. 2025) (noting compliance efforts may incentivize intrusive methods like facial recognition and de-anonymization).
[33] See Cloud, supra note 31 (noting a fractured legal landscape results in rising compliance costs).
[34] See id.; Carr, 789 F. Supp. 3d at 1227 (warning of loss of business when ceasing operations in a state is the most viable compliance measure).
[35] See Carr, 789 F. Supp. 3d at 1225 (concluding compliance requires large investment in data capabilities and creates cybersecurity risks); Cloud, supra note 31 (arguing litigation is likely to increase as the Free Speech Coalition holding extends to new statutes making corporations responsible for protecting minors online).
