
Overview Remarks
The court notes the tensions around allowing children to access social media:
- “Minors’ access to information is essential to their growth into productive members of our democratic public sphere”
- “There are a number of good things to be said about social media platforms…The ability to form and to maintain social connections on these online platforms can have positive effects on children and adolescents, especially those who routinely feel marginalized or excluded. In addition to enabling peer connection and social support, social media can also promote positive and identity-affirming content and encourage behaviors that proactively address mental health care.”
- “these benefits come with a cost. Extensive social media use also has been linked to higher rates of depression, anxiety, and loneliness among young people.”
Striking a balance between these considerations is a challenge: “lawmakers across the nation have sought to regulate online entities and cabin their deleterious effects, particularly on the youth. The task is an admittedly daunting one: Legislatures must craft laws broad enough to accommodate the pace with which digital platforms evolve, but not so broad that they cripple innovation or trample free speech.” I don’t think legislatures can find any constitutional way of navigating this dilemma given the perniciousness of mandatory age authentication.
Can NetChoice Bring a Facial Challenge?
The court says Ohio’s law is amenable to a facial First Amendment challenge:
in every application to a covered website, the Act raises the same First Amendment issues. Specifically, whether it is a NetChoice member product—YouTube; Facebook and Instagram; X; Nextdoor; Pinterest; or Dreamwidth—or other covered websites that provide the “socially interactive” features outlined in the Act and considered to be “target[ing]” minors or likely to be accessed by minors—like Goodreads, Soundcloud, Substack, Yelp, or LinkedIn—they are all under the same statutory obligation to block access to unemancipated Ohio minors absent parental consent
From my perspective, mandatory age authentication should always be subject to a facial challenge because it categorically harms minors, adults, and publishers, regardless of the perniciousness of the suppression obligation.
Does the Act Impact Protected Speech?
The court says:
the Act’s First Amendment implications come into focus when social media operators are thought of as publishers of opinion work—a newspaper limited to “Letters to the Editor,” or a publisher of a series of essays by different authors. The analogy is an imperfect one—social media operators are arguably less involved in the curation of their websites’ content than these traditional examples. But the comparison helps clarify that the Act regulates speech in two consequential ways: (1) it regulates operators’ ability to publish and distribute speech to minors and speech by minors; and (2) it regulates minors’ ability both to produce speech and to receive speech….
These principles control here. The Act impedes minors’ ability to engage in and access speech by requiring covered websites to obtain parental consent before allowing any unemancipated child under the age of sixteen to register or create an account on their website. That means minors’ ability to contribute or access “a wide array of protected First Amendment activity on any number of diverse topics,” will be contingent on securing parental consent—an impermissible curtailment of their First Amendment rights.
I disagree with this characterization: “social media operators are arguably less involved in the curation of their websites’ content than these traditional examples.” A publisher who engages in light curation is making just as much of an editorial choice as one who engages in heavier curation. There is no “better” or “worse” level of curation; there are just different choices along a continuum. I also think the Moody decision clearly validated the protected editorial nature of social media services’ curation.
As I explain in my Segregate-and-Suppress paper, parental consent requirements are conditional access restrictions (i.e., access is conditioned on parental consent). If parents say no, then the restriction becomes categorical for those minors. This issue infects all parental-control obligations involving publishers.
As it did at the preliminary-injunction stage, the state pressed its argument that the law merely governs children’s ability to enter into contracts. The court again rejects that framing:
if a website “targets children” or is “reasonably anticipated to target children,” and offers socially interactive features—like the ability to interact with other users; construct a public or semipublic profile; populate a list of other users with whom an individual “shares or has the ability to share a social connection”; create or post visible content “including on message boards, chat rooms, video channels, direct or private messages or chats, and a landing page or main feed that presents the user with content generated by other users”—then that website faces a dichotomy: (1) either minors secure parental consent and gain “access to” and “use of” all the speech on covered websites; or (2) minors do not secure parental consent and are denied “access to” and “use of” the covered websites. These provisions do not strike at the commercial aspect of the relationship between covered websites and their users, they tackle the social speech aspect of it…Because the Act implicates protected speech, at least to some degree, it is not subject to the deferential rational basis standard of review.
Content-Based Restrictions
The court says the law is content-based, not content-neutral:
covered websites’ choices about whether and how to disseminate user-generated expression “convey a message about the type of community the platform seeks to foster.”…Among the messages the Act regulates are the ideas that: (1) user-generated content is not less valuable than speech authored by the websites themselves; and (2) social interactions and connections (as compared to other types of interactions, such as business interactions) have unique value for online communities…
The exceptions to the Act for product review websites and “widely recognized” media outlets are also content based….a product review website is excepted, but a book or film review website is presumably not. The State is therefore favoring engagement with certain topics, to the exclusion of others. That is plainly a content-based exception deserving of strict scrutiny.
This highlights the Goldilocks problem of regulating social media. Without exceptions for some publishers, any definition of social media is overbroad because it will functionally apply to the full universe of UGC. With exceptions for publishers, the law prioritizes some publishers over others in ways that are difficult or impossible to justify.
Strict Scrutiny
Unsurprisingly, the law fails strict scrutiny. The state gamely tried to defend the law anyway, to no avail.
The state first argued that the law just protects kids from bad contracts. The court responds: “the Act is not narrowly tailored to protect minors against oppressive contracts. The Act regulates access to and dissemination of speech when it could instead seek to regulate the—arguably unconscionable—terms of service that these platforms require. The Act is also underinclusive with respect to this interest. For example, as NetChoice explains, a child can still agree to a contract with the New York Times without their parent’s consent, but not with Facebook.”
The state next argued that the law protects kids’ mental health. The court responds:
The record Defendant has assembled here, however, is much like the deficient record in Brown, where “nearly all of the research” showing any harmful effects “is based on correlation, not evidence of causation.” The record also does not show that the full range of “thousands” of websites covered by the Act cause harms to minors sufficient to suppress those minors’ access to protected speech.
But even if protecting children against these harms is a compelling interest, which it very well may be, the Act is not narrowly tailored to those ends…The State’s approach is an untargeted one, as parents must only give one-time approval for the creation of an account, and parents and platforms are otherwise not required to protect against any of the specific dangers that social media might pose
Finally, the state argued that the law protects parental rights. The court responded that services already offer parental-control tools.
The court’s summation:
This Court, although sympathetic to what the Ohio legislature sought to do, finds that the evidence does not establish the required nexus between the legislative concerns about the wellbeing of minors and the restrictions on speech. In other words, the Act is either underinclusive or overinclusive, or both, for all the purported government interests at stake. Ohio’s response to a societal worry that children might be harmed if they are allowed to access adult-only sections cannot be to ban children from the library altogether absent a permission slip.
This over-/under-inclusive problem pervades the segregate-and-suppress regulatory genre. It is functionally impossible to structure a law that doesn’t suffer from serious scope problems.
Void for Vagueness
In addition to the speech restriction problem, the court says the law is impermissibly vague: “the Act purports to apply to operators that ‘target[] children’ or are ‘reasonably anticipated to be accessed by children.’ On its face, this expansive language would leave many operators unsure as to whether it applies to their website….the Act also provides no guardrails or signposts for determining which media outlets are ‘established’ and ‘widely recognized.’ Such capacious and subjective language practically invites arbitrary application of the law.”
Case Citation: NetChoice LLC v. Yost, 2025 WL 1137485 (S.D. Ohio April 16, 2025)
BONUS: NetChoice LLC v. Fitch, No. 24-60341 (5th Cir. April 17, 2025). Prior blog post. A district court enjoined a Mississippi segregate-and-suppress law, issuing its opinion the same day the Supreme Court issued its Moody v. NetChoice decision. Unsurprisingly, on appeal, the Fifth Circuit reverses the decision and orders the district court to do the more thorough analysis contemplated by Moody. Some of the interesting points from this ruling:
- With respect to prudential standing, the court says: “it is plain that an online platform is not barred by prudential standing when it asserts its users’ First Amendment rights, at least when the violation of those rights adversely affects the platform.”
- “By not determining the full scope of actors regulated by the Act and the activities it regulates, the district court did not apply Moody in the manner now required.”
- “The district court also did not determine the ‘commercially reasonable efforts,’ as used in the Act, or the Act’s requirements for each DSP, requirements likely to be different with each DSP facing a unique regulatory burden.”
- Judge Ho concurs in the judgment, saying that he agrees with Justice Thomas’ dissent in Brown v. Entertainment Merchants Association that the First Amendment always steps aside when legislatures just sprinkle “but think of the kids” dust on a censorship law. However, because Justice Thomas’ view didn’t carry the day then, Judge Ho is grudgingly forced to apply the actual law. But if Trump would only appoint Judge Ho to the Supreme Court, then….
Blog Posts on Segregate-and-Suppress Obligations