California’s Age-Appropriate Design Code (AADC) Is Completely Unconstitutional (Multiple Ways)-NetChoice v. Bonta


I don’t normally start my blog posts with a meme, but this one tells you everything you need to know:

[mean girls meme]

* * *

This blog post concerns the California Age-Appropriate Design Code (AADC), passed by the California legislature in 2022. The law cloned-and-revised the UK’s AADC. As the judge says resignedly, “Taking these provisions directly from a law enacted in the United Kingdom, the California Legislature left it to the courts to pass the CAADCA through the filter of our First Amendment.”

The AADC won’t pass through that filter unscathed. Like most Segregate-and-Suppress laws, the law took an omnibus approach to drafting, tossing dozens of bad policy ideas into the legislative blender. Among other attributes, the AADC combined an age authentication requirement with restrictions on businesses’ abilities to publish content to both segregated children and adults. This approach created a smorgasbord of policy and doctrinal problems. In my view, the law was plainly unconstitutional from the get-go, but neither the California legislature nor Gov. Newsom cared.

Last year, the Ninth Circuit struck down a key part of the AADC that required businesses to prepare Data Protection Impact Assessments (“DPIAs”) about their offerings and share those with the state AG. The Ninth Circuit left the constitutionality of the law’s other parts open for review on remand. Unsurprisingly, on remand, the district court declared the rest unconstitutional. I assume the Ninth Circuit will get a second chance to review the district court’s ruling. This is a well-constructed opinion that I feel should do well on review.

* * *

First Amendment.

Protected Speech

The CAADCA regulates “business[es] that provide[] an online service, product, or feature likely to be accessed by children.” The state introduced an expert report acknowledging that the law covered “children’s online access to video games, video streaming services, social media, and educational content.” The court responds plainly: “Those types of products and services traditionally have enjoyed First Amendment protections.”

Content-Based Regulation

The court says the law is content-based because of the definition of who must comply with the law: “Businesses that provide online services, products, or features ‘likely to be accessed by children’ are subject to heightened regulation, while other businesses are not….Application of [the statutory criteria] to determine whether a particular business’s online offerings are likely to be accessed by children unavoidably requires an evaluation of content.” I agree with the court, though I note that a different flaw of the law is that almost every business’s website is likely to be accessed by minors, at least occasionally, so the law is overbroad.

The court summarizes its legal standard: “where a statute’s gateway coverage definition divides the universe into covered and uncovered business based on the type of content they publish, those statutes are content-based and subject to strict scrutiny.”

🚨🚨 EVERY SEGREGATE-AND-SUPPRESS LAW DOES THIS BY DEFINITION. In other words, the point of most “child safety” laws is to segregate children for special content restrictions. This court’s standard would ensure strict scrutiny for the entire regulatory genre.

The state tried the ol’ “but kids” argument:

The State argues that the CAADCA is not intended to regulate any particular expressive content, but rather is intended to protect children from harm. In essence, the State argues that it may burden speech directed to, or likely to be accessed by, children because its intentions are good.

As my forthcoming Segregate-and-Suppress article will explain, we should stop assuming a legislature’s “intentions are good” when the legislature adopts censorial tactics that hurt both children and adults. Instead, we should assume the opposite: that the legislature was either (a) indifferent about whether it protected kids or not, and disregarded substantial evidence highlighting the harms of taking tools and content away from children, or (b) using kids as a political prop to advance an entirely different goal than protecting kids. Neither motivation deserves the presumption of “good intentions.”

The court responds to the state’s argument: “the State’s intentions in enacting the CAADCA cannot insulate the Act from the requirements of the First Amendment.” This is true, though the state’s intentions might undercut the act if there’s a means-ends mismatch.

Strict Scrutiny Applies to the Statute as a Whole

The state wanted NetChoice to prove its case provision-by-provision. The court responds:

This Court finds that applying strict scrutiny to the Regulatory Provisions as a whole, and applying Moody in that context, is the better approach. Under well-established precedent, a plaintiff’s showing that a statute is content-based shifts the burden to the State to show that the statute is narrowly tailored to promote a compelling Government interest

The court says Moody didn’t change this approach because Moody didn’t deal with a defective definition that infects the entire statute.

(Note: to be clear, I think that the statutes at issue in Moody had defective definitions that should have triggered strict scrutiny on their own, but those definitions weren’t the crux of the issues addressed by Moody).

Because of the statute’s defective definition of regulated entities:

NetChoice has shown that the vast majority (if not all) of such online services, products, and features will involve expressive content. Thus, even assuming the full sweep of the Act’s Regulatory Provisions, collectively, may apply to some online services, products, and features that do not implicate protected speech, any such applications of the Regulatory Provisions necessarily will be substantially outweighed by the applications of the Regulatory Provisions that do implicate protected speech. And the Supreme Court has made clear that imposing regulations on protected speech solely because the speech is directed at children is impermissible unless the regulations pass strict scrutiny. [cite to Brown]

Application of Strict Scrutiny

“The State does not even attempt to satisfy strict scrutiny.”

[laughing Leo meme]

To appeal-proof the opinion, the court nevertheless tries to make the state’s best argument for it. The court says that “protecting the privacy and well-being of minors” should be a compelling state interest. However, the means-ends analysis will fail. The state introduced an expert’s declaration that the Internet harms kids. NetChoice introduced an expert’s declaration (from Christopher Ferguson) saying that the causation element isn’t proven. The court treats the battle of experts as good enough to conclude the state didn’t meet its burden to show the AADC will prevent children from being harmed online.

The court continues:

Record evidence describes the difficulties of age estimation, and the threats to users’ privacy inherent in age estimation. If a business chooses not to take on the burden of age estimation, it must apply the privacy and data protections designed for children to all content accessed by children or adults. The State has not even attempted to explain why its interest in protecting children could not be served by a less restrictive statute….

A regulation that focuses on “[t]he emotive impact of speech on its audience” is content-based and therefore must be drawn as narrowly as possible

Consider the implications of the last sentence on the laws purporting to protect children from “addictive” social media.

Due to the defective definition, the court says the Moody standards for a preliminary injunction are satisfied: “the application of strict scrutiny is grounded in the content-based definition of a covered business, which applies in all applications of the Act.”

The statute is enjoined, so the court could stop there. Nevertheless, the court appeal-proofs the opinion by thoroughly reviewing other principles.

Strict Scrutiny of Obligation to Enforce Stated Policies

The statute requires businesses to enforce their stated policies. The court rips this to shreds:

Requiring a business to enforce its own published content policies and community standards to the satisfaction of the State would burden the business’s right to exercise its editorial judgment whether to permit or prohibit any given content in any given instance.

💥💥💥 This is what I’ve been saying in the context of compelled editorial transparency. If the state can second-guess a publisher’s editorial decisions, the publisher will change their decision-making process from trying to satisfy their audience to trying to anticipate the needs of, and please, the regulator. Cf. the many businesses anticipatorily complying with the MAGA agenda.

Strict Scrutiny of Age Estimation Requirement

The State contends that estimating a user’s age is conduct, not expressive activity. But this provision does not merely require a covered business to estimate age. The business must estimate a user’s age “with a reasonable level of certainty appropriate to the risks” to that user. The legislature contemplated that businesses would consider “the unique needs of different age ranges, including the following developmental stages: 0 to 5 years of age or ‘preliterate and early literacy’; 6 to 9 years of age or ‘core primary school years’; 10 to 12 years of age or ‘transition years’; 13 to 15 years of age or ‘early teens’; and 16 to 17 years of age or ‘approaching adulthood.’” By requiring a business to estimate age for the purpose of determining what content is appropriate for that age, the CAADCA imposes limits on the content a covered business may publish and the content each user may view. In the alternative, all content must be sanitized to comport with the highest risk level, presumably, the youngest children. Imposing restrictions of that nature with respect to content published to, and accessed by, both children and adults would trigger strict scrutiny of this provision

Yes! This is segregate-and-suppress, in a nutshell. Plus, there are even more problems with this provision. First, it’s unclear that the age authentication technology is precise enough to make the statute’s fine age gradations. I can guarantee you the legislature had no clue how businesses would actually achieve these accuracy rates. Second, and more importantly, segregating kids by age doesn’t ensure that the newly segregated age cohorts have common interests. As my Segregate-and-Suppress paper will explain, there are many inevitable cohort conflicts for reasons other than age. Treating these age cohorts as having common enough interests is a categorical error.

[Kelce child privacy meme]

As for the means-ends analysis, the court says:

The State’s argument is grounded in an assumption that greater data privacy for children means greater security and well-being. As NetChoice points out, however, the State ignores that the age estimation requirement will require businesses to collect private information that users may not wish to share. One of NetChoice’s declarants, Stacie Rumenap of the nonprofit organization Stop Child Predators, opines that the practical effect of the CAADCA’s age estimation requirement is that businesses will gather and create “a trove of sensitive data” regarding children…

Dr. Egelman’s [the state’s expert] report does not explain how using third parties for age estimation would address the intrusive aspects of that process described by Ms. Rumenap, nor does Dr. Egelman’s report suggest that the prototype system described in the French report is widely available to covered businesses in the United States.

The AADC had an escape clause: “If a business chooses not to estimate the age of its users, it must apply the privacy and data protections afforded to children to all consumers.” The court rightly rejects this safety valve: “the Supreme Court has made clear that under the First Amendment, a state ‘could not reduce the adult population . . . to reading only what is fit for children.’”

The court then explains that it can enjoin this provision per Moody: “Every covered business will be forced to choose between intruding into user privacy, thereby chilling publication of and access to protected speech, or publishing only child-appropriate content, thereby restricting access to protected speech for users of all ages.”

The court’s analysis could be overridden by the imminent Supreme Court opinion in FSC v. Paxton.

Strict Scrutiny of Other Provisions

The court says other challenged provisions are sufficiently amorphous that they are not amenable to a facial challenge.

Vagueness Challenges

  • Policy enforcement: The provision requiring businesses to enforce their stated policies “would grant the State virtually unfettered discretion to enforce as law a variety of subjective content policies that might be viewed differently by the State than by the policies’ creators.”
  • No detrimental actions: “The challenged terms have no established meaning and the CAADCA provides no guidance….The State does not offer a proposed definition or means for determining how to decide what is ‘detrimental’ to a child’s ‘physical health, mental health, or well-being.’”
  • Act in best interests of children: “The phrase ‘best interests’ is not defined in the Act, and NetChoice argues that it has no established meaning in this context. The State argues that ‘best interests’ of a child is a legal term of art that is well-established in family law….Those are specialized proceedings, however, in which finite custodial or dependency options must be considered by the court as to a particular child, on a particular factual record. A state court’s application of the ‘best interest’ standard in those specialized proceedings provides no useful guidance as to how a covered business should understand what the ‘best interests of children’ generally means as used in the CAADCA.” This is a powerful ruling because the statutory requirement sounds so intuitive–just act in the children’s best interests. But this is not possible for a third-party vendor that doesn’t know the children. The family law analogy was mockable given the due process differences.
  • No dark patterns: It is “difficult to say that businesses have reason to know what effect a feature like autoplay or push notification will have on children generally.”
  • Age estimation: “the provision is less than precise with respect to the degree of certainty with which age estimation must be performed,” but it provides fair notice.

Severability

The court says the DPIA provisions aren’t volitionally severable from the rest of the challenged provisions, even if they are functionally severable.

Section 230 Preemption

“One business’s publication of content may implicate Section 230 while another business’s publication of the same content may not, depending on each business’s role (or lack thereof) in creating the published content….While it may be appropriate for a covered business to raise Section 230 in response to the State’s attempted enforcement of the CAADCA in a particular case, NetChoice has not shown that it is likely to succeed on its facial challenge to the CAADCA as preempted by Section 230.”

COPPA Preemption

“It is not clear that the cited provisions of the CAADCA contradict those of COPPA, or would stand as an obstacle to enforcement of COPPA.”

Dormant Commerce Clause

“The briefing on these issues, comprising approximately one page per party, and the record regarding potential extraterritorial applications of the Act, are insufficient for the careful analysis that would be required before the Court could find the Act facially invalid under the dormant Commerce Clause.”

* * *

Judge Beth Labson Freeman did an exceptionally good job with this opinion. The opinion wrestles with the doctrinally challenging problems in a cogent and persuasive way. This opinion ought to persuade and impress on appeal.

The opinion’s final scorecard:

  • The AADC is categorically unconstitutional because of its content-based definition.
  • The policy enforcement and age estimation provisions fail strict scrutiny.
  • The policy enforcement, no detrimental actions, best interests of children, and dark patterns provisions are void for vagueness.
  • The challenged provisions are not severable from the unconstitutional DPIA provision.

An impressive quadfecta of constitutionality problems. Asm. Wicks (the bill sponsor), the California legislature, and Gov. Newsom should be terribly ashamed of their work (but you know they aren’t).

I note that the court struck down the policy enforcement requirement two different ways (speech restriction and vagueness). Similar arguments come up often in private litigation, where plaintiffs claim that the service publicly announced an editorial policy but didn’t follow through. Often, those claims run into Section 230 problems, but the Ninth Circuit keeps trying to kick them out of Section 230 by saying those claims are “promise-based” (see, e.g., the problematic YOLO opinion). This ruling highlights that even if the plaintiffs get around Section 230, the claims raise significant First Amendment problems, in two ways. We don’t often see the First Amendment raised as a defense to those private claims, but this opinion might change that.

This opinion says that age authentication obligations are categorically unconstitutional online as speech restrictions. I agree, as I will lay out in detail shortly, but I note that the Free Speech Coalition v. Paxton decision could upend our expectations. I don’t expect the Supreme Court to uphold the Texas law, but the Supreme Court could still reset the evaluative standards.

Before the constitutionality of the AADC was resolved, the California legislature doubled down and passed SB 976, another Segregate-and-Suppress law that purports to restrict children’s access to “addictive” services. In a surprising ruling, the district court upheld some parts of SB 976. This AADC opinion undercuts those conclusions, and SB 976 generally, in multiple ways, including:

  • SB 976 uses the same kind of content-based definition of covered services that the AADC court treats as categorically unconstitutional.
  • In the 976 opinion, the court deferred reviewing the age authentication requirement because it wasn’t ripe. The AADC opinion indicates age authentication will always be an unconstitutional speech restriction, no matter how it’s implemented.
  • Regarding the age authentication privacy issues, the 976 opinion treated that as a constitutional feature, meaning that services could data-mine private information to estimate age. The AADC opinion says that the privacy issues are part of the problem, not part of a solution.
  • If 976 is content-based, then the 976 court improperly kept the burden of proof with the challengers rather than shifting it to the state.
  • The AADC court said that the state has higher burdens if the state is trying to regulate emotional responses to content publication, which is exactly what 976 does (the efforts to prevent “addiction”).
  • The AADC court said that it was impermissibly vague to ban actions “detrimental” to children or not in their “best interests.” The whole concept of “addiction” should be subject to the same vagueness concerns.

I hope the initial SB 976 opinion’s deficiencies get fixed in subsequent proceedings.

Finally, a reminder: Gov. Newsom publicly chewed out NetChoice for challenging the AADC because he felt NetChoice should care more about the kids (as if the kids and affected adults have no constitutional rights 🤷‍♂️). In other words, Newsom thought that NetChoice should let an unconstitutional law stand because….unconstitutional laws are cool with him? Newsom’s public chastisement of NetChoice looks even more terrible as the courts keep striking down the AADC. (Not that Newsom is exactly rehabilitating his reputation as he continually expresses MAGA-curious sympathies on his podcast series).

Case Citation: NetChoice, LLC v. Bonta, 2025 WL 807961 (N.D. Cal. March 13, 2025)

* * *

BONUS: CCIA v. Uthmeier, Case No.: 4:24cv438-MW/MAF (N.D. Fla. March 13, 2025). Another challenge to a segregate-and-suppress law. “Plaintiffs must show that at least one specific member of theirs is likely regulated by the law. They have not done so.” The court acknowledges that it was essentially requiring the companies to admit to facts that would create liability under the law if it’s constitutional. The judge responded 🤷‍♂️:

assuming, for the sake of argument, that the Florida Legislature designed the law with some of Plaintiffs’ members in mind and that the Attorney General believes some of Plaintiffs’ members are likely covered by it, the law can only be enforced against one of Plaintiffs’ members if its platform meets each of the four specific coverage criteria. Plaintiffs have failed to demonstrate that any of its members meet these coverage requirements, and there is no reason to believe that the Legislature or the Attorney General has any more information about whether or which of Plaintiffs’ members meet those coverage criteria than Plaintiffs or their members do.

* * *

Prior AADC coverage

My standard meme for AADC-related blog posts:

[van on fire in Arkansas meme]
