'Negligent' Design Choices

Amicus Briefs Back Bonta's Opposition to Injunction in Social Media Case

Section 230 of the Communications Decency Act doesn't preempt the California Age-Appropriate Design Code (AADC), which addresses risks to children arising from the data management practices of social media platforms, said a Friday amicus brief (docket 5:22-cv-08861) filed in U.S. District Court for the Northern District of California in San Jose. The Electronic Privacy Information Center, Reset Tech, Facebook whistleblower Frances Haugen and a bipartisan roster of former elected and appointed state and federal government officials signed onto the brief.

Amici, with their “decades of experience evaluating state and federal legislative proposals,” believe the AADC’s approach to tech regulation is consistent with Section 230 and the First Amendment, said the brief. Calling the AADC a “sensible and effective means of confronting tech’s harmful business model,” amici said their ultimate interest is in ensuring the court’s judgment about the law is based on a holistic understanding of how the statute fits into the broader picture of privacy and technology regulation.

The law imposes liability for a platform’s own conduct, not for third-party conduct, said the brief. The use of children’s data to design online services is similar to conduct that courts have increasingly found outside Section 230’s shield, they said. In Lemmon v. Snap, plaintiffs alleged harms from an incentive system within Snapchat that “encouraged its users to … drive at speeds exceeding 100 MPH,” a “legally-cognizable injury caused by negligent design of the incentive systems in the platform itself,” said the brief. “When lawsuits allege harms caused by platforms’ own negligent design choices, a platform cannot invoke Section 230 liability,” they said.

In addition, the AADC’s data protection and privacy requirements don’t effect an unlawful prior restraint, said the brief, noting the impact assessments required by the law are “common in regulatory frameworks” in the U.S. and worldwide. Requiring a company to assess and mitigate risks of harmful data processing “is not an unconstitutional prior restraint,” it said. The required assessments are “commonplace risk mitigation mechanisms” already used by large companies, it said.

In another amicus brief filed Friday, children’s media watchdog Fairplay, the Public Health Advocacy Institute and others said enjoining the AADC would harm young people in California, citing social media as a “major contributor” to a “crisis in youth mental health” in the U.S.

Amici noted the “intentional design features” of the digital world and their correlation to youth mental health issues today. Though the Children's Online Privacy Protection Act (COPPA) was an “important step” in addressing issues kids confront online, it hasn't “been adequately enforced,” said the brief. Tech companies “exploited loopholes, and [COPPA] fails to address the issues confronting youth today,” it said.

The California legislature passed the AADC in 2022, soon after the surgeon general’s advisory about the mental health crisis among youth, to which social media is a contributor, noted the Fairplay brief. No such crisis existed when COPPA was passed in 1998 because the products and technologies instrumental in causing the harms hadn’t been developed, it said. Kids weren’t spending six to eight hours a day on internet-accessible devices, and there was no Facebook, Snapchat, TikTok or YouTube “keeping them constantly engaged with addictive product features.”

Fairplay was one of 20 organizations that filed a petition for rulemaking with the FTC in November, seeking a rule to prohibit the use of certain types of “engagement-optimizing design practices” on individuals under 18, said the brief. The petition outlined three categories of design practices: variable rewards features that reward minors “unpredictably” to keep them on a service; features that manipulate navigation, making it difficult to freely navigate or cease use of a service; and social manipulation features.

The briefs were filed as the court considers plaintiff NetChoice’s motion for a preliminary injunction in the First Amendment case against California Attorney General Rob Bonta (D). In its February motion, NetChoice called California’s age-appropriate social media design law (AB-2273) “the most extensive attempt by any state to censor speech since the birth of the internet.”

Bonta’s response to NetChoice’s motion (see 2304240032) last month said nothing in the AADC restricts the content that businesses can provide to minors, and “any incidental effect the Act may have on businesses’ speech is justified by the State’s compelling interest in children’s welfare.”

COPPA requires online businesses to protect the personal information of children, but only where the platform is “directed towards” children under 13, Bonta noted. Unless a website identifies as one that targets children, it only has to “prevent the disclosure of personal information from visitors who identify themselves as under age 13” without parental consent. Though businesses can’t share or sell the personal data of users under 16, there’s no “comprehensive law” protecting against the collection and use of kids’ data, he said.