Ninth Circuit:
Reporting Rules for Platforms Likely Violate Constitution
Opinion Says Preliminary Injunction Is Warranted to Enjoin Enforcement of Bill Requiring Social Media Companies to Submit Reports to State About Content-Moderation Policies
By Kimber Cooley, associate editor
The Ninth U.S. Circuit Court of Appeals held yesterday that a California law requiring large social media companies to provide semiannual reports to the state detailing the platforms’ content-moderation policies should be preliminarily enjoined as it is unlikely to survive a First Amendment challenge.
At issue is Assembly Bill 587, codified at Business and Professions Code §22675 et seq., which the Legislature enacted in September 2022.
Sec. 22677(a) requires the reporting of the platforms’ terms of service, including any definitions employed by the companies as to certain listed categories such as “hate speech or racism,” “disinformation or misinformation,” and “extremism or radicalization.” The opinion refers to the reporting requirements of the bill as the “Content Category Report provisions.”
In September 2023, X Corp. (formerly known as Twitter) filed a complaint against Attorney General Rob Bonta asserting free speech violations under the federal and state constitutions, among other challenges. The complaint alleges:
“AB 587 also imposes tremendously burdensome requirements on social media companies, requiring them to keep records about potentially hundreds of millions of content moderation decisions made on a daily basis….Worse yet, it threatens draconian financial penalties of up to $15,000 per violation per day if compliance is not made in ‘reasonable, good faith,’ a term that the statute does not define and that gives the Attorney General nearly unfettered discretion to threaten to impose draconian fines if social media companies’ content moderation policies are not to the State’s liking.”
In October 2023, the social media giant filed a motion seeking a preliminary injunction enjoining the enforcement of the bill.
Shubb’s View
District Court Judge William B. Shubb of the Eastern District of California denied the motion for preliminary relief, finding that X Corp. failed to establish a likelihood of success on the First Amendment violation. The judge reasoned:
“The reports required by AB 587 are purely factual. The reporting requirement merely requires social media companies to identify their existing content moderation policies, if any, related to the specified categories….The statistics required if a company does choose to utilize the listed categories are factual, as they constitute objective data concerning the company’s actions. The required disclosures are also uncontroversial. The mere fact that the reports may be ‘tied in some way to a controversial issue’ does not make the reports themselves controversial.”
Shubb, applying the rational basis scrutiny applicable to purely factual commercial speech, concluded that “[w]hile the reporting requirement does appear to place a substantial compliance burden on social media companies, it does not appear that the requirement is unjustified or unduly burdensome within the context of First Amendment law.”
Circuit Judge Milan D. Smith Jr. authored the opinion reversing the denial and remanding with instructions to enter a preliminary injunction. Smith wrote:
“[W]e hold that the Content Category Report provisions likely compel non-commercial speech and are subject to strict scrutiny, under which they do not survive. We reverse the district court on that basis….We remand to the district court to determine in the first instance whether the Content Category Report provisions are severable from the remainder of AB 587, and if so, which, if any, of the remaining challenged provisions should also be subject to the preliminary injunction.”
Circuit Judges Mark J. Bennett and Anthony D. Johnstone joined in the opinion.
Facial Challenge
Smith noted that neither the parties nor Shubb spent time discussing the requirements for a facial challenge to the reporting requirements of the bill but said:
“Nevertheless, we conclude that a facial challenge is permissible here. That is because all aspects of the Content Category Report, in every application to a covered social media company, raise the same First Amendment issues….[E]very Content Category Report must detail the company’s policies and actions concerning certain state-specified categories of content (even if only to detail the company’s decision not to define the enumerated categories of section 22677(a)(3)). In effect, the Content Category Report provisions compel every covered social media company to reveal its policy opinion about contentious issues, such as what constitutes hate speech or misinformation and whether to moderate such expression.”
He concluded that “[w]e therefore proceed to consider whether the Content Category Report provisions are likely to survive X Corp.’s First Amendment facial challenge.”
Strict Scrutiny
The jurist pointed out that courts distinguish between content-neutral and content-based regulations, the latter of which are presumptively unconstitutional and may be justified only if they survive strict scrutiny. Regulation of purely factual and uncontroversial commercial speech is generally deemed to be content-neutral and subjected only to rational basis review.
Turning to the regulations at issue, Smith wrote that “the compelled disclosures are not advertisements” and “a social media company has no economic motivation in their content.”
Under these circumstances, he reasoned:
“Here, the Content Category Reports are not commercial speech. They require a company to recast its content moderation practices in language prescribed by the State, implicitly opining on whether and how certain controversial categories of content should be moderated. As a result, few indicia of commercial speech are present in the Content Category Reports.”
Smith continued:
“The Content Category Report provisions would require a social media company to convey the company’s policy views on intensely debated and politically fraught topics, including hate speech, racism, misinformation, and radicalization, and also convey how the company has applied its policies. The State suggests that this requirement is subject to lower scrutiny because ‘it is only a transparency measure’ about the product. But even if the Content Category Report provisions concern only transparency, the relevant question here is: transparency into what? Even a pure ‘transparency’ measure, if it compels non-commercial speech, is subject to strict scrutiny.”
He reasoned that “[b]ecause the provisions are content-based, which the State does not contest, they are subject to strict scrutiny.”
Not Narrowly Tailored
Noting that strict scrutiny is a demanding standard, Smith opined:
“At minimum, the Content Category Report provisions likely fail under strict scrutiny because they are not narrowly tailored. They are more extensive than necessary to serve the State’s purported goal of ‘requiring social media companies to be transparent about their content-moderation policies and practices so that consumers can make informed decisions about where they consume and disseminate news and information.’ Consumers would still be meaningfully informed if, for example, a company disclosed whether it was moderating certain categories of speech without having to define those categories in a public report. Or, perhaps, a company could be compelled to disclose a sample of posts that have been removed without requiring the company to explain why or on what grounds.”
He declared that “[b]ecause X Corp. has a colorable First Amendment claim, it has demonstrated that it likely will suffer irreparable harm,” weighing in favor of a preliminary injunction.
The judge said that the balance of equities and the public interest also favor immediate injunctive relief due to the likely constitutional violation and declared that “we reverse the district court’s decision denying a preliminary injunction as to AB 587’s Content Category Report provisions.”
The case is X Corp. v. Bonta, 24-271.
This case follows the August 2024 Ninth Circuit decision in NetChoice LLC v. Bonta, also authored by Smith, which found that preliminary injunctive relief was warranted to prevent the enforcement of provisions of the California Age-Appropriate Design Code Act requiring content providers to prepare reports for the state identifying any risk of “material detriment to children that arise from [the content providers’] data management practices.”
Copyright 2024, Metropolitan News Company