According to the majority opinion in Moody v. NetChoice, LLC:
The laws, from Florida and Texas, restrict the ability of social-media platforms to control whether and how third-party posts are presented to other users … [including by] requir[ing] a platform to provide an individualized explanation to a user if it removes or alters her posts….
Analyzing whether these requirements are sound, the majority said, “means asking,” as to each kind of content-moderation decision, “whether the required disclosures unduly burden” the platforms’ own expression:
Requirements of that kind violate the First Amendment if they unduly burden expressive activity. See Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio (1985). So our explanation of why Facebook and YouTube are engaged in expression when they make content-moderation choices in their main feeds should inform the courts’ further consideration of that issue.
For more on the “main feeds” question, and on the Court’s decision not to resolve the First Amendment questions raised by the platforms’ other functions, see this post. For more on Zauderer’s “unduly burden expressive activity” standard, especially outside the commercial advertising context, see NIFLA v. Becerra (2018).
This suggests that the individualized-explanation requirements are more likely to be invalid as to decisions about what to include in the “main feeds,” and more likely to be valid as to decisions about whether to delete a post outright or ban a user. But that is not entirely clear. For a detailed examination of the laws’ practical effects, of the kind the majority’s analysis calls for, see Daphne Keller’s Platform Transparency and the First Amendment.
Justice Thomas, in his separate opinion, advocated for greater protection against speech compulsions in general:
I think we should reconsider Zauderer and its offspring. “I am skeptical of the premise on which Zauderer rests—that, in the commercial speech context, the First Amendment interests implicated by disclosure requirements are substantially weaker than those at stake when speech is actually suppressed.”
But he also joined Justice Alito’s opinion concurring in the judgment (joined as well by Justice Gorsuch), which took a less platform-friendly approach. An excerpt:
NetChoice argues in passing that its members cannot disclose how they moderate content because doing so would empower “malicious actors” and reveal “proprietary and closely held” information. But those harms are speculative. Various platforms already make similar disclosures—voluntarily and to comply with the European Union’s Digital Services Act—without catastrophic consequences. On remand, NetChoice will have the opportunity to argue whether particular disclosures are needed and whether any pertinent materials should be filed under seal. Several NetChoice members already disclose in general terms how they use algorithms to curate content….
Just as NetChoice failed to demonstrate that the States’ content-moderation provisions are facially unconstitutional, its facial challenges to the individual-disclosure provisions also fell short. Those provisions require platforms to explain to affected users the basis of each censorship decision. Because these regulations require the disclosure of “purely factual and uncontroversial information,” they must be assessed under Zauderer’s framework, which demands only that such laws be “reasonably related to the State’s interest in preventing deception of consumers” and not “unduly burde[n]” speech.
For purposes of Zauderer, a law is “unduly burdensome” if it threatens to “chil[l] protected commercial speech.” NetChoice contends that the required disclosures have that effect, leading platforms to forgo exercising editorial discretion altogether rather than explain why they remove “millions of posts per day.” …
In the lower courts, NetChoice did not attempt to show how these disclosure provisions chill each platform’s speech. Instead, it identified just one subset of one platform’s content that the laws would affect: the billions of nonconforming comments that YouTube removes each year. If YouTube uses automated processes to flag and remove those comments, it is not clear why having to disclose the bases of those processes would chill YouTube’s speech. And even if explaining each removal decision would unduly burden YouTube’s First Amendment rights, the same may not be true of all of NetChoice’s members.
NetChoice’s failure to make this broader showing is troubling, especially since it does not dispute the States’ claim that many platforms already offer a notice-and-appeal process for their removal decisions. In fact, some have advocated for such disclosure requirements. Before its change in ownership, the former chief executive officer of the platform now known as X went so far as to say that “all companies” should be required to explain censorship decisions and “provide a straightforward process to appeal decisions made by humans or algorithms.” Moreover, many platforms already furnish similar disclosures to comply with the European Union’s Digital Services Act, and complying with that law does not appear to have unduly burdened their speech in those countries. On remand, the courts may consider whether compliance with EU law has chilled the platforms’ speech….