Eugene Volokh

Ninth Circuit Blocks Default Restrictions on "Like Counts" for Minors' Social Media Accounts, Upholds Default of "Private Mode"

From today's panel opinion in NetChoice, LLC v. Bonta, by Ninth Circuit Judge Ryan D. Nelson, joined by Judges Michael Daly Hawkins and William A. Fletcher:

Addressing the growing concern that our youth are becoming addicted to social media, California passed a law regulating how internet platforms allow minors to access personalized recommendation algorithms. NetChoice sued, arguing that the law violates the First Amendment. The district court preliminarily enjoined some provisions but largely left the law in place. NetChoice appeals the district court's denial of injunctive relief. With one exception, we affirm the district court.

There's a lot going on there, and some of the analysis turns on procedural or remedial features of the case, but here's a substantive First Amendment analysis as to one facet of the law:

NetChoice also raises an as-applied challenge to the Act's requirement that minors' accounts operate with certain default settings, which can be turned off by a parent. Two such default settings are at issue: (1) that covered platforms cannot show minors the number of likes or other feedback on a post; and (2) that minors' accounts must be on "private mode" ….

We disagree that the whole Act is content based, but agree that the like-count provision itself is….

First, the district court correctly concluded the Act's exception from coverage of websites "limited to commercial transactions or to consumer reviews" is not content based. While a close question, we agree. In City of Austin, the Supreme Court rejected "the view that any examination of speech or expression inherently triggers heightened First Amendment concern." City of Austin instead recognized an implication of this rule that governs here: Statutes that classify and single out solicitation "require some evaluation of the speech and nonetheless remain content neutral."

The Court's definition of "solicitation" is instructive. It is "speech 'requesting or seeking to obtain something' or '[a]n attempt or effort to gain business.'" And "the Court has reasoned that restrictions on solicitation are not content based." Sitting en banc, we reiterated and elaborated on this solicitation carveout earlier this year.

Although the Act does not define "commercial transactions" or "consumer reviews," the ordinary meaning of those terms suggests that they amount to commercial solicitation as City of Austin and Project Veritas discussed the term. This exception's description of "[a]n internet website, online service, online application, or mobile application for which interactions between users are limited to commercial transactions or to consumer reviews of products" simply describes websites "requesting or seeking to obtain something" or "attempt[ing] … to gain business" online. Thus, the exception categorizes websites along lines that have been affirmed as content neutral…. Thus, the Act "applies evenhandedly to all who wish to distribute and sell" online.

NetChoice also argues that the Act's focus on social media makes the entire Act content based. We disagree.

The Act applies to any internet website "including, but not limited to, a 'social media platform'" that personalizes feeds based on information provided by the user. A "social media platform" is a service whose "substantial function" is to facilitate social interaction…. California's use of "social media" platform as statutory shorthand does not render the Act content based, since it applies to websites whether they facilitate social interaction or other forms of content. So neither the commercial-transactions exception nor the Act's focus on "social media" platforms makes the Act as a whole content based….

That said, the regulation of like counts in particular is independently content based. Like counts are "speech with a particular content." The Act prohibits platforms from describing posts based on "the idea or message expressed" by the description. Reed v. Town of Gilbert. A platform may show a post to a minor. And it may presumably tell that minor that other users have interacted with it. But it cannot tell the minor the number of likes or feedback that the post has received. Thus, whether the Act restricts a website's description of a post turns on what message the description will communicate. That is content discrimination.

As a result, strict scrutiny applies to this provision. And the like-count default setting is not the least restrictive way to advance California's interest in protecting minors' mental health [as we held just last year] … in NetChoice v. Bonta (9th Cir. 2024). Here, as in that case, California could encourage websites "to offer voluntary content filters" related to like counts or educate children and parents on such filters. We see no basis to distinguish that recent case. So we conclude that NetChoice is likely to prevail on the merits of its challenge to the like-count provision as applied to its members….

We next address the as-applied challenge to the private-mode default setting. In private mode, only users connected to a minor's account (being "friends," for example) can view or interact with that minor's posts….

This restriction may be speaker based. But not all speaker-based laws are subject to strict scrutiny. A speaker preference is problematic only if it "reflects a content preference." After all, speaker-based distinctions are suspect only because they "are all too often simply a means to control content." …

The private-mode provision does not "reflect[ ] a content preference." … The private-mode default is agnostic as to content and therefore need only survive intermediate scrutiny.

It does so. While not perfectly tailored, this restriction is narrowly tailored. It is not underinclusive enough to raise "doubts about whether the government is in fact pursuing" the asserted interest. In private mode, minors cannot conform their social media habits to maximize interaction and approval of a worldwide audience. This logically serves the end of protecting minors' mental health by reducing screentime and habit-forming platform usage. The provision may allow minors "to communicate with unconnected users on other types of services." But contrary to NetChoice's contention, that does not mean that the Act is so "riddled with exceptions" that it raises doubts about whether California is trying to mitigate the addictive nature of platforms that provide personalized feeds.

Neither is the provision so overinclusive as to make it "substantially broader than necessary" to achieve California's interest. True, the requirement "applies to all covered websites and minor users, regardless of why they are using a particular service." But California's interests are wide-ranging. And California took a relatively nuanced approach. So the district court did not err by declining to enjoin the private-mode default setting provision….
