N.C. Trial Court Rejects First Amendment Defense to "Addictive Design" Claim Against TikTok

Eugene Volokh

From State ex rel. Jackson v. TikTok Inc., decided Tuesday by Judge Adam Conrad; I'm skeptical about the analysis, for reasons I hope to blog about later (I'm writing a journal article on the subject right now), but I thought I'd pass it along:

The following background assumes that the allegations in the complaint are true [because that's what courts do when deciding on a defendant's motion to dismiss -EV].

TikTok features an array of elements allegedly designed to exploit minors' developmental immaturity and induce compulsive use. TikTok's home page (coined the "For You Page") feeds each end user videos that are algorithmically selected to maximize engagement. The algorithm, or recommendation system, performs this task by recording the user's interactions with the app (such as sharing or skipping a video), identifying behavioral patterns, comparing the user's behavior with others', and ranking videos as more or less likely to be engaging based on that comparison. This individualized feed is, in the words of ByteDance employees, "addictive."

Other design elements enhance TikTok's addictive quality. When a user opens TikTok, a video plays automatically. The user can then cycle through videos endlessly just by swiping a finger. These features—"autoplay" and "infinite scroll"—generate an immersive, seamless experience without the occasional pause that the user might regard as a natural stopping point. Of course, scrolling isn't all that the app has to offer. Filters allow users to touch up photos and videos in myriad ways; one filter called "Beauty Mode" makes facial features and hairstyles look more attractive. Various buttons and widgets also allow users to like and share videos, post comments, and follow specific content creators. The desire to amass likes and similar social rewards begets more frequent and protracted app usage.

In addition, the app sends push notifications to coax users to return to the app when they are away. Notifications arrive on a schedule most likely to get users' attention, such as late in the evening. They may highlight algorithmically selected videos and sometimes promote content that is available to view only for a short period or at a specific time, playing on users' fear of missing out to create a sense of urgency. There are also badges, which appear as a number above the app's icon and tempt the user to return by quantifying, perhaps falsely, all that they've missed while not using the app.

These design choices allegedly make TikTok addictive to minors in much the same way that, say, roulette is addictive to gamblers. One reason that roulette is so alluring is that it offers unpredictable, variable rewards. As the wheel spins, gamblers "anticipate[ ] a reward that they know could come but is tantalisingly just out of reach," and they "experience a dopamine rush" in the process.

TikTok has similar traits. Each new video in the feed is a surprise; users can scroll as long as they wish, endlessly anticipating but never knowing what they will see next. By the same token, social rewards are variable; users do not know when they will get the next notification that a viewer followed their account or liked one of their videos. For minors, this is irresistible. As one expert put it, minors "struggle 'to ignore the prospect of a dopamine reward, even when this conflicts with other essential daily activities, such as sleeping or eating.'" Tracking statistics bear this out: the average teen user opens TikTok sixteen times per day, and many teens spend more than four hours on the app daily and often wallow in lengthy, late-night binges.

TikTok addiction is bad for minors' mental health, the State alleges. ByteDance's own employees have sounded the alarm to company leaders, worrying that compulsive use of the app disturbs sleep patterns, interferes with "work/school responsibilities," "leads to deficient self-regulation," causes or compounds "anxiety," and impairs "analytical and problem-solving skill[s], memory formation, contextual thinking, conversational depth and empathy." Beyond that, employees have expressed concerns about negative effects on teens' self-image and susceptibility to eating disorders, particularly in connection with appearance-altering filters. But not everyone at ByteDance shares these concerns. One dissenter brushed them aside, asking "isn't addiction in this sense considered a very positive metric in our field?" And ByteDance's leadership has allegedly rejected these and other internal calls to make the app less addictive.

To the outside world, though, ByteDance touts its safety efforts. In advertisements and public statements, ByteDance maintains that minor accounts carry an automatic screen-time limit of sixty minutes, built-in nudges to remind minors to take breaks, and customizable tools that parents can use to control what their kids see and do. Plus, ByteDance tells users that they can escape so-called "rabbit holes" of entrancing, personalized content by refreshing their feed as if they had just opened a new account. In the same vein, ByteDance publicizes Community Guidelines that draw the lines between permissible and impermissible content. The Guidelines forbid and promise the removal of, among other things, "sexually suggestive" content created by minors and images of "gory, graphic human injuries."

But these are smokescreens, according to the State. In every case, the purported safety features either do not function as advertised (refreshing the feed lasts just a few videos before heading back down the "rabbit hole") or are so easy to disable as to be useless (teens get a passcode that they can use to override the supposed limit on screen time). Indeed, ByteDance allegedly designed these features to be ineffective while giving the illusion of mitigating compulsive use. And ByteDance disregards its own Community Guidelines. Its skeletal content-moderation staff cannot review more than a small fraction of posted content for compliance, so that scads of videos and user comments that are "egregious," "dangerous," and "graphic" by ByteDance's self-imposed standards never get flagged. Even when aware of prohibited content, ByteDance chooses to make that content harder to find but not to remove it as promised.

The State asserts a single claim, based on TikTok's design and marketing, for unfair or deceptive trade practices under N.C.G.S. § 75-1.1. It claims that ByteDance unfairly designed TikTok to be addictive to minors despite knowledge that compulsive use harms them. It also claims that ByteDance deceived the public by misrepresenting TikTok's safety features and Community Guidelines while falsely assuring that the app is safe for young users….

Here's the court's First Amendment analysis:

One [constitutionally] protected "aspect of speech" is "the editorial function"—that is, "exercising editorial discretion in the selection and presentation of content," including third-party content. Moody v. NetChoice, LLC (2024). Still, "the First Amendment does not prevent restrictions directed at commerce or conduct from imposing incidental burdens on speech." Sorrell v. IMS Health Inc. (2011) (contrasting "restrictions on protected expression" with "restrictions on economic activity or, more generally, on nonexpressive conduct")….

Few areas of the law are more cutting-edge than this one. It was just last year in Moody that the United States Supreme Court confirmed that "some [social-media] platforms, in at least some functions, are indeed engaged in expression" protected by the First Amendment. Specifically, a platform's content-moderation policy reflects its editorial judgments "about whether—and, if so, how—to convey posts having a certain content or viewpoint." See also id. (questioning state laws that "limit[ed] the platforms' capacity to engage in content moderation—to filter, prioritize, and label the varied messages, videos, and other content their users wish to post"). These "editorial judgments influencing the content" of social-media feeds are "protected expressive activity."

"But what if," as Justice Barrett asked in her concurrence, "a platform's algorithm just presents automatically to each user whatever the algorithm thinks the user will like—e.g., content similar to posts with which the user previously engaged?" Is that protected expressive activity as well? The Moody majority left that question unanswered: "We therefore do not deal here with feeds whose algorithms respond solely to how users act online—giving them the content they appear to want, without any regard to independent content standards."

This case invites Justice Barrett's question once again. The State bases its unfairness theory on features that induce compulsive use, including TikTok's algorithm. As alleged, the algorithm does not "understand" or "care about" content, nor does it "promote or suppress particular political agendas, views, or content." Rather, the algorithm presents videos based solely on an analysis of "the user's pattern of engagement." Put another way, "[t]he recommendation engine is content-neutral, meaning that it recommends content based on behavioral and certain device signals, not on the semantic nature of the content itself."

Taking these allegations as true, it's hard to discern any expressive activity. The algorithm does not convey a message by its programmer; it simply bows to user preferences and propensities. At a minimum, the complaint supports an inference that a reasonable person would understand TikTok's video feed to reflect a given user's content choices as opposed to ByteDance's own creative expression or editorial judgment. See, e.g., U.S. Telecom Ass'n v. FCC (D.C. Cir. 2016) ("As a result, when a subscriber uses her broadband service to access internet content of her own choosing, she does not understand the accessed content to reflect her broadband provider's editorial judgment or viewpoint."); see also NetChoice v. Bonta (N.D. Cal. 2024) ("[I]t would be hard to say that the algorithm reflects any message from its creator because it would recommend and amplify both favored and disfavored messages alike so long as doing so prompts users to spend longer on social media.").

So too for the other disputed design features, such as autoplay, infinite scrolling, and social rewards. ByteDance says little about them, offering no basis to conclude that they deserve First Amendment protection independent of the underlying algorithm. And on the face of the complaint, none of the features has an obviously expressive quality.

Of course, the complaint gives only a glimpse—and likely a contested glimpse at that—into the inner workings of TikTok's algorithm and ancillary features. A more developed record might reveal an expressive message that changes the calculus. But for now, the Court concludes that the First Amendment does not mandate dismissal of the State's unfairness theory. See Meta Platforms, 2024 Mass. Super. LEXIS 161, at *22–23 (deeming claims about features that "allegedly induce addiction in young users … to be principally based on conduct and product design, not expressive content"); see also Meta Platforms, 2024 D.C. Super. LEXIS 27 (same); Meta Platforms, 2024 Vt. Super. LEXIS 146, at *17–18 (same); Utah Div. of Consumer Prot. v. TikTok Inc., No. 230907634, at 14 (Utah Dist. Ct. Nov. 12, 2024) (same).

Likewise, the Court concludes that the First Amendment does not bar the State's deception theory. ByteDance misconstrues the theory as an attempt to regulate its content-moderation practices to suppress overly engaging content. That is not what is alleged. Rather, the State alleges that ByteDance deceived the public by making false and misleading statements about the functionality of its safety features and its adherence to its Community Guidelines. "Untruthful speech, commercial or otherwise, has never been protected for its own sake." …
