by Deborah Pearlstein and John Dellamore
May 4, 2023
constitutional law, Facebook, First Amendment, Instagram, Meta, Social Media Platforms, Technology, TikTok, Twitter
Last month, Montana became the first U.S. state to pass a bill banning TikTok from operating within its borders. If Governor Greg Gianforte signs some version of the bill, it will become the first statewide ban in the country to take direct aim at the popular social media app, which various U.S. government officials have warned poses a serious national security threat. But while Montana may be the first to act, significant gaps remain in the public debate surrounding both the nature of the threat that TikTok presents, and the constitutional questions that trying to regulate it might create.
On the security side, state and federal lawmakers have spoken with great concern about the seriousness of the threat posed by the Chinese-owned app. But they have remained unfortunately general in describing the exact nature of their concerns – officials often cite one or more worries about the proliferation of disinformation, the compromise of personal data, or threats to U.S. national security more broadly – and have done little to clarify how exactly TikTok’s dangers distinguish it from any number of other online platforms and data collection operations that channel disinformation and vast quantities of similar data to the open market, where it is accessible to Beijing or anyone else.
On the speech side, civil liberties advocates and scholars have, in contrast, been remarkably clear: “banning TikTok” would violate the First Amendment (see here, here, and here). Yet given the variety of forms that legislative or executive action against the platform might eventually take, the constitutional question is quite a bit more complicated than it initially appears or is currently acknowledged. The outcome of any court challenges to a ban will depend in key part on the factual strength of the government’s claims. As it stands, the failure of current public debate to engage the complex reality serves neither the interests of national security nor freedom of expression. This piece explains why.
Start with the First Amendment. On the text of the Amendment itself, it is easy to imagine that it categorically bans any government rule that even vaguely burdens “the freedom of speech.” But that is not how the First Amendment works. The speech universe has long been divided into “protected” and “unprotected” or “less protected” forms of speech. The Supreme Court has said that the government only needs to provide a modest justification when it regulates less protected forms of speech (such as defamation, incitement of violence, and commercial fraud). Even within the realm of protected speech, regulations that aim merely at the time, place, or manner of speaking – rather than the speech’s content – have regularly passed First Amendment muster, so long as the government can show that its interest in regulating the speech is significant, that the regulation is no more restrictive than necessary, and that a potential speaker has ample opportunity to convey the same message at some other time or place. In short, the availability of First Amendment protection may greatly depend on whose speech would be regulated, why, and how.
To understand how the First Amendment applies to TikTok, it might help to consider three theories about whose free speech rights are implicated: (1) TikTok’s rights as a platform; (2) the rights of TikTok’s U.S. users to speak on a major social media platform; and (3) the rights of TikTok’s U.S. users to access content available on the platform.
As the Supreme Court has made clear in many contexts, corporations have speech rights just like (or almost like) individuals. It’s easy to imagine TikTok’s lawyers arguing that their client has the same right to speak in the United States as the New York Times, Verizon, or any other U.S. company. But two important issues make this claim problematic. First is that foreign individuals or corporations outside the United States may not have any cognizable rights under the First Amendment. It is in part for that reason that the Court has generally treated cases involving the restriction of foreign speech from outside the United States as implicating the First Amendment rights of U.S. listeners, rather than foreign speakers (more on that below).
Foreign nationals inside the United States of course have all kinds of constitutional rights, and one might argue that to the extent TikTok is operating within the territorial United States, it should enjoy the same First Amendment protections that any other U.S. publisher or speaker enjoys. But that brings us to the second issue: whether social media apps are indeed speakers or publishers at all. That debate is at the heart of the giant, looming question that overhangs all social media regulation in the United States at the moment – namely, what is the status of social media companies for First Amendment purposes? The Supreme Court hasn’t weighed in yet, but if it treats social media companies as publishers or speakers, they may well enjoy roughly the same First Amendment rights available to other corporate speakers in the United States. On the other hand, if the Court considers them more like, say, common carriers or conduits – an equally live possibility – then the government would have much more room to regulate. Or, the Court could conclude, social media companies are publishers for some purposes, and conduits for others. But this simply re-poses the question: what is TikTok’s status here?
The doctrinal hurdles to TikTok itself asserting a First Amendment claim make it far more tempting, then, to revert to Option 2: TikTok’s U.S. users’ right to speak on a major public social media platform (of which TikTok, with more than 150 million U.S. users, is surely one). Here, the theory is far more intuitively attractive: TikTok (like Twitter was or Instagram perhaps remains) is the modern-day equivalent of the town square, a traditional public forum in First Amendment terms in which any government restriction on the content of speech must be justified by a compelling government interest, and must be narrowly tailored to be the least speech-restrictive means possible to achieve that compelling interest. It was some version of this town square idea that a federal district court in California embraced when it stopped the Trump administration from banning WeChat, a Chinese-owned instant messaging and social media app widely used by the Chinese-speaking and Chinese American community in the United States. But the district court’s opinion in WeChat offered little discussion, and few citations, to support its vague public-forum analogy. The court instead rested its willingness to temporarily suspend the effect of the proposed ban almost entirely on its factual finding that WeChat was “effectively the only means of communication” with friends and relatives in China among the community that used it – not only “because China bans other apps,” but also because, as the court repeatedly emphasized, “Chinese speakers with limited English proficiency” have no other options. The situation for TikTok’s U.S. users – including millions of teens who are equally active and adept at using closely analogous and widely popular U.S.-owned platforms like Instagram – hardly seems the same.
In any case, TikTok and platforms like it are not in any traditional sense a “public” forum, in that they are not owned or maintained by the government; they are privately owned services that other courts may yet conclude are not any kind of “forum” at all, but are rather, again, private speakers or publishers, common carriers, or some combination of the two – the answer to which has potentially conclusive implications for whether U.S. users have any “right” to speak on them at all. We don’t mean for a moment to suggest that U.S. users’ speech rights are not implicated here. A law banning TikTok could indeed limit U.S. users’ speech on something that functions much like a public forum. Montana’s law is especially vulnerable under almost any First Amendment analysis, aimed as it is at a single platform and expressly identifying the content on the platform it finds objectionable, rather than focusing solely on the manner in which it collects and secures data (more on that below). But where a lawmaker could design a ban not aimed squarely at the site’s content, and where ample alternative channels for the same speech on other platforms remain, existing doctrine offers no guarantee the First Amendment would be offended.
A third theory goes as follows: TikTok’s U.S. users’ First Amendment rights are separately burdened when they are deprived of the ability to access the content otherwise available on the platform. Indeed, because one of the core purposes of the First Amendment has long been thought to protect a diverse “marketplace of ideas” sufficient to sustain democratic governance, the Supreme Court has repeatedly recognized the First Amendment right of listeners to access information in that marketplace. More than half a century ago, the Court struck down a federal law barring the mail delivery of “communist propaganda” from abroad unless the intended recipient specifically asked the postal service to deliver it. The Court held that it was wrong for the government to interfere with the mail and attempt “to control the flow of ideas to the public.” This right of access was also part of the Court’s rationale in a more recent decision striking down a sweeping North Carolina law that barred convicted sex offenders from “accessing” any “commercial social networking website” with child members. In that case, the Court wrote at length about the indisputable importance of the government’s interest in passing the law in the first place, mainly to protect children from sex offenders. But a ban of “such staggering reach” – involving a law that could be read to block access to everything from Facebook to WebMD to Amazon – was not remotely tailored enough to survive any degree of constitutional scrutiny from the Court.
The Court reached the right answer in both of those right-of-access cases. But it’s easy to imagine a “TikTok ban” written much more carefully than Montana’s initial version, with far more targeted scope. For instance, imagine a time, place, and manner regulation that effectively precludes U.S. users from accessing foreign-owned platforms on U.S.-based mobile devices for so long as those platforms offer insufficient safeguards to prevent, for example, the geolocation and facial recognition data of U.S. users from being shared with foreign adversaries. In this form, a “ban” starts to look far less problematic than longstanding regulations on foreign speech that still operate in the United States. For better or worse, the First Amendment rights of U.S. listeners have never before posed an obstacle to federal restrictions on foreign involvement (through speech or otherwise) in federal elections, or even federal regulations surrounding the distribution of “foreign political propaganda.” As it stands, U.S. copyright law already precludes a foreign broadcaster from directing copyright-infringing performances into the United States from abroad. The Court has never suggested that such restrictions run afoul of U.S. listeners’ right to access that information. If the government were actually able to demonstrate a genuine threat to U.S. national security, and if the content available on TikTok could also be accessed elsewhere, it is not obvious how a court would reconcile the speech and security interests on all sides.
All of this uncertainty highlights why it is important for the public conversation to engage more deeply with the questions of both security and speech around a TikTok ban. It is entirely right to assume in the first instance that an outright government ban of a major social media platform violates the First Amendment. There is no question that any proposed government restriction on the operation of a social media platform available in the United States raises First Amendment concerns, especially when a ban targets a single platform by name. But the issue, should it eventually make its way to court – like so many of the current questions in the online speech space – will inevitably present a novel question of constitutional law. Suggesting otherwise makes it more likely that speech advocates will be unprepared for the serious litigation battle they are sure to face if and when any TikTok-specific (or, even more likely, non-TikTok-specific) regulation is enacted. The illusion that the resolution of an issue is settled or entirely certain likewise tends to relieve scholars of the burden of helping judges to think through the genuine range of interests and issues at play. (“If the question is easy, why write an amicus brief or article about it?”)
That over-simplification also disserves security policymakers, who need to understand the full landscape of arguments and litigation risks that potential legislation or administrative action is likely to face. Blanket statements that a TikTok ban would violate the First Amendment suggest to policymakers that there is little point to setting forth in detail – for litigation or even in their own minds – the specific, evidence-backed reasons why the government’s interest is compelling, or how that interest might most narrowly be achieved. (If the case is a sure loser, they may assume, why even go to the trouble?)
Perhaps most important, the expectation that the courts will step in to correct any constitutional failings may make lawmakers and policymakers more likely to take action they fear is constitutionally defective. Legislators get to “take the political win,” show constituents they are acting to address a problem, and then wait for the courts to sort it out. As one Senator put it in a different context, “the Court will clean it up” later. But depending on the actual terms of any federal action, it is simply unclear that the courts will.
The current information ecosystem is new, the global threats are complicated, and the facts on the ground are changing quickly. There is reason to worry about the First Amendment implications of many of the proposed online speech regulations circulating these days. But those are not the only worry. We also worry that the confidence that often comes with deep expertise – whether in legal training, security experience, or technical know-how – often promotes a false sense of certainty. That confidence may prove more of a hindrance than a help when trying to solve problems that require collaboration across disciplines.
The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the U.S. Government.
Deborah Pearlstein is Professor and Co-Director of the Floersheimer Center for Constitutional Democracy at Cardozo Law School. Follow her on Twitter (@DebPearlstein).
John Dellamore is a student at Cardozo School of Law.
Just Security is based at the Reiss Center on Law and Security at New York University School of Law.