Supreme Court Report

Supreme Court's internet inexpertise will be put to the test in social media content cases

During oral arguments before the U.S. Supreme Court last term—in one of two thorny cases about whether social media companies could be held liable for aiding and abetting terrorist groups—Justice Elena Kagan observed, “We’re a court. We really don’t know about these things.”

“You know,” she continued, looking across the bench at her colleagues and drawing laughter in the courtroom, “these are not, like, the nine greatest experts on the internet.”

This term, however, the justices are getting a crash course on some of the most salient regulatory and constitutional questions involving social media.

The court in October heard arguments in two cases that will help determine whether public officials may block bothersome constituents from their personal social media accounts. (Decisions in those cases are pending.) In March, meanwhile, the court will take up a case involving allegations that the federal government coerced some social media outlets to stifle some users who posted misinformation about COVID-19 or the 2020 presidential election.

Next week, the court will consider perhaps the highest-stakes battle for web and social media companies, at least for this term. A pair of cases, NetChoice v. Paxton and Moody v. NetChoice, ask whether states may dictate content-moderation standards or require individualized explanations when social media outlets remove or alter users’ posts.

Some legal experts are a bit worried about how well the nine non-experts on the internet will handle the latest social media battle to confront them.

“I think there is no way for them to ‘get it’ on the set of issues raised in this case, because the issues are so novel that even the most specialized experts are still evolving their thinking,” says Daphne Keller, a lecturer at Stanford Law School and the director of the Program on Platform Regulation at the Stanford Cyber Policy Center.

Motivated by perception of hostility to conservative views

The cases stem from laws passed in 2021 by the Florida and Texas legislatures aimed at regulating large social media platforms such as Facebook, YouTube, and X (formerly known as Twitter). The laws differ in some details, but both contain two key provisions: a content-moderation provision restricting how the companies may present user-generated content on their platforms, and a provision requiring individualized explanations for certain content-moderation decisions.

Opponents of the laws argue that state lawmakers did not hide that the measures were motivated by a goal of promoting conservative speech and combating perceived censorship by tech companies.

“It is now law that conservative viewpoints in Texas cannot be banned on social media,” Texas Gov. Greg Abbott, a Republican, said when signing House Bill 20 into law.

The Texas measure applies only to platforms with more than 50 million active U.S. users per month. One key provision at issue bars platforms from censoring users based on viewpoint, though it allows them to prohibit categories of content such as violence or pornography. The explanation provision requires the platforms to contact users whose content was removed and explain why.

In Florida, Republican Gov. Ron DeSantis said when signing Senate Bill 7072 that the measure would protect his state’s residents from “Silicon Valley elites” and “Big Tech oligarchs.”

The Sunshine State’s law applies to internet platforms with more than $100 million in annual gross revenues or at least 100 million monthly users. It bars censoring (restricting or altering a user’s posts), shadow banning (limiting the visibility of a user’s content to other users) and deplatforming (banning a user or deleting their posts for more than 14 days). The law also has language specific to deplatforming candidates for public office.

Both laws were challenged by two internet industry groups: the 52-year-old Computer & Communications Industry Association and the 23-year-old NetChoice, both based in Washington, D.C. Each counts Facebook and Instagram parent Meta, YouTube owner Google, and X among its members.

“Our case is about whether the government can control or dictate the rules of the road for online speech,” says Chris Marchese, the litigation center director for NetChoice. “The government cannot compel private parties to publish speech they don’t want to publish. If the Supreme Court were to rule against us, it would incentivize every single government actor in this country to try to control the internet. It is also going to endanger the First Amendment rights of everyone.”

In a pre-enforcement challenge to the Florida law, the Atlanta-based 11th U.S. Circuit Court of Appeals held that content moderation is speech under the First Amendment and that the state’s restrictions were unlikely to survive intermediate scrutiny, let alone strict scrutiny. The court also ruled that the individual-explanation requirement would chill the social media platforms’ exercise of their editorial judgment.

In the Texas case, the New Orleans-based 5th U.S. Circuit Court of Appeals held that content-moderation activities were not speech but “censorship” that states may regulate. One judge on the three-judge panel, writing only for himself, suggested that the content moderation regulations were akin to “common carrier” rules imposed on railroads, telephone companies and, more recently, internet service providers.

The losing sides in both cases appealed to the Supreme Court, which agreed to examine both the Florida and Texas laws.

The social media platforms “express no message in the vast and disparate mass of user-provided content they host,” Florida Attorney General Ashley Moody argues in a brief. “Unlike a newspaper or a bookstore, the platforms are, to say the least, unselective in the people and content they allow on their sites: Virtually anyone may sign up and post almost any content.”

The Florida law “does little more than require the platforms to adhere to their general business practice of holding themselves open to all comers and content, which is how common-carrier regulation has functioned for centuries,” Moody wrote.

Texas Attorney General Ken Paxton argues in a brief that his state law’s content-moderation provision “just enables voluntary communication on the world’s largest telecommunications platforms between speakers who want to speak and listeners who want to listen, treating the platforms like telegraph or telephone companies.”

Adam Candeub, a law professor at Michigan State University and director of its Intellectual Property, Information & Communications Law program, says that the old Ma Bell telephone monopoly (as well as its modern successors) could not refuse to connect calls based on their subject matter or a disfavored source, and Western Union could not curate telegrams transmitted between its offices.

“The question is whether the social media platforms are like the telephone companies, the telegraph companies, the post office, or even cable TV systems in the sense they may have to carry messages they don’t like, or are they artistic or literary creations?” says Candeub, who co-wrote an amicus brief in support of the states. “If it is the former, they can be regulated like common carriers.”

Fitting 21st-century technologies into old wires and tracks

The social media industry groups dispute that they are common carriers.

“Websites like Facebook and YouTube are not common carriers, and governments cannot compel private parties that have exercised editorial discretion in ways the government disfavors to become common carriers of third-party speech,” they say in their merits brief in the Texas case.

Marchese of NetChoice says that the common law has historically imposed a duty on common carriers and places of traditional public accommodations such as innkeepers, ferries, stagecoaches, and railroads to serve the public without discrimination. But there is no comparable common law tradition of imposing common-carrier-like regulations on private parties that disseminate curated collections of speech, he says.

To be sure, there are other issues in the cases, and there are dozens of amicus briefs representing diverse views on all sides. Reddit, a platform offering hundreds of thousands of distinct subject discussions, filed a brief in support of the industry groups expressing worry about how state laws would affect its volunteer moderators.

In fact, Reddit was one of the first companies to be sued under the Texas law (before it was enjoined) by a disgruntled participant who was ejected from a subreddit devoted to Star Trek for violating the platform’s simple rule to “be nice.”

“The states’ laws would continually force Reddit into court to defend the idiosyncratic, subreddit-level rules created by individual users to organize Reddit’s multitude of communities,” the company says in its brief.

Florida and Texas, meanwhile, have drawn support from several sources, including former President Donald J. Trump, who has sued Twitter, Meta, and YouTube over removals or limitations imposed on him.

Keller, the Stanford Law lecturer, helped write an amicus brief in support of the industry groups. Beyond her concerns about the justices’ understanding of the world of social media, she worries that the Florida and Texas laws were designed through a “shoddy legislative process” without any “real deliberative process.”

“And we have had nothing like the kind of judicial tire-kicking in lower courts that usually ripens new questions for Supreme Court review,” she says. “So this is particularly dangerous territory for the court … to issue a ruling that states use as the blueprint for the next round of legislation.”

But, she adds, “the odds seem good that they will issue a consequential ruling anyway.”

See also:

“Chemerinsky: Supreme Court will hear some of its biggest cases of the term this month”

“Supreme Court to consider laws that block social media from removing certain content and users”
