The National Pulse

School districts sue social media platforms, saying they're harming youths' mental health

Photo illustration by Sara Wadford/ABA Journal

Selena Rodriguez was still in elementary school when she began using the social media platforms Instagram and Snapchat.

It didn’t take long for her to develop an “extreme” addiction, according to the Social Media Victims Law Center, which represents her mother, Tammy Rodriguez. Selena later developed sleep deprivation, low self-esteem, eating disorders and depression. Ultimately, at age 11, Selena died by suicide. Her mother thinks social media was to blame. She asserted these grievances in a wrongful death lawsuit filed by the Social Media Victims Law Center in a San Francisco federal court in January 2022 against Snap Inc., which operates Snapchat; and Meta Platforms Inc., which operates Instagram and Facebook.

The suit was soon followed by more than 100 others against Facebook, Instagram, Snapchat, TikTok, YouTube and their parent companies, alleging they caused a range of mental health issues. The suits accuse the Big Tech giants of intentionally creating addictive products to hook adolescent audiences.

Now, school districts across the country are joining the fray, including districts in Alabama, Arizona, California, Florida, New Jersey, Oregon, Pennsylvania and Washington. They argue they have been forced to hire additional counselors, develop resources and train staff to handle the burgeoning number of students succumbing to what they describe as a youth mental health crisis. The school districts are seeking restitution for expenses incurred, a fund to provide ongoing student support and changes to the platforms to try to make them less addictive.

According to a complaint by a Seattle school district against the platforms, the defendants design algorithms to “maximize revenue” and “have intentionally designed and operated their platforms to maximize users’ screen time.” The complaint says they “exploit human psychology using complex algorithms driven by advanced artificial intelligence and machine-learning systems.”

The cases (the Social Media Victims Law Center is counsel in some of these) have been consolidated into multidistrict litigation in the Northern District of California.

Social media immunity?

The defendants maintain in an initial case management statement that they are protected from liability by Section 230 of the Communications Decency Act, which provides immunity to online services that publish content created by others.

They also assert First Amendment protection for the way they present or control third-party content on their apps and say their platforms do not constitute products, in light of the plaintiffs’ defective product liability claims.

The Section 230 argument looms largest, as the scope of the law is under review by the U.S. Supreme Court, which heard oral arguments in Gonzalez v. Google in February. In that case, Reynaldo Gonzalez, whose daughter was killed in Islamic State group attacks in Paris in 2015, alleges Google provided video recommendations that helped the group’s recruitment efforts. Google claims Section 230 protection.

The court appeared reluctant to take on defining the limitations of the law, with both Justice Elena Kagan and Justice Brett Kavanaugh suggesting Congress was better equipped for the task. Justice Amy Coney Barrett mentioned that the case could possibly be resolved on other grounds.

In their motion to dismiss the individual plaintiffs’ claims, the defendants indicate they will brief their First Amendment and Section 230 “threshold defenses” after the court’s ruling in Gonzalez. The court had not ruled in the case at press time. [Editor’s note: The Supreme Court ruled in Twitter v. Taamneh on May 18 that tech companies were not liable for allowing the Islamic State group to use their platforms in their terrorism efforts. The companion case, Gonzalez v. Google, was vacated and remanded to the 9th U.S. Circuit Court of Appeals.]

The attorney for several school districts, Cari Campen Laufenberg of Keller Rohrback in Seattle, says, “We believe we have claims that are not impacted by Section 230, but we understand that this … will likely be a litigated issue.”

Plaintiffs’ steering committee member Jayne Conroy of Simmons Hanly Conroy in New York City dismisses Section 230’s application to the platforms. “They are not simply a host for information. Rather, they designed defective products that were intentionally addictive.”

If the plaintiffs overcome any Section 230 hurdle, the focus in the school district cases shifts to their central basis of liability: public nuisance.

Correlation or causation?

University of Virginia law professor Leslie Kendrick, a leading expert in public nuisance law, says the notoriously nebulous cause of action requires the schools “to show that the [alleged] problems are the kind that public nuisance was meant to address and that there is a causal link between the specific conduct of the defendants and the problems they are seeing.”

Or, as complex litigation and tort law professor Alexandra Lahav of Cornell Law School says: “Is the fact that there’s a correlation between the increased use of social media platforms and the incidence of suicidal ideation, depression and other problems a causal relation, or is it just two things happening at the same time?”

Seattle attorney Matthew Bergman, who founded the Social Media Victims Law Center in 2021, argues that the causal connection is undeniable. “The spike in mental health harm in young people coincides with the advent of social media to the letter,” he says. Bergman cites the work of San Diego State University professor of psychology Jean Twenge, whose research focuses on generational differences, and social psychologist and New York University Stern School of Business professor Jonathan Haidt for showing that social media began impacting youth around 2012.

Bergman, who serves on the social media litigation plaintiffs’ steering committee, says the youth mental health crisis does not seem to correlate with other societal upheavals, such as the 2008 financial collapse. In that example, data indicates that the U.S. economy steadily improved from 2009 to 2019, while adolescent depression rates skyrocketed.

He and other plaintiffs’ attorneys suggest these cases are not unlike mass tort actions that have preceded them—in particular, vaping and opioid litigation—which also raised youth addiction and public nuisance issues.

Conroy, who has served on numerous major plaintiffs’ steering committees, including in opioid litigation, likens social media’s effects to the pleasurable dopamine release from opioids. Writing in February 2023 for the Perry Weitz Mass Tort Institute at the Maurice A. Deane School of Law at Hofstra University, she asserts that in the social media setting, “It is not just drugs—it is the images on the screen, the endless stream of content ‘chosen’ for us, the feeling we get when someone ‘likes’ one of our posts” that produces a dopamine high.

Because children’s and adolescents’ brains aren’t fully developed, Conroy and other plaintiffs’ counsel maintain that they are particularly susceptible to addictive design and vulnerable to harm from “algorithmically selected” content.

But Kendrick sees distinctions between the social media claims and the vaping and opioid litigation. The addictive nature of social media is much more contested than that of opioids or nicotine, she says. Also, the causal link is likely “much more attenuated when you’re trying to single out specific forms of communication that minors were exposed to and how that ultimately related to their mental health status.”

Lahav says a central question will be: “What does the in-house conversation look like?” She believes that “a lot is going to depend on what evidence of misconduct there is with the app makers. Are there memos, reports or studies indicating that [they] recognized that there’s a population that’s susceptible, and [they’re] going to target them?”

Lahav, who is among the amici in opioid litigation pending in the Richmond, Virginia-based 4th U.S. Circuit Court of Appeals, notes that with Juul, the defendant manufacturer of vaping products, “There was all this evidence that they were actually targeting schoolchildren.”

The more than 5,000 Juul cases, also consolidated into a single proceeding that included school districts, settled in December 2022 for an undisclosed amount that the Wall Street Journal reported to be valued at $1.7 billion.

In the social media litigation, defendants point to their efforts to minimize risk of harm to youths.

“We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their well-being,” José Castañeda, spokesperson for Google—which owns YouTube—said in a statement. “For example, through Family Link, we provide parents with the ability to set reminders, limit screen time and block specific types of content on supervised devices.”

TikTok representative Jessica Allen declined to comment on the litigation, but in an email to the ABA Journal, she listed ways that “TikTok prioritizes the safety and well-being of teens, including age-restricted features, with limits on direct messaging and livestreams, and private accounts by default for younger teens; screen time management tools, break reminders; restricted nighttime notifications for teens; … parental controls … and access to a range of expert support resources.”

Meta features Antigone Davis, global head of safety, on its website noting the company’s first Summit on Youth Safety and Well-Being in December 2022 and its “tools to support teens and families on our apps.” Meta did not respond to the Journal’s request for comment.

No deadline to respond to the school districts’ claims had been scheduled by the court at press time.

This story was originally published in the June-July 2023 issue of the ABA Journal under the headline: “Social Detox: School districts sue social media platforms, saying they’re harming youths’ mental health.”
