The social media companies YouTube, TikTok, Facebook, Snapchat, and Instagram lost a key battle this week in a nationwide lawsuit over their role in the youth mental health crisis.
The suit, brought on behalf of hundreds of parents and children, alleges that the platforms were designed to be addictive, have harmed young users as a result, and failed to adequately warn parents of the related risks. Those dangers include anxiety, depression, suicidality, body image issues, and eating disorders, according to Lieff Cabraser, the law firm representing the plaintiffs.
The defendants, which include the platforms’ parent companies Alphabet, Google, Meta, Snap, and ByteDance, argued in response that they are immunized by law from the plaintiffs’ claims, and asked the court to dismiss the litigation. They cited Section 230 of the 1996 Communications Decency Act, which has long protected internet companies that publish third-party content online from legal liability in many circumstances.
On Tuesday, a federal judge in California partly denied the companies’ motion to dismiss in a lengthy ruling, which means that critical aspects of the suit will move forward.
That includes claims that certain platform features, like imperfection-blurring filters and photos that have been edited but not labeled as such, are product defects for which the companies should be held accountable. The plaintiffs’ lawyers argued that such features expose young users to unrealistic body ideals and prompt them to compare themselves negatively to others.
Additionally, U.S. District Judge Yvonne Gonzalez Rogers found that the failure to implement protective limits on the duration and frequency of use, robust verification processes to determine a user’s age, and effective parental controls and notifications are also product design defects for which the companies could potentially be held responsible.
“Today’s decision is a significant victory for the families that have been harmed by the dangers of social media,” the plaintiffs’ lawyers said in a statement. “The mental health crisis among American youth is a direct result of these defendants’ intentional design of harmful product features.”
Rogers noted that the social media companies have a duty to their users when they create certain products that could be defective, and that they could be sued over negligence related to those defects.
However, Rogers dismissed some of the plaintiffs’ claims, including that the social media companies could be held liable for private messaging functions, the timing and clustering of notifications regarding other users’ content, and algorithmic recommendations that connect minors with adults. She ruled that Section 230 shields the companies from liability because these acts fall squarely within the realm of “publishing,” which the federal law largely protects.
Reuters reported that a spokesperson for Alphabet, the parent company of Google and YouTube, described the suit’s allegations as “simply not true,” adding that the company has always worked to protect children. A spokesperson for TikTok told Reuters that the platform had “robust safety policies and parental controls.”
In addition to this nationwide lawsuit, a group of 41 states and the District of Columbia sued Meta last month, alleging that the company has intentionally hooked young users on its platforms, which include Facebook, Instagram, and WhatsApp. School districts across the U.S. have also sued social media companies on similar grounds. In June, a Maryland school district sued the parent companies of Instagram, Facebook, TikTok, Snapchat, and YouTube over harmful product features that it says have fueled “a mental health crisis among America’s youth.”