US jury verdicts against Meta, Google tee up fight over tech liability shield

Published by Global Banking & Finance Review

Posted on March 26, 2026

5 min read

Last updated: April 1, 2026


Jury Verdicts Challenge Tech Companies' Legal Protections

By Diana Novak Jones

March 26 (Reuters) - Jurors in the first two trials in the U.S. from a growing wave of lawsuits targeting social media firms over harm to children have found Meta and Alphabet's Google liable, potentially teeing up an appeals fight that could reshape how U.S. law shields tech companies from lawsuits.

Recent Jury Decisions and Their Impact

In California, a Los Angeles jury on Wednesday found Meta and Google liable for a young woman’s depression and suicidal thoughts after she said she became addicted to Instagram and YouTube at a young age, ordering them to pay a combined $6 million in damages. In a separate New Mexico case, jurors on Tuesday ordered Meta to pay $375 million after finding the company misled users about the safety of its products for young users and enabled the sexual exploitation of children on its platforms.

Section 230 and Legal Strategies

The verdicts pierce a legal shield that plaintiffs suing tech companies have long struggled to overcome: Section 230 of the Communications Decency Act, a 1996 federal law that generally protects online platforms from liability over user-generated content. In both cases, the plaintiffs sidestepped that hurdle by arguing the companies harmed young users through decisions they made about the platforms' design rather than the content itself.

“Courts are increasingly trying to distinguish claims about platform functionality or platform conduct from claims that would really just impose liability for third-party speech,” said Gregory Dickinson, an assistant professor at the University of Nebraska College of Law who studies the intersection of tech and the law.

Meta and Google have denied the claims, arguing they have taken steps to protect young people.

Meta and Google’s Defense Tactics

In both cases, Meta urged the judge to dismiss the lawsuit, as did Google in the Los Angeles case, claiming they were shielded from liability by Section 230. The judges rejected the argument, saying the cases could move to trial.

“We respectfully disagree with the verdicts and will appeal,” a Meta spokesperson said in a statement. “We remain committed to building safe, supportive environments for young people and will defend our record vigorously.”

Google has said it plans to appeal in the Los Angeles case, but did not respond to a request for comment.

Those appeals are almost certain to center on Section 230 – and they could have broad implications.

Broader Legal Landscape

Meta, Google, Snapchat parent Snap Inc, and TikTok parent ByteDance are facing thousands of lawsuits in both state and federal court over claims their design choices have led to a mental health crisis for teens and young people. More than 2,400 cases have been centralized before a single judge in California federal court, while thousands of cases are consolidated in California state court. 

Legal experts say courts have been moving toward a narrower view of Section 230’s liability shield. Several lower courts have ruled that companies’ platform design choices are not protected by the law, but no appellate court has weighed in, and only appellate rulings, not trial-court decisions, bind other courts.

Implications for the Future of Tech Liability

An appellate ruling on Section 230 could have implications beyond social media, legal experts say, shaping lawsuits against other online platforms that host content used by children. More than 130 lawsuits are pending in federal court against Roblox Corporation, for example, accusing the popular gaming site of failing to protect users from sexual exploitation. Roblox denies the claims.

Potential Supreme Court Involvement

“I think the internet is on trial, not social media,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law. “If the theories work, they will be deployed elsewhere.”

Appeals in both cases would be heard first by state appellate courts, but could reach higher courts after that.

The U.S. Supreme Court has shown a willingness to potentially decide the scope of Section 230. In 2023, the court heard a challenge involving Google's video-sharing platform YouTube, but ultimately sidestepped a ruling on the legal protections for internet companies.

In 2024, the high court declined to hear a Texas teen's bid to revive his lawsuit accusing Snapchat owner Snap of failing to protect underage users of its social media platform from sexual predators. Two conservative justices - Clarence Thomas and Neil Gorsuch - dissented from that decision, however, warning of further delays in addressing the issue. "Social-media platforms have increasingly used (Section) 230 as a get-out-of-jail free card," they wrote in a dissent.

Expert Opinions on Legal Developments

Meetali Jain, director of the Tech Justice Law Project, which brings litigation against tech companies, said she thinks the U.S. Supreme Court may now be open to weighing in on the scope of Section 230.

“I personally think that the Supreme Court is even ready for a case like this, for the right case,” Jain said. 

(Reporting by Diana Novak Jones in Chicago, additional reporting by Andrew Chung in New York, Editing by Alexia Garamfalvi and Rod Nickel)

Key Takeaways

  • A Los Angeles jury found Meta (Instagram) and Google (YouTube) liable for a young woman's addiction-linked depression and suicidal thoughts, awarding $6 million in damages based on the platforms' design rather than user content.
  • A New Mexico jury ordered Meta to pay $375 million after finding it misled users about its products' safety for young users and enabled the sexual exploitation of children, again treating the harm as design-based rather than speech-based.
  • Both verdicts narrow the practical reach of Section 230 immunity: claims aimed at platform functionality, rather than user-generated content, may now open a path to liability, with broad implications for future tech litigation.

Frequently Asked Questions

What recent verdicts have been issued against Meta and Google?
Juries found Meta and Google liable for harm to young users, awarding damages in cases involving depression, suicidal thoughts, and misleading safety claims.
What is Section 230 and why is it important in these lawsuits?
Section 230 of the Communications Decency Act generally protects online platforms from liability over user-generated content. Recent cases challenge its scope by targeting platform design decisions.
How did plaintiffs sidestep Section 230 protections in these cases?
Plaintiffs argued the harm came from platform design choices, not user content, which courts allowed to proceed to trial.
What broader implications could the appellate rulings have?
Appellate rulings could reshape the legal landscape for tech companies and have implications for lawsuits against various online platforms beyond social media.
How many lawsuits are currently pending against social media companies over similar claims?
More than 2,400 cases are centralized in California federal court, with thousands more consolidated in state court, over claims that platform design choices harm young users.
