If juries are regulating Big Tech, where is Congress?

Published May 13, 2026, 10:00 a.m. ET



A New Mexico jury recently ordered Meta to pay $375 million for exposing children to online sexual predators. One day later, a California jury ordered Google and Meta to pay millions to a woman who said YouTube and Instagram caused her depression and body image distress as a child. 

Across the country, ordinary citizens on juries are essentially stepping in for Congress, which has thus far failed to hold Big Tech companies accountable for deliberately addicting children, connecting them with predators, and pushing content that promotes self-harm, eating disorders, and sexual exploitation. In case after case, plaintiffs have presented evidence that companies knew their platforms could harm young users, but chose not to warn families or redesign their products. Instead, they tried to hide or downplay the risks.

Internal company research has confirmed as much: Meta’s own studies found that Instagram worsened body image problems for roughly 1 in 3 teenage girls, even as executives publicly downplayed the risks.

For the families involved, these verdicts may feel like justice has finally arrived. In some sense, it has. 

But the verdicts are also a damning indictment of a system that waits for predictable harm to befall children before anyone is held accountable. With Congress showing little appetite to regulate Big Tech, consumer protection has defaulted to litigation. But juries can only step in after harm has occurred — deciding, case by case, how companies should have protected children in the first place. And when the victims are children, the cost is measured in damage no verdict can undo.

The recent wave of cases underscores that courts are beginning to recognize what families have long understood: platforms such as Instagram or X are not neutral conduits of speech. They are products — and like any product, their design can be safe or dangerous. 

At the center of this failure is a law written when the internet meant message boards and static web pages. Section 230 of the Communications Decency Act made sense in 1996. It allowed platforms to host user content without being treated as the publisher of every post — a reasonable protection that helped the young internet grow.

But the internet of 1996 is not the internet of today.

Today’s platforms run on sophisticated algorithms that determine, at massive scale, what billions of people see. They are built to maximize engagement, often by keeping children and teenagers continually scrolling. American teenagers now spend an average of about five hours per day on social media, according to a new Gallup poll, with heavier use consistently associated with worse mental health outcomes.

For years, tech companies have argued that because users create and upload content, platforms cannot be responsible for any harm caused by it. Courts are beginning to reject that argument by focusing on the design of the platforms themselves: algorithms, infinite scroll features, engagement loops, and the lack of safety features companies chose not to include.

And the risks are accelerating. Artificial intelligence now allows predators to generate sexualized images of children in seconds and distribute them at scale — yet there is still no clear federal standard requiring platforms to detect or prevent this abuse.

Congress must reclaim its role as a consumer protector. Reports of online child sexual exploitation have surged in recent years, with the National Center for Missing and Exploited Children receiving more than 20 million reports in 2025 alone. If technology companies engineer their platforms to addict, to connect predators with children, or to amplify exploitative content, that is a product failure, and lawmakers must hold those companies accountable. 

At minimum, lawmakers should make clear that Section 230 does not shield companies from liability when their own design choices — algorithms, recommendation systems, or product features — cause foreseeable harm, especially to children. 

Lawmakers have regulated dangerous products before. They required warning labels on cigarettes. They set safety standards for cars and toys. They acted because the risks were clear and the consequences were too severe to ignore.

Children deserve no less in the digital age.

The question before Congress is no longer whether children are being harmed. Juries across the country are already answering that question. The real question is whether lawmakers will act now to prevent the next harm — or continue leaving it to the courts.

For children growing up online right now, waiting is a choice. And it’s the wrong one.

Teresa Huizar is CEO of Washington, D.C.-based National Children’s Alliance, the nation’s network of nearly 1,000 Children’s Advocacy Centers, providing justice and healing through services to child victims of abuse and their families.