A New Mexico jury found Meta liable for misleading users about child safety, ordering $375 million in civil penalties — the first successful state lawsuit of its kind against the company. A second trial phase in May could force structural changes to Meta’s platforms.
SANTA FE, NEW MEXICO — A civil jury in New Mexico ordered Meta Platforms to pay $375 million in penalties on Tuesday, March 24, after finding the company violated the state’s Unfair Practices Act by misleading the public about the safety of its apps for children, according to the New Mexico Department of Justice. The fine was calculated across thousands of individual violations, each carrying a maximum penalty of $5,000.
The verdict, delivered after a nearly seven-week trial, makes New Mexico the first US state to successfully take Meta to trial over child safety. But the more consequential threat to Meta may not be the fine itself — it is what comes next.
On May 4, a judge — without a jury — will hold a second proceeding to determine whether Meta created a public nuisance and whether the company must fund remediation programs or, more critically, redesign features of Facebook and Instagram. New Mexico Attorney General Raúl Torrez said he will push for court-mandated reforms including stronger age verification, removal of predatory accounts, and changes to encrypted messaging protections that prosecutors say currently shield bad actors.
The $375 million is a fraction of Meta’s $201 billion in revenue in 2025 — a detail that several legal observers noted after the verdict. The real stakes of this case are the platform changes that could be ordered in May, which would affect the hundreds of millions of minors currently using Meta’s apps globally.
Algorithm design under direct scrutiny
The trial placed Meta’s internal engineering decisions at the center of the case. State prosecutors argued that features like infinite scroll and auto-play video were not neutral design choices but deliberate tools to maximize engagement — including among children — while internal research showed the company understood the psychological harm being done.
Prosecutors cited an internal Meta study showing that 16% of Instagram users reported exposure to unwanted nudity or sexual content within a single week. They also presented internal communications linked to CEO Mark Zuckerberg’s 2019 decision to make Facebook end-to-end encrypted — a move that, according to prosecutors, blocked the ability to report approximately 7.5 million instances of child sexual abuse material to law enforcement annually.
Former Meta employee and whistleblower Arturo Béjar testified that experiments on Instagram showed underage users were repeatedly served sexualized content. He told the court that his own daughter had received inappropriate propositions from a stranger while using the platform.
Meta said it “disagrees with the verdict” and will appeal. A company spokesperson cited ongoing safety investments including the rollout of Teen Accounts in 2024 and parental alerts for self-harm searches. During closing arguments, Meta attorney Kevin Huff argued the platforms are “designed to connect friends and family — not predators.”
The jury rejected that framing entirely.
A second trial, broader consequences
What most early coverage missed is that the $375 million verdict is only Phase One. Prosecutors explicitly told the court they will seek additional financial penalties from Judge Bryan Biedscheid during the bench trial starting May 4. They are also pressing for operational changes to how Meta handles minors — reforms that, if ordered, could set a binding legal precedent affecting how social media platforms nationwide are permitted to operate.
A separate federal trial in Los Angeles is simultaneously evaluating whether Meta and YouTube caused addiction among children — a case that has been in jury deliberation for over a week and could deliver a second blow to Meta within months.
Torrez called Tuesday’s ruling “historic,” saying New Mexico is “proud to be the first state to hold Meta accountable in court.”
For over two years, every legislative attempt to regulate social media’s effect on children — from age verification laws to algorithm transparency bills — has stalled or been challenged by platforms on First Amendment grounds. The New Mexico jury verdict cuts through that by targeting not what users post on Meta’s platforms, but what Meta itself built and chose not to fix. That distinction may make the verdict harder for Meta to overturn on appeal than any bill Congress has managed to pass.

