Meta Trial: New Mexico Accuses Company of Harming Children for Profit
SANTA FE, N.M. — Closing arguments concluded Monday in a landmark trial in New Mexico state court, where social media conglomerate Meta is accused of misleading users about the safety of its platforms for children. Jurors will now deliberate whether Meta violated the state’s Unfair Practices Act.
The case, brought by New Mexico Attorney General Raúl Torrez in 2023, alleges that Meta – which owns Instagram, Facebook, and WhatsApp – prioritized profits over the well-being of young users, violating state consumer protection laws. Prosecutors contend that Meta failed to adequately disclose the potential harms associated with its platforms, including addictive algorithms and exposure to harmful content.
During the six-week trial, scores of witnesses testified, including teachers, psychiatric experts, state investigators, Meta officials, and former employees. State investigators presented evidence gathered through the creation of fake social media accounts posing as children, documenting instances of online sexual solicitation and Meta’s response.
In closing statements, prosecutor Linda Singer argued that Meta’s algorithms actively promoted sensational and harmful content to teenagers and that the company did not enforce its stated minimum user age of 13. “The safety issues that you’ve heard about in this case weren’t mistakes. They were a product of a corporate philosophy that chose growth and engagement over children’s safety,” Singer told the jury. She urged jurors to impose a civil penalty exceeding $2 billion, based on the potential maximum fine of $5,000 per violation across an estimated 208,700 monthly users under 18 in New Mexico.
Meta’s attorney, Kevin Huff, countered that the company discloses the risks associated with its platforms in user agreements, advertisements, and on its website. He asserted that Meta incorporates protections for teenagers and actively removes harmful content, acknowledging that some problematic posts inevitably slip through its safety measures. Huff characterized the state’s request for penalties as “a shocking number,” arguing that prosecutors failed to demonstrate a direct link between Meta’s platforms and specific harm to teenagers.
The trial included testimony from Meta co-founder and CEO Mark Zuckerberg, whose deposition was played for the jury on March 4, 2026. Prosecutors highlighted internal Meta research indicating that one-third of teens reported experiencing problematic use of the platforms, suggesting addiction. Meta executives emphasized the company’s ongoing efforts to improve safety and address compulsive use while upholding free speech principles.
A second phase of the trial will determine whether Meta created a public nuisance and, if so, the amount of funding the company should provide to programs addressing alleged harms to children. The outcome of this case could have significant implications for a growing wave of litigation against social media companies over their impact on youth. A similar case involving Meta and YouTube is currently in deliberations in California and could set a precedent for future lawsuits.
Tech companies have historically been shielded from liability for user-generated content under Section 230 of the U.S. Communications Decency Act and First Amendment protections. However, New Mexico prosecutors argue that their case focuses not on the content itself, but on Meta’s role in amplifying that content through its algorithms.
