In a landmark legal decision with far-reaching implications for the technology industry, Meta Platforms—the parent company of Instagram, Facebook, and WhatsApp—has been ordered to pay $375 million (approximately Dh1.38 billion) in civil penalties after a jury found the company misled consumers and failed to protect children on its platforms.
The verdict, delivered in the case of State of New Mexico v. Meta Platforms, Inc., marks the largest ruling of its kind and represents the first time a U.S. state has successfully taken a major technology company to trial over the harm its platforms allegedly cause to young users.
The verdict followed more than two years of litigation led by the New Mexico Department of Justice, which brought claims under the state’s Unfair Practices Act. The jury ultimately found Meta liable and imposed the maximum penalty of $5,000 per violation, underscoring the severity of the findings.
New Mexico Attorney General Raúl Torrez described the outcome as a defining moment for families and child safety advocates.
“The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety,” Torrez said.
Central to the case was a body of evidence that included internal Meta documents, testimony from former employees, and insights from educators and law enforcement officials.
According to the findings, Meta’s internal communications revealed that company executives had been warned about potential risks to children, including exposure to harmful content and exploitation on its platforms.
The jury heard allegations that certain design features enabled predators to engage in child sexual exploitation, while also exposing young users to content related to eating disorders and self-harm.
“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew,” Torrez added.
The ruling has been widely described as a turning point in the global conversation around online safety.
For many parents and educators, the verdict offers validation of long-standing concerns about the risks associated with social media platforms.
“This is a watershed moment for every parent concerned about what could happen to their kids when they go online,” the Department of Justice noted, emphasizing the broader societal impact of the decision.
The case itself began in 2023, when authorities launched an investigation into Meta’s platforms, focusing on issues such as online solicitation, child exploitation, and digital safety failures.
Experts believe the ruling could set a precedent for future legal actions against major technology companies.
Dr Joanne Gray, Chair of Discipline in Media and Communications at the University of Sydney, highlighted the significance of the jury’s decision.
“A group of ordinary American citizens did what US regulators have so far failed to do,” she said. “They looked at the evidence and found that Meta puts profits over user safety. This jury decision sends a clear message to all the Big Tech platforms: they need to do better, especially when it comes to keeping kids safe.”
In response, Meta stated that it “respectfully” disagrees with the verdict and confirmed plans to appeal.
Amid growing scrutiny, the company has already begun introducing new safety features. Recently, Instagram launched alerts designed to notify parents if teenagers repeatedly search for content related to suicide or self-harm, an indication of the increasing pressure on platforms to take proactive measures.
The $375 million ruling not only represents a legal milestone but also signals a broader shift in how governments and societies approach the regulation of digital platforms.
As investigations, public scrutiny, and legal challenges continue to mount, the case against Meta may serve as a blueprint for holding technology companies accountable—particularly when it comes to the safety and well-being of children.
In an era where digital platforms shape the daily lives of millions, the verdict stands as a powerful reminder: responsibility must evolve alongside innovation.