
Meta Found Guilty of Endangering Children’s Mental Health in NM Trial


In a pivotal legal development, a New Mexico jury has determined that Meta, the parent company of Instagram, Facebook, and WhatsApp, knowingly endangered children’s mental health by failing to disclose the risks associated with its social media platforms. This decision follows a nearly seven-week trial and coincides with ongoing deliberations in a similar federal case in California involving Meta and YouTube.

The jury sided with state prosecutors, agreeing that Meta prioritized financial gain over user safety. It concluded that the tech giant violated the state’s Unfair Practices Act by obscuring the dangers of child sexual exploitation and the adverse effects of its platforms on young users’ mental health. The verdict detailed numerous infractions, each carrying a penalty that together could total $375 million.

A Meta spokesperson expressed disagreement with the verdict, stating, “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online.”

Meta’s defense emphasized the company’s transparency regarding risks and efforts to eliminate harmful content. Despite acknowledging the presence of some inappropriate material, they maintained that safety investments are both ethically and commercially motivated.

This case marks one of the initial trials in a growing wave of litigation targeting social media firms over their influence on child well-being. New Mexico’s lawsuit, spearheaded by Attorney General Raúl Torrez, leveraged evidence from a state undercover investigation. Agents created child personas on social media to document instances of sexual solicitation and assess Meta’s response.

The legal action, initiated in 2023, accuses Meta of failing to fully disclose or mitigate the risks of social media addiction, a concept the company does not formally recognize. Nevertheless, during the trial, Meta executives admitted to recognizing “problematic use” and expressed a desire for users to have positive experiences on their platforms.

Meta attorney Kevin Huff argued before jurors that the company’s designs aim to foster connections among friends and family, not to facilitate predatory behavior. He stated, “Evidence shows not only that Meta invests in safety because it’s the right thing to do but because it is good for business.”

The trial also examined the protections tech companies enjoy under Section 230 of the U.S. Communications Decency Act and the First Amendment. New Mexico prosecutors, however, contended that Meta’s algorithms actively contribute to the spread of harmful content, warranting accountability despite those shields.

Prosecution attorney Linda Singer underscored the impact of Meta’s decisions, saying, “We know the output is meant to be engagement and time spent for kids. That choice that Meta made has profound negative impacts on kids.”

The trial’s second phase, expected in May, will be judge-led and could result in further directives for Meta to address public nuisances and implement corrective actions.

Throughout the proceedings, jurors reviewed Meta’s internal communications on child safety, heard testimonies from various stakeholders, and considered the experiences of local educators dealing with social media-related disruptions.

In reaching its verdict, the jury evaluated whether Meta misled users about safety, referencing statements from key company figures, including CEO Mark Zuckerberg.

The jury’s checklist of allegations covered Meta’s enforcement of age restrictions, content related to teen suicide, and the influence of algorithms in prioritizing sensational content.


This article was originally published by www.npr.org