Internal research comes to light
Court filings unsealed in the United States allege that Meta withheld internal findings suggesting that Facebook and Instagram may contribute to negative mental-health outcomes. The documents claim that a 2020 internal experiment, code-named “Project Mercury”, showed that users who deactivated both platforms for one week experienced notable reductions in depression, anxiety, loneliness and social comparison.
The study reportedly used a controlled experimental design, measuring changes in participants' mental-health indicators over the deactivation period.
Why this matters for tech accountability
The case raises more than public-relations or ethical concerns. If the allegations are accurate, they point to a technology company prioritising growth and engagement ahead of user wellbeing, despite having access to internal evidence of harm. The filings cite messages from Meta employees describing a "causal impact" of platform use on harmful psychological outcomes. One internal communication noted:
“The Nielsen study does show causal impact on social comparison.”
Another message drew a comparison to historical industry misconduct:
“Doing research and knowing cigs were bad and then keeping that info to themselves.”
Meta disputed the allegations. A spokesperson said the company discontinued the study because of methodological flaws and added that Meta has spent “over a decade” listening to parents and improving platform safety.
At a regulatory level, the case may influence how governments and oversight bodies approach youth safety, algorithmic design and internal-research transparency across digital platforms.
Who stands to be affected
Meta’s user base is at the centre of the claims, particularly teenagers and vulnerable groups who are most sensitive to design choices that influence comparison, attention and social pressure. U.S. school districts are also plaintiffs in the case, arguing that platform design contributes to youth mental-health challenges. Regulators, legislators and global watchdogs will follow developments closely, especially those reviewing platform-safety rules and algorithmic accountability.
What comes next
A hearing is scheduled for 26 January 2026 in federal court in the Northern District of California. It will help determine which internal studies, messages and research documents become public.
Further stages may include more unsealed filings, regulatory investigations and renewed pressure on Meta to release additional internal evidence about youth wellbeing. Observers will track whether Meta adjusts product-design priorities, increases transparency about research methodologies or offers clearer disclosures about how its algorithms affect young users. Global regulators will study the case as they consider new rules for platform accountability.