Meta, the parent company of Facebook and Instagram, faces fresh allegations in a US class-action lawsuit accusing it of suppressing internal research that demonstrated the harm its platforms cause to users’ mental health. The lawsuit, brought by several US school districts, claims Meta prioritized growth over user safety and concealed risks from regulators and parents.
The lawsuit highlights Meta’s 2020 internal study, codenamed “Project Mercury,” which the company conducted in partnership with Nielsen to evaluate the effects of deactivating Facebook and Instagram for one week. The results reportedly showed significant reductions in depression, anxiety, loneliness, and social comparison among participants who stopped using the platforms.
Despite these findings, Meta allegedly halted the research and dismissed the results as tainted by the prevailing media narrative about the company. Internal communications, however, indicated that some company scientists privately agreed the study showed a causal link between platform use and worsened mental health symptoms.
The lawsuit further accuses Meta of downplaying child safety issues, including ignoring evidence of underage usage and child sexual abuse material, and of paying entities such as the National PTA to promote positive messaging about platform safety. Allegations also suggest Meta required accounts flagged for sex trafficking to accumulate a high number of violations before being blocked, reportedly to avoid disrupting user growth metrics.
Meta’s representatives deny the allegations, emphasizing that the study was stopped due to methodological flaws. The company insists it has consistently worked for over a decade to enhance teen safety, including implementing measures to reduce risks associated with platform use.
The case, scheduled for a hearing on January 26 in the US District Court for Northern California, intensifies scrutiny of Big Tech companies’ roles in the youth mental health crisis. Alongside Meta, other tech giants including Google, TikTok, and Snapchat face similar accusations of concealing known mental health risks related to their platforms.
The lawsuit reflects growing concern over social media’s impact on mental health and adds to pressure for greater transparency and accountability. It highlights the tension between user safety and corporate growth strategies in the tech industry, particularly where vulnerable teenage users are concerned.