Casey Newton's latest analysis for Platformer strips away the corporate fog surrounding Meta's relationship with children, revealing a company that allegedly knew the precise mechanics of its own harm and chose profit over protection. The unredacted complaint from 41 state attorneys general does not merely accuse the tech giant of negligence; it paints a picture of a deliberate strategy where internal data on underage usage and psychological damage was treated as a trade secret to be guarded, not a crisis to be solved.
The Illusion of Compliance
Newton begins by dismantling the industry's long-standing "wink-wink, nudge-nudge" relationship with the Children's Online Privacy Protection Act. While platforms claim to ban users under 13, the lawsuit suggests this is a convenient fiction. Newton writes, "Platforms get to have their cake and eat it too: reap the rewards of the engagement and ad revenue that comes with younger users on the platform, while also insisting that they're not there." This framing is crucial because it shifts the narrative from accidental policy failure to calculated business model exploitation.
The evidence cited is damning. The complaint details how Meta's internal records show an "open secret" regarding millions of underage Instagram users, complete with charts boasting penetration into 11- and 12-year-old demographics. Newton notes that the company's researchers even took "pains to avoid uncovering Instagram's under-13 users through their studies," suggesting a willful blindness that borders on malice. This is not a case of a platform struggling with age verification; it is a case of a platform that has mastered the art of ignoring the law while monetizing the very users it claims to exclude.
"The unredacted lawsuit lays out what appears to be a straightforward case of a company well aware it is collecting reams of data on underage users and not seeking parental permission as required."
Critics might argue that age verification is technically fraught and that placing the burden on device makers like Apple and Google, as Meta suggests, is a pragmatic solution. However, Newton points out the legal reality: "The law is the law. And it looks like the attorneys general have Meta dead to rights." The company's deflection to third parties rings hollow when its own internal data prove it has known the ages of its users for years.
The Architecture of Harm
The second, and perhaps more disturbing, part of Newton's coverage focuses on the alleged "scheme to exploit young users for profit." Here, the argument moves beyond privacy violations to active psychological manipulation. The piece highlights Project Daisy, an experiment where Meta tested hiding like counts to reduce social anxiety. The internal data was clear: visible likes drove "negative social comparison," particularly among teen girls.
Newton writes, "About 1 out of 10 people experience negative social comparison on Instagram often or always," citing an internal study that also found "66% of teen girls on IG experience negative social comparison." Despite this, the company chose to make the feature opt-in rather than the default, a decision made, Newton suggests, in the full knowledge that this "neutered" version would have negligible effects on well-being. The motivation? Revenue. The lawsuit alleges that hiding likes across the board would have decreased revenue by around 1 percent.
This is the core of the argument: Meta prioritized a marginal revenue gain over the documented mental health of its youngest users. Newton observes that "despite the harm it causes, people really do like browsing Instagram to see how many likes posts get," and the company leveraged this addiction. The piece also details how CEO Mark Zuckerberg overturned a permanent ban on augmented reality beauty filters that simulated plastic surgery, dismissing concerns about body dysmorphia as "paternalistic." Newton notes that the head of responsible design had warned that these filters were "actively encouraging young girls into body dysmorphia," yet the business imperative to keep users engaged won out.
"It sure seems that way to me. A month ago, it was possible to believe that the AGs may have overplayed their hand. Today, I'd say Meta is in for a very tough fight."
A counterargument worth considering is the difficulty of proving in court that Meta acted "knowingly," especially after the company dismantled many of the very research projects that would serve as evidence of its prior knowledge. However, Newton suggests that the sheer volume of internal emails, charts, and presentations makes this a formidable case for the states.
Bottom Line
Newton's commentary effectively transforms a dense legal filing into a clear moral indictment of a business model built on the exploitation of adolescent psychology. The strongest element of this analysis is the specific, internal evidence that Meta knew the exact cost of its features to user well-being and calculated the financial trade-off anyway. The biggest vulnerability remains the legal hurdle of proving intent in a complex corporate structure, but the public record now makes it nearly impossible for the company to claim ignorance. The coming litigation will likely determine whether profit can legally supersede the duty of care to children in the digital age.