
The AGs reveal their hand against Meta

Casey Newton's latest analysis for Platformer strips away the corporate fog surrounding Meta's relationship with children, revealing a company that allegedly knew the precise mechanics of its own harm and chose profit over protection. The unredacted complaint from 41 state attorneys general does not merely accuse the tech giant of negligence; it paints a picture of a deliberate strategy where internal data on underage usage and psychological damage was treated as a trade secret to be guarded, not a crisis to be solved.

The Illusion of Compliance

Newton begins by dismantling the industry's long-standing "wink-wink, nudge-nudge" relationship with the Children's Online Privacy Protection Act. While platforms claim to ban users under 13, the lawsuit suggests this is a convenient fiction. Newton writes, "Platforms get to have their cake and eat it too: reap the rewards of the engagement and ad revenue that comes with younger users on the platform, while also insisting that they're not there." This framing is crucial because it shifts the narrative from accidental policy failure to calculated business model exploitation.


The evidence cited is damning. The complaint details how Meta's internal records show an "open secret" regarding millions of underage Instagram users, complete with charts boasting penetration into 11- and 12-year-old demographics. Newton notes that the company's researchers even took "pains to avoid uncovering Instagram's under-13 users through their studies," suggesting a willful blindness that borders on malice. This is not a case of a platform struggling with age verification; it is a case of a platform that has mastered the art of ignoring the law while monetizing the very users it claims to exclude.

"The unredacted lawsuit lays out what appears to be a straightforward case of a company well aware it is collecting reams of data on underage users and not seeking parental permission as required."

Critics might argue that age verification is technically fraught and that placing the burden on device makers like Apple and Google, as Meta suggests, is a pragmatic solution. However, Newton points out the legal reality: "The law is the law. And it looks like the attorneys general have Meta dead to rights." The company's deflection to third parties rings hollow when its own internal data proves it has known the ages of its users for years.

The Architecture of Harm

The second, and perhaps more disturbing, part of Newton's coverage focuses on the alleged "scheme to exploit young users for profit." Here, the argument moves beyond privacy violations to active psychological manipulation. The piece highlights Project Daisy, an experiment where Meta tested hiding like counts to reduce social anxiety. The internal data was clear: visible likes drove "negative social comparison," particularly among teen girls.

Newton writes, "About 1 out of 10 people experience negative social comparison on Instagram often or always," citing an internal study that also found "66% of teen girls on IG experience negative social comparison." Despite this, the company made the feature opt-in rather than the default, even though, as Newton notes, it knew this "neutered" version would have negligible effects on well-being. The motivation? Revenue. The lawsuit alleges that fully hiding likes would have decreased revenue by around 1 percent.

This is the core of the argument: Meta prioritized a marginal revenue gain over the documented mental health of its youngest users. Newton observes that "despite the harm it causes, people really do like browsing Instagram to see how many likes posts get," and the company leveraged this addiction. The piece also details how CEO Mark Zuckerberg overturned a permanent ban on augmented reality beauty filters that simulated plastic surgery, dismissing concerns about body dysmorphia as "paternalistic." Newton notes that the head of responsible design had warned that these filters were "actively encouraging young girls into body dysmorphia," yet the business imperative to keep users engaged won out.

"It sure seems that way to me. A month ago, it was possible to believe that the AGs may have overplayed their hand. Today, I'd say Meta is in for a very tough fight."

A counterargument worth considering is the difficulty of proving "knowingly" in a court of law, especially after Meta dismantled many of the very research projects that would serve as evidence of its prior knowledge. However, Newton suggests that the sheer volume of internal emails, charts, and presentations makes this a formidable case for the states.

Bottom Line

Newton's commentary effectively transforms a dense legal filing into a clear moral indictment of a business model built on the exploitation of adolescent psychology. The strongest element of this analysis is the specific, internal evidence that Meta knew the exact cost of its features to user well-being and calculated the financial trade-off anyway. The biggest vulnerability remains the legal hurdle of proving intent in a complex corporate structure, but the public record now makes it nearly impossible for the company to claim ignorance. The coming litigation will likely determine whether profit can legally supersede the duty of care to children in the digital age.

Sources

The AGs reveal their hand against Meta

by Casey Newton · Platformer

On October 24, 41 states and the District of Columbia sued Meta. Attorneys general alleged that the company hurts younger users by intentionally designing addictive and harmful products at the expense of their privacy and well-being.

At the time, the lawsuits’ merits were difficult to assess, since much of the evidence in the 233-page complaint was redacted. But this week, with Meta’s approval, the full complaint was made public. And what we could only speculate about last month now seems ripe for a fuller discussion.

Writing about the subject last month, I divided the complaint into two big parts. One is focused on privacy; the other on everything else. Let’s take privacy first.

I.

The Children’s Online Privacy Protection Act (COPPA) places several requirements on platforms that collect data on users under the age of 13. Chief among these is the need to obtain verifiable parental consent for the user; it also puts restrictions on how data on younger users can be collected and used.

In practice, all of this is more trouble than it’s worth. And so most major platforms prohibit under-13s from signing up, while remaining well aware that underage users are using their services daily. This means platforms get to have their cake and eat it too: reap the rewards of the engagement and ad revenue that comes with younger users on the platform, while also insisting that they’re not there, or that if they are there they shouldn’t be, and also look at how many under-13 accounts we remove every year.

Platforms’ wink-wink, nudge-nudge relationship with COPPA has been mostly tolerated until now. But the unredacted lawsuit lays out what appears to be a straightforward case of a company well aware it is collecting reams of data on underage users and not seeking parental permission as required.

The lawsuit alleges:

Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed, and zealously protected from disclosure to the public.

Meta’s extensive internal records documenting its actual knowledge of its under-13 Instagram users and collection of data from those users include the following: (1) charts boasting Instagram’s penetration into 11- and 12-year-old demographic cohorts; (2) an internal report presented to Zuckerberg regarding the four million under-13 users on Instagram; (3) emails and policies documenting Meta’s mishandling of known under-13 ...