Casey Newton doesn't just interview a CEO; he exposes a cultural rot at the heart of the tech industry's relationship with childhood. The most jarring element isn't the safety failures themselves, which are well documented, but the chillingly casual indifference displayed by the leader of a platform whose 151.5 million monthly users are mostly minors. This piece forces a reckoning with a question most of us have been avoiding: have we accepted a digital ecosystem where growth is the only metric that matters, even when the cost is measured in the safety of children?
The CEO's Dismissiveness
Newton opens with a recent interview with Roblox CEO David Baszucki that has become the most-discussed conversation in the publication's three-year history. The reaction from listeners was one of shock, not just at the content, but at the tone. Newton writes, "Listeners who wrote in to us said they were shocked to hear the leader of a platform with 151.5 million monthly users, most of them minors, express frustration and annoyance at being asked about the company's history of failures related to child safety." The description of the interview as "bizarre," "unhinged," and a "car crash" feels accurate, yet Newton argues these adjectives miss the deeper point.
The core of Newton's argument is that Baszucki's behavior is not an anomaly; it is standard operating procedure for the industry. As Newton puts it, "Baszucki, after all, is not the first CEO to have insisted to me that a platform's problems are smaller than I am making them out to be." The CEO's reaction (throwing up his hands and asking, "how long are you guys going to be going on about all this stuff?") reveals a fundamental lack of empathy for the victims of crimes his platform enabled. This stands in stark contrast to the historical precedent of the Tobacco Master Settlement Agreement, in which the industry finally had to admit the harm it had caused after decades of denial. Here, denial is still the primary strategy.
"Given a chance to display empathy for the victims of crimes his platform enabled... the CEO throws up his hands and asks: how long are you guys going to be going on about all this stuff?"
Newton highlights a critical tension: Roblox explicitly courts users as young as five, a demographic far younger than the thirteen-year-old minimum for Instagram or TikTok. While the company claims to spend hundreds of millions on safety, the reality on the ground tells a different story. For years, adults could freely message minors unless parents manually dug into settings. Filters were easily bypassed by misspelling words, and age verification was non-existent. Newton notes that "a reporter for the Guardian created an account presenting herself as a child and found that in Roblox she could wander user-created strip clubs, casinos, and horror games." In one instance, a reporter's avatar was sexually assaulted by another user who "thrust his hips into her avatar's face as she begged him to leave her alone."
Critics might argue that no platform can eliminate all harm, and that millions of children use Roblox without incident. Newton acknowledges this, stating, "I believe that millions of children use Roblox daily without incident. And we would not want to shut down the entire internet to prevent a single bad thing from ever happening." However, he counters that the choice isn't between a perfect world and a dangerous one; it is between proactive stewardship and reactive negligence. The company could have blocked unrestricted contact between adults and minors years ago. Instead, it prioritized growth.
The Pattern of Denial
Newton expands the scope beyond Roblox, arguing that this is a systemic issue across the tech sector. He draws a parallel to Meta, where internal documents reveal a company that "stalled internal efforts to prevent child predators from contacting minors for years due to growth concerns." The most damning evidence comes from the employees themselves, who compared their leadership's tactics to those of tobacco companies. One employee wrote, "Like we're seriously saying 'we have to hook them young' here?" referencing the targeting of children under thirteen.
The author points out that when Meta researchers found that 55 percent of users showed signs of "problematic use," the company redefined the term to include only the most severe cases (3.1 percent) before publishing the data. Newton writes, "Because our product exploits weaknesses in the human psychology to promote product engagement and time spent... the company should 'alert people to the effect that the product has on their brain.' You will not be surprised to learn that the company did not alert people to the issue." This deliberate obfuscation mirrors the industry's long resistance to the Children's Online Privacy Protection Act (COPPA), when it fought tooth and nail against even basic privacy safeguards for children.
"When it published that research the following year, though, it redefined 'problematic use' to include only the most severe cases — 3.1 percent of users."
The argument here is that the rank-and-file employees are often trying to do the right thing, only to be overruled by executives who view safety as a barrier to engagement. Newton observes that "over and over again, though, their boss' boss tells them to stop." This dynamic creates a culture where shame, once a powerful regulator of behavior, has been replaced by "denial, defiance, and the noxious vice signaling of the investor class."
A Broken Bargain
The piece concludes with a sobering reflection on the state of accountability. Newton suggests that the social contract between tech platforms and the public has fractured. He writes, "Fast forward to today and the bargain no longer holds. Platforms do whatever the president of the United States tells them to do, and very little else." The public's short attention span and the overwhelming volume of news have allowed platforms to delay, deny, and deflect with impunity.
Newton admits to his own struggle with covering a world where "the truth can barely hold anyone's attention — much less hold a platform accountable." He notes that platforms now wish to be judged only by their stated intentions rather than their actual outcomes. This shift is dangerous because it allows companies to pitch "fantastical visions of the future" while failing to "steward the present." The interview with Baszucki was a "car crash," but Newton's conclusion points at something larger than any one executive.
"The real problem isn't that David Baszucki talks this way. It's that so many of his peers do, too."
Bottom Line
Newton's strongest move is reframing the Roblox interview not as a singular PR failure, but as a symptom of an industry-wide moral collapse in which growth consistently trumps human safety. The argument's vulnerability lies in its assumption that public shame or journalistic exposure can still shift corporate behavior in an era of diminished attention spans. Readers should watch whether the bipartisan backlash against federal preemption of state AI laws, mentioned in the companion news item, signals a genuine shift in the political will to enforce these guardrails, or whether it is merely another temporary distraction.