
Reacting to Matt Yglesias's confession that A.I. progress is giving him writer’s block

Brad DeLong delivers a stinging rebuke to the paralysis induced by AI anxiety, arguing that the fear of an imminent, total societal overturn is a distraction from the very real, incremental creative destruction already reshaping the economy. While many policy writers freeze at the threshold of the unknown, DeLong insists that history offers a far more reliable map than the speculative forecasts of Silicon Valley evangelists.

The Burden of the Fork

The piece begins by addressing a specific anxiety plaguing modern journalism: the inability to write about policy when the future trajectory of technology is so uncertain. DeLong critiques Matthew Yglesias for feeling "transfixed, like Buridan's Ass," unable to choose a course of action because he cannot predict whether AI will plateau or explode into superintelligence. DeLong writes, "Matt thinks there is a genuine fork here, and stands transfixed... Matt thinks today's AI is roughly as capable as a generic literate person with broad knowledge of what's already written—which is a powerful research assistant, but just a powerful research assistant."


This framing is effective because it exposes the absurdity of waiting for certainty before acting. DeLong suggests that Yglesias has "half-drunk the AI-psychoactive koolaid," conflating the current hype cycle with genuine existential risk. He draws a sharp parallel to the cryptocurrency boom, noting that the "vibe is the same" as the current AI fervor, where "nobody sees [Bitcoin] as anything societally transformative, or indeed as having any other serious use case other than 'digital gold!' and 'number go up!'" By invoking the crypto bubble, DeLong grounds the abstract fear of AI in a tangible, recent failure of prediction.

Critics might argue that the comparison to Bitcoin is flawed because AI's impact on productivity is already measurable in ways that crypto never was. However, DeLong's point remains valid: the narrative of total societal collapse is often a smokescreen for the grifters and self-grifters who benefit from the chaos of uncertainty.

Maybe A.I. progress means we have a golden opportunity to launch a Police for America initiative and get a whole different group of people thinking about law enforcement careers, and maybe it means total loss of explicit human control over the future of our planet and our species. That's not a very good article!

Standing on Shoulders, Not Worshiping Gods

DeLong pivots from the immediate policy paralysis to a grand historical sweep, arguing that humanity has always experienced "successive leading-sector Schumpeterian creative-destruction upending of orders and institutions." He rejects the notion that we are facing a unique, unprecedented singularity. Instead, he reframes the current moment as the latest chapter in a long story of "standing on the shoulders of ever-taller pyramids of giants." He writes, "The real ASI emerged. Not an Artificial Super-Intelligence constructed in a computer lab... But, rather, the distributed knowledge and thought base that is the Anthology Super-Intelligence that is humanity's collective mind."

This historical contextualization is the piece's intellectual anchor. DeLong traces the acceleration of human progress from the slow 1% per millennium of the Paleolithic era to the explosive 100% per century of the Industrial Revolution. He notes that since 1875, "a generation sees about 80% of the economy grow in technology by about 1/4 in efficiency. While about 20% is upended and revolutionized." This data-driven approach dismantles the idea that AI is a sudden, alien force; rather, it is the logical continuation of a trend where technology reshapes the division of labor every single generation.
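The quoted rates are easier to compare when expressed on a common scale. A minimal sketch (the arithmetic and function names are mine, purely illustrative, not DeLong's) converting "1% per millennium" and "100% per century" into compound annual rates and doubling times:

```python
import math

def annual_rate(total_growth: float, years: float) -> float:
    """Convert total proportional growth over a period into a compound annual rate."""
    return (1.0 + total_growth) ** (1.0 / years) - 1.0

def doubling_time(rate: float) -> float:
    """Years needed to double output at a given compound annual rate."""
    return math.log(2.0) / math.log(1.0 + rate)

# "1% per millennium" (Paleolithic) vs. "100% per century" (Industrial Revolution)
paleolithic = annual_rate(0.01, 1000)
industrial = annual_rate(1.00, 100)

print(f"Paleolithic: {paleolithic:.6%}/yr, doubling in ~{doubling_time(paleolithic):,.0f} years")
print(f"Industrial:  {industrial:.4%}/yr, doubling in ~{doubling_time(industrial):,.0f} years")
```

On these figures the Paleolithic economy would take on the order of seventy thousand years to double, while the industrial economy doubles every century, which is the scale gap DeLong's "acceleration" framing rests on.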

The argument here is that the discomfort of being in the "bulls-eye" of this disruption is not new. DeLong recalls King Arkhidamos III, who lamented, "By Hercules! Man's bravery is ended!" upon seeing the Macedonian torsion catapult. This historical anecdote serves as a powerful reminder that every technological leap is met with fear by those whose roles are being displaced. The parallel to the current anxiety among intellectual professionals is striking and well-placed.

The Real Disruption

DeLong's most provocative claim is that the current wave of disruption is specifically targeting the "learned intellectual professions that Matt and I specialize in." He argues that the shift of gears is not about a digital god taking over, but about natural-language front-ends to structured databases becoming the new standard for accessing human wisdom. He writes, "Much better ways at accessing and remixing the real ASI... is not a Digital God. It is natural-language front-ends to structured and unstructured databases."

This reframing is crucial. It moves the conversation away from the sci-fi fear of machine dominance and toward the practical reality of how work is being reorganized. The "Anthology Super-Intelligence" is not a rival to human intelligence but an amplifier of it, built on the accumulated knowledge of the past. DeLong suggests that the "vibe" of the current moment is less about the end of humanity and more about the end of the specific ways in which we have traditionally organized intellectual labor.

A counterargument worth considering is that the speed of AI advancement, driven by recursive self-improvement, may indeed outpace the historical patterns DeLong describes. If the gap between models grows exponentially rather than linearly, the "standing on shoulders" metaphor might break down. Yet, even if the pace accelerates, the fundamental dynamic of creative destruction remains the same.

Since 1875 we have seen: Steampower, Applied-Science, Mass-Production, Globalized Value-Chain, and now Attention Info-Bio Tech modes—of production, but also distribution, communication, and domination—equivalent scale transformations shake society every single generation, with societal superstructures always lagging far behind and desperately shaking themselves to pieces in attempts to cope.

Bottom Line

DeLong's strongest asset is his refusal to treat AI as a unique historical anomaly, grounding the debate in the long arc of technological disruption rather than the short-term panic of the tech sector. His biggest vulnerability lies in potentially underestimating the speed at which recursive self-improvement could alter the economic landscape, but his core message—that we must act within the uncertainty rather than waiting for a clear path—is essential for any policymaker or writer trying to navigate the next decade.


Sources

Reacting to Matt Yglesias's confession that A.I. progress is giving him writer’s block

Matt’s subhead: “It’s hard to write good articles when you have no idea if everything is about to change”. In short, Matt has half-drunk the AI-psychoactive koolaid. My view: Matt should talk to Ezra Klein, and have Ezra Klein recount to him Ezra’s days in San Francisco, when it seemed every day made him stupider as he found himself rubbing elbows with yet another bunch of crypto-enthusiast grifters and self-grifters. While Bitcoin is still a thing today, nobody sees it as anything societally transformative, or indeed as having any other serious use case other than “digital gold!” and “number go up!” “AI” will have more of an impact, yes, and the balance between cynical grifters and self-grifters on the one hand and genuine technologists exploring use cases on the other is very different. But the vibe is the same: in both cases the evidence of the rapid total overturning of human society is not present….

And yet Matt thinks there is a genuine fork here, and stands transfixed, like Buridan’s Ass:

MATT YGLESIAS: A.I. progress is giving me writer’s block.

It’s hard to write good articles when you have no idea if everything is about to change.

Matthew Yglesias

Feb 18, 2026

Here’s an idea for an article that I had recently:

One of the most underrated aspects of education policy is the impact that second-wave feminism had on the K-12 workforce. It used to be the case that an enormous fraction of the smartest and most ambitious women in America were working as public school teachers, and were doing so at depressed wages because of limited opportunities for women to have white-collar careers. Some of this was formal, but a lot of it wasn’t. Jeannette Rankin entered Congress in 1917 and Elizabeth Blackwell graduated from medical school in 1849, so it’s not like women “couldn’t” have careers in politics or medicine before 1970. But they rarely did. And there wasn’t one specific formal policy change that unleashed the entire transformation of women’s professional opportunities. There were formal changes in public policy, of course, but the most important changes were the shifts in attitudes and social values over several generations.

And a second-order consequence of this was the steady erosion of human capital available in the teaching workforce.

And it seems likely to me that as artificial intelligence generates a sharp decline in the demand for major categories of white-collar ...