What does it mean for a technology to scale?

Ben Reinhardt delivers a crucial reality check for a world obsessed with invention: the hardest part of technology isn't the spark of genius, but the grueling, expensive, and often impossible work of making that genius available to everyone. While venture capitalists and tech analysts toss around the term "scale" as a buzzword for growth, Reinhardt argues that for physical technologies, it is a question of survival that determines whether a breakthrough remains a lab curiosity or becomes a tool that actually changes the world.

The Illusion of Infinite Growth

Reinhardt begins by dismantling the romantic notion that a brilliant prototype is the finish line. He writes, "Scaling a technology can be more work than inventing it in the first place and ultimately, the most capable technology in the world won't have much impact if it is too expensive to be used or there just isn't enough of it to go around." This framing is vital because it shifts the focus from the "what" of innovation to the "how" of production. The author correctly identifies that scalability is not an inherent property of a material, but a property of the process used to create it.

He illustrates this with the history of aluminum. Asking if aluminum can scale is meaningless without specifying the method. Reinhardt notes that the Wöhler process, a chemical reaction, involved expensive chemicals and batch processing, whereas the electrolysis method became the standard because it was a continuous process where cost was driven primarily by electricity. "Asking whether aluminum, the material, can scale is meaningless — if you're making it via a chemical reaction like the Wöhler process, scaling is a very different proposition from doing it through electrolysis," he explains. This distinction is often lost in modern hype cycles where a new battery chemistry or synthetic fuel is celebrated before the manufacturing constraints are even understood.

The most capable technology in the world won't have much impact if it is too expensive to be used or there just isn't enough of it to go around.

Critics might argue that this view is overly pessimistic, suggesting that market forces and engineering ingenuity always find a way to drive costs down. Even if that holds in the long run, Reinhardt's point is that the timeline for that cost reduction often exceeds the lifespan of the companies involved and the patience of the investors funding them.

The Physics of Price

The piece then dives into the mechanics of cost, distinguishing between the ideal world of software and the messy reality of physical goods. Reinhardt writes, "A perfectly scalable process adds no marginal cost for each additional unit of product you create... But in reality, even software isn't perfectly scalable: servers, customer support, sales, and fixing problems that only occur at scale all mean that each additional unit isn't costless." This is a necessary correction to the Silicon Valley dogma that marginal costs are zero.
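To make that distinction concrete, here is a minimal sketch (with purely hypothetical numbers, not figures from the article) of how average unit cost behaves when fixed costs are amortized over volume while marginal cost sets a floor:

```python
# Minimal sketch with hypothetical numbers: average unit cost is fixed
# cost amortized over volume plus a per-unit marginal cost.
def unit_cost(volume: int, fixed_cost: float, marginal_cost: float) -> float:
    """Average cost per unit at a given production volume."""
    return fixed_cost / volume + marginal_cost

# A "software-like" and a "hardware-like" good at the same volume and fixed cost:
software = unit_cost(1_000_000, fixed_cost=5e6, marginal_cost=0.05)  # servers, support
hardware = unit_cost(1_000_000, fixed_cost=5e6, marginal_cost=40.0)  # materials, labor

print(f"software-like: ${software:.2f}/unit")  # ~$5.05, mostly amortized fixed cost
print(f"hardware-like: ${hardware:.2f}/unit")  # ~$45.00, floored by marginal cost
```

At ever larger volumes the software-like product keeps getting cheaper, while the physical product can never fall below its per-unit marginal cost.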

For anything made of atoms, the constraints are far stricter. The author breaks down the two main failure modes: "hard" caps and "soft" caps. A hard cap is an external bottleneck, such as a lack of specialized equipment or rare materials. He cites the Haber-Bosch process for ammonia, where the original catalyst, osmium, was too rare to support global food production. It was only when engineers found a catalyst made of common elements like iron that the process could scale to feed billions. "If Carl Bosch and Alwin Mittasch hadn't discovered a catalyst made out of much more common elements... the Haber-Bosch process would have never scaled," Reinhardt asserts. This historical example serves as a stark reminder that a technology's potential is often held hostage by supply chain realities that have nothing to do with the core science.
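The hard-cap logic reduces to a simple inequality: achievable output is limited by the scarcest input, not by demand. A toy sketch (all quantities invented for illustration, not historical figures):

```python
# Toy sketch of a "hard cap" (all quantities invented for illustration):
# achievable output is limited by the scarcest input, not by demand.
osmium_available_kg = 500     # hypothetical usable world supply of catalyst
catalyst_per_plant_kg = 100   # hypothetical catalyst needed per plant
plants_needed = 10_000        # hypothetical plants required to meet demand

plants_possible = osmium_available_kg // catalyst_per_plant_kg
print(f"plants possible: {plants_possible} of {plants_needed} needed")
# 5 of 10,000: demand is irrelevant; the input supply caps the process.
```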

A soft cap, conversely, is a price ceiling. If the cost to produce a necessary volume of a product exceeds what consumers are willing to pay, the technology fails regardless of its technical brilliance. Reinhardt emphasizes that operational expenses (opex) are the silent killers here. "If a process has a high opex, no amount of volume will reduce the unit price of the product," he writes. This is a critical insight for anyone looking at green energy or carbon capture; simply building more factories doesn't solve the problem if the energy or raw materials required to run them remain prohibitively expensive.
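The soft-cap argument follows directly from the same unit-cost formula: as volume grows, amortized capex vanishes but per-unit opex remains, so opex itself is the price floor. A short sketch with invented numbers:

```python
# Sketch with invented numbers: amortized capex shrinks with volume,
# but per-unit opex does not, so opex sets a hard floor on unit price.
CAPEX = 50e6           # one-time plant cost (hypothetical)
OPEX_PER_UNIT = 120.0  # energy + feedstock per unit (hypothetical)
SOFT_CAP = 100.0       # price buyers will tolerate (hypothetical)

for volume in (1e4, 1e5, 1e6, 1e7):
    price = CAPEX / volume + OPEX_PER_UNIT
    verdict = "viable" if price <= SOFT_CAP else "above soft cap"
    print(f"{volume:>12,.0f} units -> ${price:,.2f}/unit ({verdict})")
# Unit price never drops below $120: no volume can beat the $100 cap.
```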

The Hidden Research of Scaling

Perhaps the most provocative argument in the piece is that the work required to scale a process is often harder and more research-intensive than the initial invention. Reinhardt challenges the common attitude that once a proof-of-concept exists, the hard part is done. "The research to scale up a process is usually lumped under 'development' and dismissed as 'just an engineering (as opposed to science) problem,'" he observes.

He argues that scaling often requires reinventing the technology from the ground up. A process that works in a lab for grams might fail completely when trying to produce tons due to heat gradients, mixing issues, or equipment wear. "The process to produce a few grams of a substance or a single prototype will never suffice to make thousands of tons or millions of units economically," Reinhardt states. This reframes the role of the engineer from a mere optimizer to a fundamental researcher. The timeframes extend, the variables multiply, and the economic stakes become existential.

Scaling a process can require harder work and more research than inventing it in the first place.

The author also touches on the unique case of artificial intelligence, noting that while AI scaling is often discussed in terms of improving quality (intelligence) rather than lowering cost, the long-term bet remains that the price per unit of intelligence must come down for the technology to be truly transformative. However, he warns that even in the digital realm, the assumption of infinite scalability is dangerous. "In AI land, scaling roughly refers to how the quality of the product changes as you scale up the manufacturing process... Unlike most scaling, it's less about bringing down price, and more about bringing up quality," he writes, but the underlying physical constraints of energy and hardware eventually catch up.

Bottom Line

Reinhardt's analysis is a necessary antidote to the "move fast and break things" mentality, grounding high-tech optimism in the unyielding laws of physics and economics. The strongest part of the argument is the distinction between the invention of a technology and the invention of its manufacturing process, a nuance that is frequently ignored in policy and investment circles. The biggest vulnerability, however, is the sheer difficulty of predicting which technologies will overcome their scaling hurdles; history is littered with brilliant ideas that died because the world couldn't figure out how to make them cheaply enough. As governments and the private sector pour resources into new industrial capabilities, the question isn't just "can we build it?" but "can we build enough of it, and at what cost?" That is the only metric that truly matters.

Sources

What does it mean for a technology to scale?

by Ben Reinhardt

A particular failure mode that we’ve noticed among scientists and engineers doing ambitious technology research is ignoring the question “does this technology scale?” It’s a question that gets thrown around a lot by VCs and technology analysts, but people rarely unpack what that means and (I suspect) many of us don’t even know. Scalability and the work to scale a technology is worth unpacking because often scaling a technology can be more work than inventing it in the first place and ultimately, the most capable technology in the world won’t have much impact if it is too expensive to be used or there just isn’t enough of it to go around.

To a large extent, “does this technology scale?” translates to the question: “in the limit, can you make enough of a thing at a price that people want it?” With that in mind, clearly scalability means something very different for different technologies. There are fewer requirements on a scalable drug manufacturing process than a commodity chemical manufacturing process both because consumers are willing to pay far more for drugs and you need far smaller volumes of drugs for them to be effective. Cancer drugs can cost ~$10,000 for a ~100mg dose; ammonia costs less than $1000 for a metric ton which can fertilize ~10 acres of corn.
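To put those two figures on a common footing, a quick back-of-the-envelope conversion to price per kilogram (a sketch added for illustration, using only the numbers quoted above):

```python
# Back-of-the-envelope conversion of the figures above to $/kg.
drug_price_per_kg = 10_000 / 100e-6   # ~$10,000 per 100 mg dose
ammonia_price_per_kg = 1_000 / 1_000  # ~$1,000 per metric ton

print(f"cancer drug: ${drug_price_per_kg:,.0f}/kg")    # $100,000,000/kg
print(f"ammonia:     ${ammonia_price_per_kg:,.0f}/kg")  # $1/kg
# Roughly eight orders of magnitude between the two acceptable price points.
```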

You might notice that the term “manufacturing process” snuck into the previous paragraph. Talking about scaling inevitably involves talking about how you make something. Scalability is a property of the process of making the thing more than the thing itself. Asking whether aluminum, the material, can scale is meaningless — if you’re making it via a chemical reaction like the Wöhler process, scaling is a very different proposition from doing it through electrolysis. The chemical reaction involves expensive chemicals, batch processing, and creates an impure product that still needs a lot of refinement. Aluminum electrolysis, on the other hand, is a continuous process that produces a relatively pure output whose cost is mostly determined by the price of electricity. (Of course, there are ways to design a technology that make its manufacturing process more or less scalable.)

There’s a narrow definition of scaling that asks “what happens to the unit price of a thing when you make more of it?” That is, what happens to the cost per kilogram, cubic meter, or widget, as you make orders of magnitude more mass, volume, ...