
How TSMC Keeps Getting Better

Most observers treat semiconductor manufacturing as a race of capital expenditure and geopolitical brinkmanship, but Asianometry reframes it as a battle of statistical probability and data velocity. The piece's most arresting claim is that the true barrier to entry isn't just the cost of a factory, but the invisible, exponential difficulty of achieving perfection across hundreds of sequential steps where a single microscopic error destroys the entire batch. For the busy executive, this shifts the narrative from "who has the most money" to "who learns the fastest."

The Mathematics of Perfection

Asianometry begins by dismantling the illusion that modern chipmaking is merely scaled-up industrial production. "Semiconductor manufacturing is the most sophisticated, unforgiving, high-volume production technology that has ever been done successfully," they write, noting that a single wafer can undergo up to 1,200 distinct process steps. The author illustrates the fragility of this system with a chilling calculation: even if each of 400 steps succeeds 98% of the time, the final yield drops to a catastrophic 0.03%. This is not hyperbole; it is the mathematical reality of the industry.
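The compounding arithmetic is easy to verify for yourself. A minimal sketch, using the figures from the piece (the function name and the second scenario are mine):

```python
# Compound yield across sequential process steps: a wafer survives only if
# every step succeeds, so per-step yield is raised to the number of steps.
def cumulative_yield(per_step_yield: float, steps: int) -> float:
    return per_step_yield ** steps

# The article's example: 400 steps, each succeeding 98% of the time.
print(f"{cumulative_yield(0.98, 400):.4%}")   # ~0.03%

# Even at 99.9% per step, 1,200 steps still let only ~30% of wafers through.
print(f"{cumulative_yield(0.999, 1200):.1%}")
```

The second line is why "three nines" of per-step reliability, heroic in most industries, is nowhere near good enough here.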


The commentary effectively highlights that "yield" is not a static metric but a dynamic target that dictates profitability. Asianometry explains that in the early stages of a new node, yields can hover between 20% and 50%, meaning the majority of production is scrap. The author argues that the "time to yield"—how quickly a foundry climbs from 20% to 95%—is the single most critical factor for profit, often outweighing the initial price of the equipment. This reframing is crucial because it explains why a company might accept lower margins initially; they are buying data, not just selling silicon.

"If you can ramp up the yield faster than the price of the product declines that is when you earn the highest profits."

Critics might note that this analysis assumes stable market demand; in reality, supply-chain shocks or sudden shifts in consumer-electronics demand can render a "fast yield" irrelevant if no customers are waiting. However, the core logic holds: in a market where prices erode rapidly, speed of learning is the only sustainable moat.
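The cost side of this argument can be made concrete. The source video notes that at 50% yield the cost per good unit is twice what it could be, because scrap cost is distributed among the good dies. A toy calculation, with hypothetical wafer cost and die count (not TSMC figures):

```python
# Cost per shippable die: scrap cost is absorbed by the good dies, so the
# cost per good unit scales inversely with yield. Wafer cost and dies per
# wafer below are hypothetical round numbers, not TSMC figures.
def cost_per_good_die(wafer_cost: float, dies_per_wafer: int, die_yield: float) -> float:
    return wafer_cost / (dies_per_wafer * die_yield)

WAFER_COST = 10_000.0   # assumed cost of one processed 300 mm wafer, USD
DIES = 500              # assumed candidate dies per wafer

for y in (0.20, 0.50, 0.95):
    print(f"yield {y:.0%}: ${cost_per_good_die(WAFER_COST, DIES, y):,.2f} per good die")
# At 50% yield the cost per good die is exactly twice the 100%-yield cost,
# matching the video's claim.
```

This is why the climb from 20% to 95% yield matters more than almost any other operational number: unit cost falls nearly five-fold over that ramp with no change to the hardware.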

The Invisible Enemies: Particles and Probability

The piece excels when it details the physical and quantum barriers that make this learning curve so steep. Asianometry identifies "killer particles" as a primary adversary, noting that as feature sizes shrink, the definition of a defect becomes absurdly strict. "Recent discussions in the ultra-pure water industry are implying a killer particle size under 10 nanometers, which is roughly the diameter of DNA," the author observes. This scale forces the industry to contend with static electricity that can melt patterns in mere nanoseconds.

Even more fascinating is the shift from deterministic engineering to probabilistic management. The author describes how modern lithography must account for "stochastic effects," where the randomness of atomic interactions causes lines to break or bridge. "At this scale there is only a probability that a photon will be absorbed at a certain location," Asianometry writes. This forces a fundamental change in how software is written, moving from discrete edges to "gradual approximations on a continuum." This is a profound insight: the industry is no longer just building machines; it is building algorithms to manage the inherent chaos of the quantum world.
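The stochastic effect described here can be illustrated with a toy Monte Carlo (an illustration of the principle, not the industry's actual resist models): as features shrink, each pixel of resist receives fewer photons, so the same relative dose carries proportionally more randomness.

```python
import math
import random

# Toy Monte Carlo of lithographic shot noise. Each resist pixel absorbs a
# Poisson-distributed number of photons; a pixel "misprints" when it
# absorbs fewer than a threshold dose. All numbers are illustrative.
def poisson_sample(rng: random.Random, lam: float) -> int:
    """Knuth's algorithm: count uniform draws until their product drops below e^-lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def misprint_rate(mean_photons: float, threshold: int, pixels: int = 50_000) -> float:
    rng = random.Random(0)
    misses = sum(poisson_sample(rng, mean_photons) < threshold for _ in range(pixels))
    return misses / pixels

# Fewer photons per pixel (smaller features at the same dose) means larger
# relative noise, so the misprint rate climbs as features shrink:
for mean in (100, 25, 10):
    print(f"mean {mean:>3} photons: misprint rate {misprint_rate(mean, mean // 2):.3%}")
```

At 100 photons per pixel, falling below half the expected dose is vanishingly rare; at 10 photons it happens on the order of a few percent of the time, which on a wafer with billions of features is ruinous.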

"No foundry starts off a new process node at 95 percent they have to get there through a process called yield learning."

The argument here is that the "magic" of advanced chips is not in the hardware itself, but in the software's ability to compensate for physical randomness. This shifts the competitive advantage toward companies with the most sophisticated data analysis capabilities, not just the cleanest rooms.

The Data Engine and the Robot Workforce

Asianometry contrasts the old guard of semiconductor manufacturing with the new data-driven reality. In the past, solving yield issues relied on "experienced engineers eyeballing the data," a method that could take years. The author cites a Motorola case where a subtle yield drop took five years and thirty experiments to solve, a timeline that is now obsolete. Today, the process is defined by automation and machine learning. "The whole fabrication process generates and collects millions of pieces of data each day," the author notes, making human intuition insufficient for finding non-linear correlations.
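The kind of analysis that replaced "eyeballing the data" can be sketched as a toy commonality check: group wafer yields by which tool handled each step and flag tools whose average lags the fleet. The data, field names, and threshold below are invented for illustration; production systems do this across millions of records and far subtler, non-linear patterns.

```python
from collections import defaultdict
from statistics import mean

# Invented wafer histories: which tool ran each step, and the resulting yield.
wafers = [
    {"etch": "ETCH-A", "litho": "SCAN-1", "yield": 0.93},
    {"etch": "ETCH-A", "litho": "SCAN-2", "yield": 0.91},
    {"etch": "ETCH-B", "litho": "SCAN-1", "yield": 0.71},
    {"etch": "ETCH-B", "litho": "SCAN-2", "yield": 0.69},
    {"etch": "ETCH-A", "litho": "SCAN-1", "yield": 0.94},
    {"etch": "ETCH-B", "litho": "SCAN-2", "yield": 0.72},
]

def suspects(records, steps, gap=0.05):
    """Flag (step, tool) pairs whose mean yield trails the fleet by more than `gap`."""
    fleet = mean(r["yield"] for r in records)
    flagged = []
    for step in steps:
        by_tool = defaultdict(list)
        for r in records:
            by_tool[r[step]].append(r["yield"])
        for tool, ys in sorted(by_tool.items()):
            if fleet - mean(ys) > gap:
                flagged.append((step, tool, round(mean(ys), 3)))
    return flagged

print(suspects(wafers, ["etch", "litho"]))  # only ETCH-B stands out
```

Even this trivial version surfaces the suspect chamber instantly; the Motorola-era equivalent was a human staring at run charts for years.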

The piece details how physical automation has removed humans from the clean room almost entirely, not just for cleanliness, but because the sheer weight of the wafers and the need for precision require robotic handling. The author points out that the most successful fabs are those with the most flexible automation systems, allowing robots to adjust the manufacturing process on the fly. This is where the concept of "cycle time" becomes paramount. Asianometry explains that TSMC uses Little's Law from queueing theory to optimize inventory and flow, calculating remaining cycle times to adjust dispatching in real time.
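Little's Law itself is a one-line relationship: average work-in-progress equals throughput times cycle time, so any two of the three quantities determine the third. A minimal sketch with illustrative numbers (not TSMC's actual figures):

```python
# Little's Law: WIP = throughput * cycle_time, so each quantity is
# recoverable from the other two. All figures below are illustrative.
def cycle_time_days(wip_wafers: float, starts_per_day: float) -> float:
    return wip_wafers / starts_per_day

def wip_for_target(starts_per_day: float, target_cycle_days: float) -> float:
    return starts_per_day * target_cycle_days

# A line holding 60,000 wafers at 1,500 wafer starts/day implies a 40-day cycle:
print(cycle_time_days(60_000, 1_500))   # 40.0
# Hitting a 30-day cycle at the same throughput means carrying less WIP:
print(wip_for_target(1_500, 30))        # 45000
```

The managerial consequence is the one the piece draws: at fixed throughput, cutting cycle time means each wafer (and its data) comes back sooner, so the learning loop spins faster.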

"Tsmc is obsessed with [cycle time]... simply accelerating the cycle time allows for more work in progress turns more data a faster learning rate and thus faster time to yield."

This is the piece's most strategic insight: speed is not just about shipping faster; it is about generating the data required to fix the process faster. The author argues that every wafer moved is a data point that refines the machine learning models. A counterargument worth considering is whether this data advantage creates an insurmountable barrier for new entrants, potentially stifling competition. Asianometry implies this is exactly the case, noting that "each new customer who joins the roster adds to TSMC's edge in data volume."

Bottom Line

Asianometry's strongest contribution is the demonstration that the semiconductor industry's true moat is not capital intensity, but the velocity of its learning loop. The piece's greatest vulnerability is its heavy reliance on the assumption that data volume alone guarantees success, potentially underestimating the role of supply chain resilience or geopolitical constraints. Readers should watch for how the industry's shift toward managing stochastic, quantum-level randomness will redefine the next generation of chip design software.

Sources

How TSMC Keeps Getting Better

by Asianometry (video)
