Brian Potter doesn't just explain how technology evolves; he reveals that the greatest barrier to innovation isn't a lack of genius, but the sheer mathematical impossibility of stumbling upon a solution in a sea of chaos. By dissecting a simulation of logic circuits, he exposes a hidden mechanism that allows complex systems to emerge without requiring a miracle. This is essential reading for anyone trying to understand why some industries stagnate while others explode with capability.
The Search Space Problem
Potter begins by critiquing the uneven landscape of technological literature. He notes that while "most major inventions... have a decent book written about them," the broader question of how progress actually works remains largely unexplored. He turns to the work of economist Brian Arthur, specifically a 2006 paper co-authored with Wolfgang Polak, to find a clearer lens. Potter writes, "What the paper is really showing us is that finding some new technology is a question of efficiently acquiring information." This reframing is potent. It shifts the focus from the "Eureka!" moment to the logistical challenge of navigating a "gargantuan sea of possibilities."
The author illustrates this with a simulation that builds Boolean logic circuits from simple NAND gates. The simulation attempts to create complex functions, like an 8-bit adder, by randomly combining components. The scale of the difficulty is staggering. Potter points out that for a function with 16 inputs and 9 outputs, "you have a mind-boggling 10^177554 possible logic functions." To put that in perspective, he notes that "the number of atoms in the universe is estimated to be on the order of 10^80." The odds of randomly assembling a working computer circuit are effectively zero.
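The arithmetic behind that figure checks out: with 16 input bits there are 2^16 possible input patterns, and each can map to any of 2^9 output patterns, giving (2^9)^(2^16) = 2^589824 distinct functions. A quick check (my own, not Potter's code):

```python
import math

# Count the Boolean functions from 16 input bits to 9 output bits:
# each of the 2**16 input patterns can map to any of 2**9 outputs,
# so there are (2**9) ** (2**16) = 2**(9 * 2**16) functions in total.
n_inputs, n_outputs = 16, 9
log2_count = n_outputs * 2**n_inputs          # 9 * 65536 = 589824
decimal_digits = math.floor(log2_count * math.log10(2)) + 1
print(decimal_digits)  # 177555 digits, i.e. roughly 10**177554
```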
Fulfilling some goal from circuit space means finding one particular function in a gargantuan sea of possibilities.
This is where the argument gains its teeth. If random chance cannot solve the problem, what does? Potter argues that the simulation succeeds only because it treats simpler technologies as "stepping stones." Once the simulation finds a basic function like an AND gate, it "encapsulates" it, adding it to the pool of available building blocks. This mirrors the logic of combinatorial explosion: just as a few chemical combinations can lead to millions of organic molecules, a few simple circuits can bootstrap the creation of a modern computer. The author suggests that without these intermediate steps, "the simulation will never find solutions to the more complex goals."
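The stepping-stone mechanism can be shown in miniature. The toy below is my own illustration, loosely modeled on the Arthur–Polak setup Potter describes; none of the names come from his code. It enumerates small NAND-gate netlists and, whenever a goal truth table is matched, encapsulates the winning netlist as a new single-gate building block. With a budget of three gates per candidate, XOR is unreachable from NAND alone (its minimal NAND implementation needs four gates), but becomes a three-block circuit once AND and OR have been encapsulated:

```python
import itertools

INPUTS = list(itertools.product([0, 1], repeat=2))  # all 2-bit input patterns
MAX_GATES = 3                                       # budget per candidate circuit

GOALS = {  # target truth tables over inputs (a, b)
    "NOT": (1, 1, 0, 0),   # NOT of the first input
    "AND": (0, 0, 0, 1),
    "OR":  (0, 1, 1, 1),
    "XOR": (0, 1, 1, 0),   # needs 4 NANDs: out of reach until AND/OR are blocks
}

blocks = {"NAND": lambda a, b: 1 - (a & b)}  # the only primitive

def run(netlist, a, b):
    """Evaluate a netlist: each gate reads two earlier signals."""
    signals = [a, b]
    for gate, i, j in netlist:
        signals.append(blocks[gate](signals[i], signals[j]))
    return signals[-1]

def netlists(names, max_gates):
    """Enumerate every wiring of up to max_gates gates drawn from `names`."""
    def extend(nl, n_signals):
        yield nl
        if len(nl) < max_gates:
            for g in names:
                for i in range(n_signals):
                    for j in range(n_signals):
                        yield from extend(nl + [(g, i, j)], n_signals + 1)
    yield from extend([], 2)

found, changed = {}, True
while changed and len(found) < len(GOALS):
    changed = False
    for nl in netlists(list(blocks), MAX_GATES):   # snapshot current block pool
        if not nl:
            continue
        table = tuple(run(nl, a, b) for a, b in INPUTS)
        for name, goal in GOALS.items():
            if table == goal and name not in found:
                found[name] = nl
                # encapsulate: the winning netlist becomes a reusable gate
                blocks[name] = (lambda n: lambda a, b: run(n, a, b))(nl)
                changed = True

print(sorted(found))  # all four goals; XOR only via encapsulated blocks
```

The first sweep finds NOT, AND, and OR within the three-gate budget; only the second sweep, searching over the enlarged pool, reaches XOR. Deleting the encapsulation line leaves XOR permanently out of reach, which is the dynamic Potter's quote describes.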
Critics might argue that this simulation is too clean, relying on a "partial fulfillment" mechanic that allows imperfect circuits to be refined. However, Potter tested this by turning the mechanic off, forcing the simulation to discard anything that wasn't a perfect match. He found "no real difference in how many goals get found," suggesting that the core driver of progress is the hierarchical accumulation of stable sub-assemblies, not the gradual polishing of errors.
The Architecture of Complexity
To explain why this hierarchical approach works, Potter reaches back to a classic 1962 paper by Nobel laureate Herbert Simon titled "The Architecture of Complexity." Simon used a parable of two watchmakers, Hora and Tempus, to illustrate the difference between fragile and robust systems. Tempus builds watches where every part is connected directly; if he is interrupted, the whole thing falls apart. Hora, conversely, builds sub-assemblies that are stable on their own. If he is interrupted, he only loses the current sub-assembly, not the entire watch.
Potter writes, "The result is that Hora makes completed watches about 4,000 times faster than Tempus." This analogy is crucial for understanding the simulation. In the context of technological evolution, a "Tempus" approach would be trying to assemble a complex machine part-by-part in a single, linear sequence. The probability of success is "negligible." But a "Hora" approach, which builds stable sub-units first, reduces the search space dramatically.
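Simon's parable can be put into numbers. The sketch below uses my own accounting, not Simon's or Potter's exact figures: if each added part carries interruption probability p, and an interruption scraps the assembly in progress, the expected number of part-additions to complete n parts in a row is ((1 - p)^-n - 1) / p.

```python
# Rough reconstruction of Simon's watchmaker arithmetic (my own
# accounting, not a quote of Simon's figures). An interruption scraps
# the current unfinished assembly, forcing a restart from zero.
p = 0.01                                   # interruption chance per part
q = 1 - p

def expected_steps(n):
    # expected part-additions to complete n consecutive parts
    return (q**-n - 1) / p

tempus = expected_steps(1000)              # one fragile 1,000-part sequence
hora = 111 * expected_steps(10)            # 111 stable 10-part sub-assemblies
print(round(tempus / hora))                # ratio comes out near 2,000
```

Under this accounting the ratio is roughly 2,000 rather than the quoted 4,000; the exact figure depends on how much partially finished work each interruption is assumed to destroy. Either way, the hierarchical builder wins by three orders of magnitude.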
The emergence of circuits such as 8-bit adders might not seem difficult. But consider the combinatorics.
This connection to Simon's work provides a historical anchor that strengthens Potter's modern analysis. It suggests that the ability to modularize—creating stable, reusable components—is not just an engineering convenience but a fundamental requirement for complexity to exist at all. This echoes the work of Lillian Hoddeson, who detailed how the invention of the transistor was not just a single breakthrough but a shift in how components could be standardized and combined. Without the ability to treat a transistor as a stable "sub-assembly" rather than a unique, fragile artifact, the digital revolution would have been mathematically impossible.
Potter's recreation of the simulation confirms this. He managed to build a "15-way AND circuit" and a "full-adder" by letting the system accumulate these stable blocks. He notes that "once a 4-way AND gate is found... that can be used to build a 5-way AND gate, which in turn can be used to build a 6-way AND gate." The system doesn't just get better; it changes the rules of the game by expanding the pool of available tools.
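This ratchet is easy to picture in code. In the hypothetical sketch below (the function names are mine, not Potter's), each `widen` step treats the existing n-way AND as a sealed black box and bolts on one more input, so a 15-way AND is reached through thirteen one-step extensions rather than one monolithic search:

```python
# A hypothetical sketch (names are mine, not Potter's): each widen()
# call encapsulates the current n-way AND and adds one more input,
# so every next gate is one small step away, not a fresh search.
def and2(a, b):                # primitive 2-way AND (itself two NANDs)
    return a & b

def widen(and_n):
    """Treat and_n as a black box and extend it by one input."""
    return lambda *bits: and2(and_n(*bits[:-1]), bits[-1])

and_gate = and2
for _ in range(13):            # grow 2-way -> 15-way, one stable step at a time
    and_gate = widen(and_gate)

print(and_gate(*[1] * 15))             # 1: all inputs high
print(and_gate(*([1] * 14 + [0])))     # 0: any low input kills it
```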
The Hidden Cost of Innovation
The most striking implication of Potter's analysis is that technological progress is not a straight line of discovery, but a process of navigating constraints. The simulation shows that "if the simpler goals aren't met first, the simulation won't find solutions to the more complex goals." This has profound implications for how we view policy and investment in technology. It suggests that skipping foundational research in favor of "moonshots" is a strategy doomed to fail because the necessary stepping stones haven't been laid.
A counterargument worth considering is whether this model applies to all forms of innovation. Some breakthroughs, like the discovery of penicillin or the structure of DNA, seem to have come from serendipity rather than a systematic, hierarchical build-up. Potter's model explains the evolution of engineered systems well, but it may be less descriptive of scientific discovery, which often relies on pattern recognition across disparate fields rather than the assembly of modular parts.
The likelihood of such a circuit being discovered by random combinations in 250,000 steps is negligible.
Despite this limitation, the core insight remains robust: complexity requires stability. The simulation demonstrates that "complex features can be created... only if simpler functions are first favored and act as stepping stones." This challenges the romantic notion of the lone inventor solving a problem in a flash of inspiration. Instead, it paints a picture of a collective, cumulative process where every new invention stands on the shoulders of thousands of previous, stable sub-inventions.
Bottom Line
Potter's analysis successfully demystifies the "black box" of technological progress, replacing it with a clear, mathematical explanation of how complexity emerges from simplicity. The strongest part of his argument is the demonstration that without the "encapsulation" of simpler technologies, the search space for complex solutions is too vast to ever be traversed. The biggest vulnerability is the assumption that all technological evolution follows this modular, hierarchical path, potentially underestimating the role of disruptive, non-linear scientific leaps. Readers should watch for how this "stepping stone" dynamic plays out in current debates over AI development and semiconductor manufacturing, where the inability to build stable sub-systems could stall progress entirely.