Asianometry doesn't just recount the history of microprocessors; it exposes the high-stakes gamble that nearly broke the computing world. The piece argues that the shift from Complex Instruction Set Computing to Reduced Instruction Set Computing wasn't merely a technical upgrade, but a fundamental inversion of how we build computers: trading expensive, slow memory for cheap, fast logic. This is the story of how a handful of academics and a few brave startups managed to challenge the iron fist of Intel and the legacy of IBM, proving that sometimes, less really is more.
The Great Inversion
The narrative begins by dismantling the prevailing wisdom of the 1970s. At the time, memory was prohibitively expensive and slow, so chip architects like those at Intel designed processors with complex instructions to minimize how often the CPU had to fetch data. Asianometry explains this trade-off clearly: "Intel and other companies were trading expensive RAM for inexpensive ROM."
This strategy relied on microcode, a translation layer that allowed hardware to execute complex commands. However, the author highlights a critical turning point within IBM itself. When an internal team led by John Cocke realized that their microcoded chips were too sluggish for a new telephone switch, they made a radical pivot. "Cocke inverted that," Asianometry writes, noting that the team realized "what actually got you a faster computer was simple hardware, more complex software." This was a profound shift in philosophy. Instead of making the hardware do the heavy lifting, they pushed the complexity into the compiler, allowing the processor to run at blistering speeds with a slimmed-down instruction set.
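To make that inversion concrete, here is a minimal, purely illustrative Python sketch; the mnemonics, register names, and micro-steps are invented for the example, not drawn from the video or any real ISA. A CISC-style instruction hides several microcoded steps inside one opcode, while a RISC-style compiler spells out the same work as simple, individually fast instructions.

```python
# Purely illustrative: the mnemonics, register names, and micro-steps below are
# invented for this sketch, not taken from the video or from any real ISA.

memory = {"a": 5, "b": 7}   # operands live in main memory (slow and, in the 1970s, expensive)
regs = {"r1": 0, "r2": 0}   # registers: small, fast on-chip storage

def cisc_add_mem_to_mem(dst, src):
    """One 'complex' instruction: microcode performs load, add, and
    write-back internally, hidden from the programmer and the compiler."""
    micro_steps = [
        lambda: regs.update(r1=memory[dst]),               # micro-op: fetch destination operand
        lambda: regs.update(r2=memory[src]),               # micro-op: fetch source operand
        lambda: regs.update(r1=regs["r1"] + regs["r2"]),   # micro-op: add
        lambda: memory.update({dst: regs["r1"]}),          # micro-op: write result back
    ]
    for step in micro_steps:
        step()
    return len(micro_steps)  # work hidden inside a single opcode

def risc_add_mem_to_mem(dst, src):
    """RISC style: the compiler emits simple instructions; each one is visible,
    does one thing, and can run fast on simple hardware."""
    program = [
        ("LOAD",  "r1", dst),
        ("LOAD",  "r2", src),
        ("ADD",   "r1", "r2"),
        ("STORE", dst,  "r1"),
    ]
    for op, x, y in program:
        if op == "LOAD":
            regs[x] = memory[y]
        elif op == "ADD":
            regs[x] = regs[x] + regs[y]
        elif op == "STORE":
            memory[x] = regs[y]
    return len(program)      # same work, but exposed to the compiler

print(cisc_add_mem_to_mem("a", "b"), memory["a"])   # 4 micro-steps, a == 12
memory["a"] = 5                                     # reset for the second run
print(risc_add_mem_to_mem("a", "b"), memory["a"])   # 4 simple instructions, a == 12
```

The total work is identical; what changes is where the complexity lives, in the chip's control logic or in the program the compiler emits.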
The evidence for this approach was compelling. A 1978 study by L.J. Shustek at Stanford found that 90% of program executions involved only 26 instructions out of a possible 183. This suggested an "80-20 rule" where optimizing the most common tasks yielded massive gains. Asianometry notes that this idea wasn't entirely new, since Seymour Cray had used similar tactics for supercomputers, but applying it to general-purpose computing was the real revolution.
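To see what a statistic like that actually measures, here is a minimal sketch of how one might count the smallest set of opcodes covering a target share of a dynamic instruction trace. The trace and opcode names are toy data made up for illustration, not figures from the 1978 study.

```python
# A toy version of the measurement behind instruction-usage studies: count how
# often each opcode appears in a dynamic trace, then find how few distinct
# opcodes account for a target fraction of all executed instructions.
# The trace below is invented toy data, not figures from the 1978 study.
from collections import Counter

trace = ["LOAD", "ADD", "LOAD", "STORE", "BRANCH", "LOAD", "ADD",
         "LOAD", "COMPARE", "BRANCH", "LOAD", "STORE", "ADD", "LOAD"]

def opcodes_for_coverage(trace, target=0.90):
    """Return how many of the most-used opcodes are needed to cover
    `target` of the executed instructions, and the fraction they cover."""
    counts = Counter(trace)
    total = len(trace)
    covered, needed = 0, 0
    for opcode, n in counts.most_common():
        covered += n
        needed += 1
        if covered / total >= target:
            break
    return needed, covered / total

needed, frac = opcodes_for_coverage(trace)
print(f"{needed} distinct opcodes cover {frac:.0%} of executed instructions")
```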
The real meaning of RISC technology is that it raises the possibility of breaking the stranglehold of IBM mainframe computers.
The Software Moat
Despite the technical elegance, the path to commercialization was blocked by a formidable barrier: existing software. The author correctly identifies that the biggest hurdle wasn't engineering, but economics. "You never want to change your software if you can avoid it," recalls Tom Murphy of IBM. The inertia of the installed base was immense. IBM hesitated to fully embrace its own RISC architecture because customers had invested heavily in legacy code that RISC chips couldn't run without recompilation.
This is where the narrative shines, contrasting IBM's caution with the audacity of newer players. While IBM dragged its feet, companies like Hewlett-Packard and Sun Microsystems saw an opening. HP's PA-RISC was one of the first major commercial implementations, but Sun made the boldest move. Refusing to pay the exorbitant licensing fees demanded by MIPS Computer, Sun decided to build its own chip, SPARC. Asianometry details how Sun managed to produce a chip that "screamed" in performance, hitting 10 million instructions per second at a fraction of the cost of competitors.
The author emphasizes the strategic brilliance of Sun's approach. They didn't just sell a chip; they sold an ecosystem. "Sun positioned SPARC as an open architecture," Asianometry writes, aiming to replicate the success of their NFS file system by encouraging other vendors to license the design. This created a competitive force capable of challenging the dominance of Intel and Motorola.
Critics might note that the "open architecture" was open only by the standards of the 1980s, and many competitors hesitated to license a chip controlled by such an aggressive rival. This hesitation ultimately fragmented the RISC market, allowing Intel to consolidate power in the PC space while RISC thrived in specialized workstations. The author acknowledges this fragmentation, noting that while RISC proved its worth in high-performance computing, the "software library problem" remained a persistent thorn in its side.
The Legacy of the Wars
The piece concludes by reflecting on the long-term impact of these battles. The RISC wars were not just about benchmarks; they were about defining the future of computing. Asianometry writes, "It was a time of shifting alliances, leaps of inspiration, wild technical claims, and the iron fist of Intel." The eventual victory of RISC principles is undeniable, even if specific architectures like SPARC and PA-RISC have faded from the mainstream. The core idea, that simple hardware driven by sophisticated compilers is the key to performance, has become the standard for modern processors, from smartphones to supercomputers.
The author's coverage is particularly effective in humanizing the technical struggle. By focusing on the personalities involved, from John Cocke's frustration with microcode to Bill Joy's skepticism of CISC, the piece transforms a dry technical history into a gripping drama of innovation and hubris.
The advantages offered by RISC architecture didn't convince people here it was worth the pain of rewriting our software.
Bottom Line
Asianometry delivers a masterclass in tech history, framing the RISC wars not as a footnote but as the pivotal moment that broke the monopoly of complex architectures. The strongest part of the argument is the clear delineation of the economic versus technical trade-offs, showing why IBM hesitated while Sun surged. The biggest vulnerability is the brief treatment of why RISC eventually lost the general-purpose PC war to x86, a topic that deserves its own deep dive. For the busy reader, this piece is a vital reminder that the chips in your pocket are the result of a decades-long battle where the simplest idea often wins.