
Moore's law

Based on Wikipedia: Moore's law

In 1965, Gordon Moore, then the director of research and development at Fairchild Semiconductor, sat down to write a brief editorial for the thirty-fifth-anniversary issue of Electronics magazine. He was not a prophet, nor a physicist deriving a universal constant from the fundamental forces of nature. He was an engineer looking at a graph. With a pen in hand, he performed what he later admitted was a "wild extrapolation." He looked at the trajectory of semiconductor components and predicted that by 1975, a single quarter-square-inch of silicon could hold 65,000 components. He noted that the complexity for minimum component costs had been increasing at a rate of roughly a factor of two per year. He wrote that while the short-term rate could be expected to continue, perhaps even increase, the longer-term outlook was uncertain, though there was no reason to believe it would not remain nearly constant for at least a decade. That offhand observation, a casual guess about the future of a nascent industry, would eventually come to govern the global economy, dictate the pace of human innovation, and serve as the central nervous system of the digital age.
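The arithmetic behind that extrapolation is simple enough to reproduce. Here is a minimal sketch, assuming the commonly cited reading of Moore's chart, with a single component in 1959 doubling annually; the 1959 baseline is an illustrative assumption, not a figure from this article:

```python
# Reconstructing Moore's 1965 extrapolation: start from one component
# in 1959 (an illustrative assumption) and double the count every year.
components = 1
for year in range(1959, 1976):
    print(f"{year}: {components:,} components")
    components *= 2  # Moore's observed factor-of-two annual growth
```

By 1975 the count reaches 2^16 = 65,536, which is the "65,000 components" Moore put on a quarter square inch of silicon.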

It is a common misconception that Moore's Law is a law of physics. It is not. It is an empirical observation, a pattern identified in the data of human manufacturing and ingenuity. It describes the doubling of the number of transistors in an integrated circuit approximately every two years, with minimal increase in cost. This phenomenon is technically an experience curve effect, quantifying the efficiency gains that come from learned experience in production. Yet, the moniker "law" stuck, and with it came a weight that no scientific principle could have borne alone. It became a self-fulfilling prophecy. Because the industry believed the trend would continue, chipmakers planned for it, invested in it, and engineered their entire supply chains to sustain it. As Moore himself later remarked with characteristic optimism, "Moore's law is a violation of Murphy's law. Everything gets better and better."

The roots of this trajectory stretch back slightly before Moore's 1965 article. In 1959, Douglas Engelbart, a visionary who would later invent the computer mouse, studied the projected downscaling of integrated circuit (IC) size. He published his findings in an article titled "Microelectronics, and the Art of Similitude" and presented them at the 1960 International Solid-State Circuits Conference. Moore was in the audience, absorbing the implications of shrinking silicon. But it was Moore's specific articulation of the doubling rate that crystallized the industry's focus. In 1975, looking forward to the next decade, Moore revised his forecast. He realized that doubling every year was unsustainable. He adjusted the timeline, predicting that semiconductor complexity would double every two years, a compound annual growth rate (CAGR) of 41%. This revision has held true for nearly half a century, guiding long-term planning and setting targets for research and development across the globe.
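The 41% figure is just the two-year doubling restated on an annual basis: if complexity doubles every two years, the growth factor per year is the square root of two. A one-line check:

```python
# Doubling every 2 years implies an annual growth factor of 2 ** (1/2).
cagr = 2 ** (1 / 2) - 1
print(f"Implied CAGR: {cagr:.1%}")  # Implied CAGR: 41.4%
```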

The Engine of the Modern World

The impact of this relentless doubling cannot be overstated. It is the invisible hand that has shaped the last sixty years of human history. Advancements in digital electronics are inextricably linked to Moore's Law. Consider the reduction in quality-adjusted prices of microprocessors; a processor that would have cost millions in 1970 now costs pennies. Look at the explosion in memory capacity. The RAM in a modern smartphone dwarfs the total memory of all computers in the world combined in 1965. The improvement of sensors, the number and size of pixels in digital cameras, the bandwidth of our internet connections—all of these are direct consequences of the ability to pack more transistors onto a smaller piece of silicon for less money.

These ongoing changes have been the driving force of technological and social change, productivity, and economic growth. The law did not just make computers smaller; it made them ubiquitous. It turned the computer from a room-sized machine for the military and the university into a device in the pocket of billions. It transformed industries, from finance to healthcare, from agriculture to entertainment. The speed at which we process information, the complexity of the models we can run, and the sheer volume of data we can store are all functions of this exponential curve. When we speak of the "digital revolution," we are often merely speaking of the physical manifestation of Moore's Law.

Yet, the mechanics behind this growth are not merely about squeezing more transistors into the same space. In 1974, Robert H. Dennard and his colleagues at IBM, observing the rapid pace of MOSFET scaling, formulated what became known as Dennard scaling. This concept is crucial for understanding why Moore's Law worked for so long. Dennard scaling described a phenomenon where, as MOS transistors got smaller, their power density stayed constant, meaning power use remained in proportion to area. As you shrank a transistor, it didn't just get smaller; it got faster and more energy-efficient simultaneously. The heat generated per unit area remained manageable, allowing clock speeds to rise without melting the chip.
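A back-of-the-envelope version of Dennard's argument shows why the power density stays fixed. Scale every linear dimension and the supply voltage down by a factor κ; then capacitance C falls by κ, switching frequency f rises by κ, and dynamic power per transistor P = CV²f falls by κ². Since each transistor's area A also shrinks by κ², the two effects cancel:

$$
\frac{P}{A} \;\to\; \frac{(C/\kappa)\,(V/\kappa)^{2}\,(\kappa f)}{A/\kappa^{2}} \;=\; \frac{C V^{2} f}{A} \;=\; \frac{P}{A}.
$$

This is an idealized model, of course; it assumes the voltage really can keep scaling down with the geometry, which is precisely the assumption that would fail in the mid-2000s.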

This interplay between Moore's Law and Dennard scaling created a golden era. Mathematically, Moore's Law predicted that transistor count would double every two years due to shrinking dimensions. Dennard scaling predicted that power consumption per unit area would remain constant. When combined, as Intel executive David House deduced in 1975, the result was that computer chip performance would roughly double every 18 months, with no increase in power consumption. This 18-month figure is often misquoted as the definition of Moore's Law, but it was actually House's synthesis of Moore's transistor count prediction with Dennard's power density observations. The energy-efficiency of silicon-based computer chips roughly doubled every 18 months, fueling an era of unprecedented computing power growth.
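The arithmetic behind House's synthesis can be sketched in a few lines. Under the idealized model above, a two-year doubling of transistor count means each linear dimension shrinks by √2, so Dennard scaling adds roughly a √2 clock-speed gain on top. This is a simplified reconstruction, not House's actual calculation:

```python
import math

# Every 24 months: transistor count x2, plus an idealized Dennard
# clock-speed gain of sqrt(2) from the corresponding linear shrink.
gain_per_24_months = 2 * math.sqrt(2)                 # ~2.83x performance
doubling_months = 24 / math.log2(gain_per_24_months)  # months per 2x
print(f"Performance doubles every ~{doubling_months:.0f} months")  # ~16
```

The idealized model lands at about 16 months; House's widely quoted 18 months sits in the same range once real-world losses are accounted for.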

The Cracks in the Silicon Foundation

However, the laws of physics are unforgiving, and the era of easy scaling is ending. Dennard scaling, the guarantee that power density would hold constant as transistors shrank, broke down in the mid-2000s. As transistors reached the nanometer scale, quantum effects began to interfere and leakage currents grew. The heat problem returned with a vengeance. The "power wall" became a reality; chips could no longer simply run faster without generating prohibitive amounts of heat. This forced a fundamental shift in how the industry approached performance. Instead of relying solely on clock speed increases, architects began to focus on multi-core processors and specialized architectures.

Industry experts have not reached a consensus on exactly when Moore's Law will cease to apply, but the signs of deceleration are undeniable. Microprocessor architects report that semiconductor advancement has slowed industry-wide since around 2010, to slightly below the pace predicted by Moore's Law. In September 2022, the debate reached a fever pitch. Jensen Huang, the CEO of Nvidia, declared Moore's Law dead, signaling a shift toward new paradigms like AI accelerators and heterogeneous computing. Conversely, Pat Gelsinger, the then-CEO of Intel, maintained the opposite view, arguing that innovation was still tracking the curve through new materials and packaging techniques. Brian Krzanich, the former CEO of Intel, offered a more nuanced perspective, citing Moore's 1975 revision as a precedent for the current deceleration. He described the slowing not as a failure, but as a "natural part of the history of Moore's law," a result of technical challenges that the industry must navigate.

The economic reality of this slowdown is stark. As the cost of computing power to the consumer falls, the cost for producers to fulfill Moore's Law follows the opposite trend: research and development, manufacturing, and test costs have increased steadily with each new generation of chips. This phenomenon led to the formulation of "Moore's Second Law," also known as Rock's Law, named after venture capitalist Arthur Rock. Rock's Law states that the capital cost of a semiconductor fabrication plant (fab) grows exponentially over time, doubling roughly every four years. The tools required to manufacture these chips are becoming astronomically expensive; a single extreme ultraviolet lithography (EUVL) machine, the kind used to print the smallest features on modern chips, can cost over $200 million. This rising barrier to entry has consolidated the industry, leaving only a handful of companies—TSMC, Samsung, and Intel—capable of leading the charge into the most advanced nodes.
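Rock's Law compounds just as relentlessly as Moore's, only in the wrong direction for manufacturers. A minimal sketch of the formula, using an illustrative (not sourced) $1 billion baseline cost:

```python
# Rock's Law: fab capital cost doubles roughly every 4 years, i.e.
# cost(t) = base_cost * 2 ** (t / 4), with t in years since the baseline.
def fab_cost(years_elapsed: float, base_cost_billions: float = 1.0) -> float:
    """Projected fab cost in billions; the $1B baseline is illustrative."""
    return base_cost_billions * 2 ** (years_elapsed / 4)

for t in range(0, 21, 4):
    print(f"+{t:2d} years: ~${fab_cost(t):.0f}B")  # 1, 2, 4, 8, 16, 32
```

After five four-year generations, the same plant design costs thirty-two times as much, which is why each new node admits fewer and fewer players.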

The Human and Economic Cost of Scaling

The narrative of Moore's Law is often told as a triumph of engineering, a story of shrinking silicon and expanding capability. But beneath the technical jargon lies a profound economic and human reality. The pursuit of the next node is not a gentle progression; it is a high-stakes gamble, with a single leading-edge factory now costing tens of billions of dollars. The factories required to build 5nm or 3nm chips are marvels of engineering, but they are also monuments to capital intensity. They are clean rooms where the air is filtered to a degree that no hospital operating room can match, where the vibration of a passing truck can ruin a batch of wafers. The workforce in these facilities is highly specialized, a global talent pool competing for roles that require years of advanced training.

The shift in focus from pure scaling to application-specific needs has also reshaped the industry. As the rate of physical improvement slows, the industry has pivoted. The focus is no longer just on making the transistor smaller; it is on making the chip smarter for specific tasks. This is the era of the AI model lab, where the value is captured not just by the silicon, but by the architecture that sits on top of it. The doubling of transistor count is no longer the sole metric of success. Instead, the industry is looking at system-level performance, energy efficiency, and the ability to handle massive parallel workloads. The "cleverness" that Moore cited in 1975 is now being applied to packaging technologies, stacking chips vertically, and integrating different materials like gallium nitride and silicon carbide.

Despite the slowdown, as of 2019, leading semiconductor manufacturers like TSMC and Samsung Electronics claimed to keep pace with Moore's Law, with 10nm and 7nm nodes in mass production and 5nm nodes entering risk production. The curve is flattening, but it has not broken. The question is no longer if the law will end, but how the industry will adapt to the new normal. Will we see a new form of computing that bypasses silicon entirely? Will quantum computing, neuromorphic chips, or optical computing take the baton? Or will the industry simply accept a slower, more expensive rate of growth?

The Legacy of a Prediction

Moore's Law is more than a technical specification; it is a cultural touchstone. It has shaped the expectations of consumers, investors, and governments. For decades, it has been a promise that technology would always get cheaper and more powerful. This promise has fueled the internet boom, the smartphone revolution, and the current AI explosion. But promises, even those backed by decades of data, are not guarantees. The experience curve effect that Moore identified is a product of human effort, not a fundamental law of the universe. When the effort required to double performance outpaces the economic return, the curve will bend.

The history of the law also reminds us of the role of human agency in technological progress. It was not an inevitable force of nature. It was driven by the decisions of engineers, the investments of venture capitalists, and the strategic planning of corporate leaders. Gordon Moore himself viewed his prediction as surprising and optimistic. He knew it was a projection, not a prophecy. Yet, by articulating it, he provided a target for the industry to aim at. In this sense, Moore's Law was a self-fulfilling prophecy. The industry believed in the doubling, so they worked to make it happen. They innovated around the obstacles. They developed new lithography techniques, new materials, and new designs to keep the curve alive.

Today, as we stand on the precipice of a new era in computing, the relevance of Moore's Law is being re-evaluated. The shift to model labs and AI value capture suggests that the future of computing may not be defined by the number of transistors, but by the sophistication of the algorithms and the efficiency of the systems. The physical limits of silicon are being tested, but the human drive to compute, to solve, and to create is not. The doubling of transistor count may slow, but the doubling of human capability through technology continues. The law may be dying, but the momentum it created will last for generations.

In the end, Moore's Law serves as a powerful reminder of the relationship between observation and reality. It shows how a simple observation, backed by data and belief, can shape the course of human history. It is a testament to the power of human ingenuity to push against the boundaries of the physical world. Whether the law holds for another decade or fades into history, its legacy is secure. It has been the engine of the modern world, the silent partner in every digital innovation, and the benchmark against which we measure our progress. As we look to the future, we may no longer rely on the doubling of transistors, but we will always rely on the spirit of innovation that Moore's Law represents.

The story of Moore's Law is not just about chips. It is about the human capacity to imagine a future and then build it, brick by brick, transistor by transistor. It is a story of optimism in the face of complexity, of belief in the power of technology to improve the human condition. Even as the physical limits of silicon are approached, the lessons of Moore's Law remain: that progress is possible, that challenges can be overcome, and that the future is not written in stone, but in the collective effort of those who dare to dream it.

The debate continues. Is Moore's Law dead? Or is it simply evolving? The answer depends on how we define the law. If it is strictly about the doubling of transistors every two years, then yes, the pace is slowing. But if it is about the relentless drive to make computing more powerful, more efficient, and more accessible, then the law is alive and well, even if the mechanism has changed. The industry is finding new ways to innovate, new ways to scale, and new ways to capture value. The tools are different, the challenges are greater, but the goal remains the same. And as long as that goal exists, the spirit of Moore's Law will endure.

In the quiet of a semiconductor fab, amidst the hum of machines and the glow of laser light, the legacy of Gordon Moore is being written anew. Every wafer that is processed, every chip that is tested, is a testament to a prediction made in 1965. It is a reminder that sometimes, the most profound truths are found not in the depths of the universe, but in the careful observation of a graph, and the bold belief that the future can be better than the past. The curve may flatten, but the ascent continues. The story is far from over. It is merely entering a new chapter, one where the rules are being rewritten, but the promise remains. The future of computing is not just about how small we can make the transistor. It is about how big we can make the possibilities. And in that sense, Moore's Law is just beginning.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.