
The birth, rise and eclipse of the personal computer

Babbage challenges the comforting narrative that the personal computer was a linear march toward user-friendly appliances, revealing instead that the most "personal" machines were those that demanded the most from their owners. The piece's most striking claim is that modern devices, though more intimate than ever, are less personal in the original sense: seamless, but far less controllable than the clunky, expandable kits of the 1970s.

The Vision Before the Machine

Babbage begins by anchoring the history of personal computing not in the garage startups of California, but in the 1945 imagination of Vannevar Bush. While most histories start with the ENIAC, a room-sized machine built for the military-industrial-academic triangle, Babbage highlights Bush's essay "As We May Think" as the true genesis of the concept. Bush envisioned a "memex," a device that would serve as an "enlarged intimate supplement to his memory." This framing is crucial because it establishes that the original definition of "personal" was about deep, individual connection to information, not just ownership.


The author notes that Bush's vision was surprisingly specific, describing a desk with "slanting translucent screens" and a keyboard, anticipating the modern workstation decades before the hardware existed to build it. Babbage quotes Bush's essay directly: "Consider a future device for individual use, which is a sort of mechanized private file and library." The quote underscores that the goal was always individual agency over data, a principle that would later be tested by the commercialization of the technology.

The Paradox of the Kit

The narrative takes a sharp turn when Babbage addresses the machine that actually launched the industry: the Altair 8800. Far from the sleek, graphical interfaces of the Xerox Alto, the Altair was a box of blinking lights that required users to toggle switches to enter code. Babbage argues that this was the true turning point because it was the first machine that was both affordable and expandable. The core of the argument is that the "personal" nature of the computer was defined by the user's ability to hack, modify, and control the hardware, not by how easy it was to use.

As Babbage puts it, "The mouse and graphical user interface were not pre-requisites for a machine being 'personal'." This is a provocative stance that forces the reader to reconsider what we value in technology. The author points out that the Altair's success lay in its openness; owners could add RAM and controllers, turning a $400 kit into a functional system. This era birthed a culture where users were expected to understand the machine's inner workings.

It wasn't about the interface; it was about the ability to peek and poke, to understand the machine, and to hack it.

Critics might argue that this definition of "personal" excluded the vast majority of the population who lacked the technical skills to operate a teletype or toggle switches. However, Babbage counters that the spirit of the era was one of democratization through accessibility, even if the learning curve was steep. The rise of magazines like Byte and the proliferation of companies like Apple and Commodore proved that the market wanted tools for the individual, not just appliances for the family.

The IBM Standard and the Open Ecosystem

The commentary then shifts to the introduction of the IBM Personal Computer in 1981. Babbage highlights a fascinating irony: while Apple marketed itself as the rebel against "Big Brother" in its famous 1984 Super Bowl ad, the IBM PC was arguably the more "personal" machine because of its open architecture. The author draws a parallel to Charlie Chaplin's "The Tramp," who represents the struggle to preserve humanity in a mechanized world. Babbage writes, "The Tramp's travails in Modern Times... should provide strength and comfort to all who feel like helpless cogs in a world beyond control."

This analogy is used to illustrate how the IBM PC allowed users to maintain control over their computing environment. Unlike the closed ecosystem of the Macintosh, the IBM PC featured expansion slots and a flexible architecture that encouraged an explosion of third-party hardware and software. Babbage notes that "users had more control over their PCs than over their Macintoshes," a claim that redefines the success of the PC not as a marketing victory, but as a victory for user agency.

The Eclipse of Control

The piece concludes with a sobering observation: the term "personal computer" has been in steady decline since peaking in 1986, eclipsed by the smartphone. Babbage argues that while modern devices are more "intimate"—carried on our bodies and used constantly—they are less "personal" in the original sense. We have traded control for convenience. The author writes, "We have much less control over our smartphones and tablets than we have over our computers."

This section serves as a critique of the current technological landscape, where users are often reduced to consumers of content rather than creators of their own tools. Babbage suggests that the vision of Vannevar Bush has been reached in terms of connectivity, but the intimate, mechanical connection between human and machine has been severed. The smartphone, described by Steve Jobs as a combination of an iPod, phone, and internet device, represents a shift away from the open, hackable platform that defined the personal computer era.

The decline in the use of 'personal computer' is consistent with another change: 'personal computers' have become more intimate but less personal.

A counterargument worth considering is that the complexity of the old personal computer was a barrier to entry that prevented true mass adoption. The closed, curated nature of modern devices allows billions of people to access powerful computing tools without needing to understand the underlying code. Yet, Babbage's point remains valid: the loss of control is a significant trade-off that redefines the relationship between the user and their technology.

Bottom Line

Babbage's most compelling contribution is the redefinition of "personal" not as a measure of user-friendliness, but as a measure of user control and expandability. The piece's greatest vulnerability is its romanticization of the technical barriers of the past, which may have limited the very "personal" nature it champions. Readers should watch for how the industry balances the demand for seamless, intuitive interfaces with the growing desire for open, user-controlled platforms in the era of AI and cloud computing.

Sources

The birth, rise and eclipse of the personal computer

The idea of the personal computer.

In The Innovators, his wide-ranging history of the development of computer technology, Walter Isaacson starts the chapter ‘The Personal Computer’ with a discussion of Vannevar Bush’s Memex:

The idea of a personal computer, one that ordinary individuals could get their hands on and take home, was envisioned in 1945 by Vannevar Bush. After building his big analog computer at MIT and helping to create the military-industrial-academic triangle, he wrote an essay for the July 1945 issue of the Atlantic titled "As We May Think.” In it he conjured up the possibility of a personal machine, which he dubbed a memex, that would store and retrieve a person's words, pictures, and other information: "Consider a future device for individual use, which is a sort of mechanized private file and library.... A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory." The word intimate was important. Bush and his followers focused on ways to make close, personal connections between man and machine.

The Atlantic article appeared a few months before ENIAC, the first programmable, electronic, general-purpose digital computer, ran its first practical program, one designed to help understand the feasibility of creating a hydrogen bomb. Bush led the U.S. Office of Scientific Research and Development (OSRD), which co-ordinated war-related scientific research, including early work on atomic weapons. But with the war almost over, his thoughts were moving to his earlier interests in computers as tools for civilian research. In the 1920s and 1930s he had helped develop a mechanical analog computer, known as a 'differential analyser', to solve differential equations. It's perhaps not surprising, then, that he included a reference in his Atlantic article to Charles Babbage's attempts to create a mechanical computer a century earlier:

Babbage, even with remarkably generous support for his time, could not produce his great arithmetical machine. His idea was sound enough, but construction and maintenance costs were then too heavy. Had a Pharaoh been given detailed and explicit designs of an automobile, and had he understood them completely, it would have taxed the resources of his kingdom to have fashioned the thousands of parts for a single car, and that car would have broken down on the ...