Automation
Based on Wikipedia: Automation
But automation's story begins far earlier than Ford's famous announcement. Humanity's obsession with machines that think for themselves stretches back to ancient civilizations. In Ptolemaic Egypt, around 270 BC, Ctesibius described a float regulator for a water clock—a device not unlike the ball and cock in a modern flush toilet. This was the earliest feedback-controlled mechanism humanity ever documented, a whisper of what would eventually become one of the defining features of modern industry.
The Greeks and Arabs, between about 300 BC and 1200 AD, shared this preoccupation with keeping accurate track of time. The appearance of the mechanical clock in the 14th century made water clocks and their feedback control systems obsolete—but the idea had taken root. By the 17th century, Christiaan Huygens had invented the centrifugal governor, used to adjust the gap between millstones. This simple device would become foundational to how we think about automatic control.
The introduction of prime movers—self-driven machines that advanced grain mills, furnaces, boilers, and the steam engine—created an entirely new requirement for automatic control systems. Temperature regulators (invented in 1624 by Cornelius Drebbel), pressure regulators (1681), float regulators (1700), and speed control devices appeared one after another. Another control mechanism was used to tend the sails of windmills; it was patented by Edmund Lee in 1745.
That same year saw Jacques Vaucanson invent the first automated loom—a machine that could produce complex patterns without constant human intervention. In 1771, Richard Arkwright invented the first fully automatic spinning mill driven by water power. Around 1800, Joseph Marie Jacquard created a punch-card system to program looms, effectively programming machines before computers existed.
Oliver Evans developed an automatic flour mill in 1785, making it the first completely automated industrial process. A centrifugal governor was used by Mr. Bunce of England in 1784 as part of a model steam crane. The governor was adopted by James Watt for use on a steam engine in 1788 after Watt's partner Boulton saw one at a flour mill they were building.
But the governor had limitations. It could not actually hold a set speed—the engine would assume a new constant speed in response to load changes. Yet it handled smaller variations such as those caused by fluctuating heat load to the boiler. There was also a tendency for oscillation whenever there was a speed change. As a consequence, engines equipped with this governor were not suitable for operations requiring constant speed, such as cotton spinning.
The mathematical basis of control theory began in the 18th century and advanced rapidly in the 20th. The design of feedback control systems up through the Industrial Revolution was by trial-and-error, together with a great deal of engineering intuition. It was not until the mid-19th century that the stability of feedback control systems was analyzed using mathematics—the formal language of automatic control theory.
This mathematical foundation received relatively little scientific attention until James Clerk Maxwell published a paper in 1868, "On Governors", that established the beginning of a theoretical basis for understanding control theory. His work would transform how engineers thought about machines.
Relay logic was introduced with factory electrification, which underwent rapid adoption from 1900 through the 1920s. Central electric power stations were also undergoing rapid growth, and the operation of new high-pressure boilers, steam turbines, and electrical substations created a great demand for instruments and controls. Central control rooms became common in the 1920s, but as late as the early 1930s, most process controls were on-off.
Operators typically monitored charts drawn by recorders that plotted data from instruments. To make corrections, operators manually opened or closed valves or turned switches on or off. Control rooms also used color-coded lights to send signals to workers in the plant to manually make certain changes.
The electronic amplifiers developed during the 1920s for long-distance telephony required a higher signal-to-noise ratio, a problem solved by negative feedback. This and other telephony applications contributed significantly to control theory—and helped create the systems we rely on today.
In the 1940s and 1950s, German mathematician Irmgard Flügge-Lotz developed the theory of discontinuous automatic controls, which found military applications during the Second World War in fire control systems and aircraft navigation systems. Controllers able to make calculated changes in response to deviations from a set point, rather than simple on-off control, began being introduced in the 1930s.
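The difference between on-off control and controllers that make calculated changes can be made concrete with a toy simulation. The sketch below is illustrative only—the "room" model and every name and constant in it are invented, not taken from any historical controller. It implements discontinuous on-off control with a hysteresis band on a simple first-order heating process:

```python
# Illustrative sketch of discontinuous (on-off) control with hysteresis.
# The process model and all constants are invented for this example.

def bang_bang(setpoint=20.0, band=0.5, t_outside=5.0, steps=400, dt=0.1):
    """Return the temperature trace of an on-off-controlled heater."""
    temp, heater_on, trace = t_outside, False, []
    for _ in range(steps):
        # Discontinuous control law: fully on below the band, fully off above.
        if temp < setpoint - band:
            heater_on = True
        elif temp > setpoint + band:
            heater_on = False
        heat = 10.0 if heater_on else 0.0
        # Process dynamics: heat input minus loss to the outside air.
        temp += dt * (heat - 0.1 * (temp - t_outside))
        trace.append(temp)
    return trace

trace = bang_bang()
```

Run long enough, the temperature oscillates in a narrow band around the set point rather than settling on it; tightening the hysteresis band makes the heater switch more often instead of eliminating the oscillation. This perpetual cycling is exactly what distinguishes on-off control from the calculated-change controllers introduced in the 1930s.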
These controllers allowed manufacturing to continue showing productivity gains as the effect of factory electrification waned. Electrification had greatly increased factory productivity in the 1920s, but its influence then declined: U.S. manufacturing productivity growth fell from 5.2% per year in 1919–29 to 2.76% per year in 1929–41.
Other key advances in automatic controls include differential equations, stability theory and system theory (1938), frequency domain analysis (1940), ship control (1950), and stochastic analysis (1941). Starting in 1958, various systems based on solid-state digital logic modules for hard-wired programmed logic controllers—the predecessors of the programmable logic controller (PLC)—began transforming factories once again.
Automation describes a wide range of technologies that reduce human intervention in processes, mainly by predetermining decision criteria, subprocess relationships, and related actions. Automation has been achieved by various means including mechanical, hydraulic, pneumatic, electrical, electronic devices, and computers, usually in combination. Complicated systems—such as modern factories, airplanes, and ships—typically use combinations of all of these techniques.
The benefits of automation include labor savings, reduced waste, savings in electricity and material costs, and improvements to quality, accuracy, and precision. Applications range from machinery and processes in factories, boilers, and heat-treating ovens to switching on telephone networks and the steering and stabilization of ships, aircraft, and other vehicles, all with reduced human intervention.
In the simplest type of an automatic control loop, a controller compares a measured value of a process with a desired set value and processes the resulting error signal to change some input to the process, in such a way that the process stays at its set point despite disturbances. This closed-loop control is an application of negative feedback to a system—a concept borrowed from biology and refined through mathematics.
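A minimal sketch of such a loop in code (the process model and all names and constants below are invented for illustration): a proportional controller measures a toy first-order "room", compares the measurement with the set point, and feeds the error back as heater power. Like the flyball governor described earlier, a purely proportional controller settles at a constant value slightly off the set point, a steady-state offset known as droop:

```python
# Illustrative closed-loop (negative feedback) sketch with proportional
# control. The process model and all constants are invented for this example.

def closed_loop(setpoint=20.0, t_outside=5.0, gain=2.0,
                loss=0.1, dt=0.1, steps=200):
    """Return the temperature trace of a proportionally controlled heater."""
    temp = t_outside                    # measured process value, starts cold
    trace = []
    for _ in range(steps):
        error = setpoint - temp         # compare measurement to the set value
        heat = max(0.0, gain * error)   # controller output: heater power
        # Process dynamics: heat input minus loss to the outside air.
        temp += dt * (heat - loss * (temp - t_outside))
        trace.append(temp)
    return trace

trace = closed_loop()
```

Raising the gain shrinks the droop but pushes the loop toward the oscillatory behavior the governor also exhibited; in practice, adding an integral term to the controller is what removes the offset entirely.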
Examples range from a household thermostat controlling a boiler to a large industrial control system with tens of thousands of input measurements and output control signals. The World Bank's World Development Report of 2019 shows evidence that new industries and jobs in the technology sector outweigh the economic effects of workers being displaced by automation.
Yet job losses and downward mobility blamed on automation have been cited as one of many factors in the resurgence of nationalist, protectionist, and populist politics in the US, UK, and France, among other countries, since the 2010s. The conversation about what automation means for human labor continues to shape political discourse—and will define economies for generations.
The word "automation" itself, inspired by the earlier word "automatic" (coming from "automaton"), was not widely used before 1947, when Ford established its famous automation department. It was during this era that industry was rapidly adopting feedback controllers—a technological revolution that would reshape how every factory on Earth operates.