Asianometry doesn't just recount the history of computing; it challenges the very foundation of our digital reality by asking why we settled on a two-state system when three might be superior. The piece stands out by weaving together obscure Soviet history, mathematical efficiency proofs, and the cutting edge of AI hardware, suggesting that binary's current dominance may be a historical accident rather than a technological inevitability. For busy professionals watching Moore's Law approach its limits, this isn't just trivia: it's a potential roadmap for the next leap in processing power.
The Mathematical Case for Three
The author begins by dismantling the assumption that binary is the only logical choice. "A binary digit, i.e. a bit, represents two states: 0 or 1. A ternary digit, i.e. a trit, represents three states, usually either a balanced ternary with states negative 1, 0, 1, or an unbalanced ternary with states 0, 1, and 2." This distinction is crucial because it shifts the conversation from simple counting to information density. Asianometry explains that while higher bases carry more information per digit, they also increase complexity, leading to a mathematical sweet spot: "Work that out and you find that the most optimal number is the famous mathematical constant Euler's number, with an efficiency of 0.368." Since Euler's number is not a whole number and therefore impractical for hardware, the author points out that base 3 is the closest integer, boasting an efficiency of 0.366 compared to binary's 0.347.
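The efficiency figures quoted above can be reproduced in a few lines. One common formalization of this "radix economy" argument (the exact formula is an assumption here, since the piece only quotes the results) scores each base b by ln(b)/b, the information a digit carries divided by the per-digit hardware cost:

```python
import math

def digit_efficiency(base: float) -> float:
    """Information per digit relative to hardware cost: ln(base) / base.

    A higher base packs more information into each digit (the ln term)
    but each digit needs proportionally more distinguishable states
    (the denominator), so there is a sweet spot.
    """
    return math.log(base) / base

for b in (2, 3, 4, math.e):
    label = "e" if b == math.e else str(b)
    print(f"base {label:>4}: efficiency = {digit_efficiency(b):.3f}")
```

Running this yields 0.347 for base 2, 0.366 for base 3, and a maximum of 0.368 at Euler's number, matching the article's figures; base 4 falls back to 0.347, which is why 3 is the best integer choice under this metric.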
This argument is compelling because it relies on hard math rather than hype. The efficiency gap suggests that ternary systems could theoretically use fewer components and wires to represent the same data. Asianometry writes, "So in theory, a ternary computer can use fewer components to represent data, which then means a reduction in the number of wires to connect said components." This reduction in physical complexity is a massive selling point for hardware designers struggling with heat and signal interference. Furthermore, the author highlights a logical elegance often missed in binary discussions: "Another notable benefit is that ternary has native support for negative numbers. Binary requires the use of an extra bit for that, which can be tricky to keep track of."
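The "native negative numbers" point can be made concrete with a small sketch (a software illustration, not the Setun's ferrite-core encoding): in balanced ternary, negating a number is just flipping the sign of every trit, with no sign bit or two's-complement bookkeeping.

```python
def to_balanced_ternary(n: int) -> list[int]:
    """Encode an integer as balanced-ternary trits (-1, 0, 1), least significant first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:              # represent 2 as -1 and carry 1 into the next trit
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits

def from_balanced_ternary(trits: list[int]) -> int:
    return sum(t * 3**i for i, t in enumerate(trits))

x = to_balanced_ternary(5)          # [-1, -1, 1]: -1 - 3 + 9 = 5
neg = [-t for t in x]               # negation is just flipping every trit
print(from_balanced_ternary(neg))   # -5, with no extra sign bit to track
```

The same digit set represents positive and negative values symmetrically, which is exactly the bookkeeping advantage the quote describes binary as lacking.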
Critics might note that theoretical efficiency often crumbles under the weight of manufacturing reality. Just because a system is mathematically optimal doesn't mean it's easy to build with silicon. Yet, the argument that ternary logic simplifies comparisons is hard to dismiss. "Comparing one and two in binary logic takes two steps... Ternary logic, on the other hand, can do it in just one step by just outputting less, equal to, or greater." This single-step comparison could theoretically speed up decision-making processes in processors, a feature that becomes increasingly valuable as we hit the limits of clock speed scaling.
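The one-step comparison maps naturally onto a three-valued return convention familiar from software; this is an instruction-level analogy, not the gate-level circuit the video describes:

```python
def ternary_compare(a: int, b: int) -> int:
    """One-step three-way comparison: -1 (less), 0 (equal), or 1 (greater).

    A binary-output comparator can only answer one yes/no question at a
    time, so fully ordering two values takes two tests; a single trit
    carries the whole answer at once.
    """
    return (a > b) - (a < b)

print(ternary_compare(1, 2))   # -1
print(ternary_compare(2, 2))   # 0
print(ternary_compare(3, 2))   # 1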
The Soviet Experiment and Its Demise
The narrative then pivots to history, revealing that ternary computing isn't a new fantasy but a proven, albeit abandoned, reality. The author details the story of the Setun computer, born from a petty academic feud in 1950s Moscow. "The whole thing began at Moscow State University in late 1955 because of a petty feud. The university was supposed to receive an M2 computer... However, the M2 computer team's lead... disliked MSU's director... because the latter had not voted for him in a previous academic election." This anecdote humanizes the technological struggle, showing how bureaucracy and personal grudges can shape the trajectory of computing history.
The Setun team, led by Nikolay Brusentsov, innovated by using magnetic ferrite cores to create a three-state system. "The MSU team modified this arrangement by making one of the compensating cores into a working core. So, two ferrite cores to contain each trit." Despite the ingenuity, the project faced political headwinds. "After passing its demo with flying colors, a plant in Russia began production. 46 were made and distributed around the Soviet Union." However, the momentum was short-lived. "Just before the Czechs began large-scale production, the whole Setun effort was shut down. It was alleged that this was done by a Soviet official in charge of a competing binary computer effort, the M20." The author paints a grim picture of the aftermath: "In the end, the Setun was thrown into the trash and the team dispersed."
This historical detour serves as a cautionary tale about the power of path dependence. Once binary became the standard, the ecosystem of tools, languages, and engineers solidified around it, making it nearly impossible for a superior technology to gain a foothold. As Asianometry notes, "The oldest known effort to produce a ternary computing device dates back to 1840... but historians did a reconstruction 150 years later based on a detailed written description." The persistence of these ideas across centuries suggests that the potential is real, even if the timing was never quite right.
"Only a Sith deals in absolutes. So, Brusentsov hopes that one day the world turns back to ternary."
The Hardware Hurdle and the AI Renaissance
Despite the theoretical and historical merits, the author admits that the practical barriers are immense. The core issue lies in the physical device itself. "Binary and MOS transistors go together like peanut butter and jelly. They're just two voltage states, and, current issues aside, it's easy to tell between them. A ternary device, on the other hand, must reliably distinguish between multiple threshold voltage levels, and that's hard." Noise and signal degradation make distinguishing three states far more difficult than two, requiring tighter control and additional hardware that often negates the theoretical efficiency gains.
The article explores various attempts to solve this, from CMOS resistor setups in the 1980s to modern memristors and carbon nanotubes. "In January 2025, Chinese researchers... presented carbon nanotube based devices that they call a source gating transistor... The author said that the CNT transistor can be easily manufactured and produced several ternary circuits with it like an SRAM." These developments are promising, but the author remains skeptical about their immediate impact. "The peak of Moore's law in the 1990s and early 2000s diminished the perceived need for ternary strengths. Why does it matter that trits can carry some more information when bits and transistors are so cheap?"
However, the landscape is shifting with the rise of artificial intelligence. "The recent emergence of neural networks has put ternary back into the limelight." Deep learning models require massive amounts of memory and computation, creating a bottleneck that ternary logic could alleviate. "Ternarization is similar: quantizing everything to trits. The memory footprint and logic complexity get slightly worse, but there is more accuracy. Moreover, having a zero lets you skip every multiplication involving that zero." This ability to skip operations is a game-changer for energy-constrained AI devices.
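The zero-skipping benefit is easy to sketch. The snippet below quantizes weights to trits and skips zero terms in a dot product; the 0.05 cutoff and the weight values are hypothetical, chosen only for illustration (real ternarization schemes derive thresholds and scale factors from the weight statistics):

```python
def ternarize(weights, threshold=0.05):
    """Quantize each weight to a trit in {-1, 0, +1}.

    The threshold is a hypothetical cutoff for illustration; weights with
    magnitude below it collapse to 0 and can be skipped entirely later.
    """
    return [0 if abs(w) < threshold else (1 if w > 0 else -1) for w in weights]

def dot_ternary(trits, activations):
    """Dot product against ternarized weights. Every zero trit is skipped,
    and the surviving 'multiplications' are just additions or subtractions."""
    total = 0.0
    for t, a in zip(trits, activations):
        if t == 0:
            continue                  # the skip the article highlights
        total += a if t == 1 else -a
    return total

weights = [0.8, -0.02, -0.6, 0.01, 0.5]
trits = ternarize(weights)
print(trits)                                           # [1, 0, -1, 0, 1]
print(dot_ternary(trits, [1.0, 2.0, 3.0, 4.0, 5.0]))   # 3.0: two of five terms skipped
```

Note that no multiplication hardware is exercised at all: every surviving term is an add or a subtract, which is why ternarized networks are attractive for energy-constrained inference.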
Yet, the author warns against over-optimism. "To quote a famous movie, never go full ternarization. The information losses are too drastic. Mapping both 5 and 50 as one doesn't make sense." The trade-off between precision and efficiency remains a delicate balance. The recent Huawei patent mentioned in the title covers a ternary logic gate for simple math, but as Asianometry concludes, "Since we do not [have the full context of the patent's impact], it remains to be seen if this is a genuine breakthrough or just another theoretical exercise."
Critics might argue that the software ecosystem is too deeply entrenched in binary for a transition to ever happen. Rewriting decades of code and retraining engineers to think in three states is a monumental task that might outweigh the hardware benefits. The author acknowledges this, noting that "people are just used to thinking in binary terms. Programming in ternary means asking them to rewire their thinking."
Bottom Line
Asianometry's coverage effectively resurrects a forgotten chapter of computing history to argue that ternary logic is not just a mathematical curiosity but a viable, perhaps superior, alternative to binary. The strongest part of the argument is the clear demonstration of ternary's theoretical efficiency and its potential to solve the memory and energy bottlenecks plaguing modern AI. However, the piece's biggest vulnerability lies in the immense practical difficulty of building reliable three-state hardware and the inertia of the binary ecosystem. Readers should watch for breakthroughs in carbon nanotube and memristor technologies, as these may finally provide the physical foundation needed to turn ternary theory into practice. Until then, binary remains king, but the crown is no longer unassailable.