Five Techniques, Fifty Years
The Human Genome Project cost roughly three billion dollars and took more than a decade to produce its initial draft sequence in 2001. Today, companies like Ultima Genomics advertise whole-genome sequencing for a hundred dollars in consumables alone. Evan DeTurk, an MPhil student at Cambridge studying the history of science, traces that staggering collapse in cost through five sequencing methods that collectively reshaped molecular biology. Writing for Asimov Press, DeTurk delivers a clear, richly illustrated tour of deoxyribonucleic acid (DNA) sequencing from its radioactive beginnings to the handheld nanopore devices shipping today.
The article opens with the political theater surrounding the genome project's completion. President Bill Clinton called it "the most wondrous map ever produced by mankind." British Prime Minister Tony Blair went further, predicting the map would yield "a revolution in medical science whose implications far surpass even the discovery of antibiotics."
DeTurk notes, almost in passing, that whether Blair's claim turned out to be true "is debatable." That parenthetical deserves more weight than it gets. Twenty-five years later, genomic medicine has produced important diagnostic tools and targeted therapies, but the revolution Blair promised remains incomplete. The gap between sequencing a genome and understanding what it means has proved far wider than the politicians suggested.
Sanger and the Art of Decoding
The story begins with Frederick Sanger, the only person to win two Nobel Prizes in the same field. DeTurk positions Sanger as "biology's great decoder," a characterization that holds up across the biochemist's career arc from protein sequencing through ribonucleic acid (RNA) to DNA. But DeTurk is careful to note that Sanger was not, as many assume, the first to sequence DNA. That distinction belongs to Ray Wu at Cornell, who in 1970 published a method to read specific sections of bacterial virus genomes. Wu's approach proved "extremely labor-intensive and failed to catch on."
Sanger's breakthrough came in 1977 with the chain termination method, which exploited dideoxyribonucleotides (ddNTPs) -- modified building blocks that halt DNA strand elongation when incorporated. The technique was elegant in its simplicity.
Molecular biologists found it both technically preferable and more "elegant," since it mirrored the natural copying of DNA.
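To make the principle concrete, here is a toy Python simulation (my own sketch, not from the article): many copies of a template are synthesized, each run halting wherever a chain-terminating ddNTP happens to be incorporated. Reading the terminal base of the fragments, sorted shortest to longest as on a sequencing gel, recovers the sequence. For clarity the sketch records template bases directly rather than their complements.

```python
import random

def sanger_read(template: str, trials: int = 10000, ddntp_prob: float = 0.1) -> str:
    """Toy model of Sanger chain termination.

    In each trial, synthesis proceeds base by base; at every position there
    is a small chance a chain-terminating ddNTP is incorporated, halting
    elongation and leaving a fragment of that length.
    """
    terminal_base = {}  # fragment length -> base that fragment ends in
    for _ in range(trials):
        for i, base in enumerate(template):
            if random.random() < ddntp_prob:   # ddNTP incorporated: chain stops
                terminal_base[i + 1] = base    # complement omitted for clarity
                break
    # Read fragments shortest to longest, like bands climbing a gel
    return "".join(terminal_base[n] for n in sorted(terminal_base))

print(sanger_read("GATTACA"))
```

With enough trials, every fragment length appears at least once, so the full sequence is recovered despite the randomness of where each chain terminates.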
The commercial era arrived in 1986, when Leroy Hood's lab at Caltech replaced radioactive labels with fluorescent ones, enabling machines to read sequences automatically. Applied Biosystems sold the first commercial Sanger machine that year for $92,500. Because Sanger never patented his method, competitors entered freely.
Pyrosequencing Lights the Way
The second act belongs to Swedish biochemist Pål Nyrén, who realized in 1986 that an enzymatic light-emission cascade he had helped develop could detect the pyrophosphate released during DNA synthesis. The chemistry is beautifully indirect: a nucleotide gets incorporated, pyrophosphate is released, enzymes convert it to adenosine triphosphate (ATP), and firefly luciferase produces a flash of light. DeTurk calls this the start of "next generation" sequencing (NGS).
Funding constraints delayed development for years. It was not until 2005 that 454 Life Sciences, a Connecticut company that had licensed the technology, released the GS20 sequencer at a retail price of five hundred thousand dollars. The machine's first major achievement was sequencing the first million base pairs of the Neanderthal genome in collaboration with paleogeneticist Svante Pääbo.
Pyrosequencing worked in real time, but it struggled to accurately capture homopolymers -- runs of several identical nucleotides -- because the light emitted from successive incorporations didn't always scale cleanly with the amount of pyrophosphate produced.
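The flow-based readout, and where the homopolymer problem creeps in, can be sketched in a few lines of Python. This is an illustrative model of my own (the flow order and template are invented, and real instruments synthesize the complementary strand rather than matching the template directly):

```python
def pyrosequence(template: str, flow_order: str = "TACG", n_flows: int = 8) -> list:
    """Toy pyrosequencing model: nucleotides are flowed over the template
    one type at a time, and the light emitted is proportional to how many
    bases of that type are incorporated in a row. A homopolymer like 'TTT'
    yields one large flash whose intensity must be divided back into a base
    count -- the step where real instruments lost accuracy."""
    pos, signals = 0, []
    for i in range(n_flows):
        nt = flow_order[i % len(flow_order)]
        run = 0
        while pos < len(template) and template[pos] == nt:
            run += 1
            pos += 1
        signals.append((nt, run))  # light intensity ~ run length
    return signals

print(pyrosequence("TTTAG"))
```

For the template "TTTAG", the first flow of T produces a triple-strength flash; deciding whether a real, noisy flash of that size means two, three, or four T's is exactly the ambiguity that plagued 454's chemistry.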
That limitation, combined with Illumina's arrival, proved fatal. Roche shut down 454 in 2013. The technology mattered less for its longevity than for what it proved: Sanger sequencing was not the only viable approach.
Illumina's Dominance
The article's center of gravity is sequencing by synthesis, the method developed by David Klenerman and Shankar Balasubramanian at Cambridge. Their insight was deceptively simple: sequence DNA by building its complement one base at a time, using reversible terminators tagged with different-colored fluorophores. Photograph the chip. Cleave the blocker. Add the next base. Repeat.
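The cycle described above is simple enough to express directly in code. This is a toy sketch of my own devising -- the fluorophore colors are invented, and real instruments image millions of clusters in parallel rather than one strand -- but it captures the add-photograph-cleave-repeat loop:

```python
# Toy sequencing-by-synthesis cycle (illustrative; colors are invented)
FLUOROPHORE = {"A": "green", "C": "red", "G": "blue", "T": "yellow"}
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def sbs(template: str) -> str:
    """One base per cycle: a reversibly terminated, dye-tagged nucleotide
    binds, the chip is photographed, the blocker and dye are cleaved, and
    the next cycle begins."""
    read = []
    for base in template:
        added = COMPLEMENT[base]      # the one base that can pair binds
        color = FLUOROPHORE[added]    # photograph the chip: record the color
        read.append(added)            # cleave blocker and dye, then repeat
    return "".join(read)              # the read is the template's complement

print(sbs("GATC"))
```

Because the terminator blocks further extension until it is deliberately cleaved, exactly one base is read per cycle -- which is also why the homopolymer ambiguity that troubled pyrosequencing never arises here.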
The engineering challenge was detection. A single DNA molecule produces too faint a signal. The solution came from French scientists Pascal Mayer and Laurent Farinelli, whose colony sequencing method amplified fragments into clusters of identical strands on a flow cell. Bridge amplification -- strands bending over to bind nearby primers on the chip surface -- propagated each sequence into thousands of copies, producing a signal strong enough to photograph reliably.
Illumina has become by far the most common NGS method, holding roughly an 80 percent share of the sequencing market over the last few years.
That market share is remarkable and, DeTurk argues, largely due to versatility. Illumina machines quantify the activity of genome editors like Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR), sequence degraded ancient DNA, and build reference genomes for organisms from tomatoes to Neanderthals. The short-read format that once seemed like a limitation turned out to be perfectly suited for the majority of sequencing applications.
Long Reads and the Third Generation
Pacific Biosciences (PacBio) and Oxford Nanopore Technologies (ONT) represent the counterargument to Illumina's short-read orthodoxy. PacBio's single molecule real time (SMRT) sequencing uses zero-mode waveguides -- containers just large enough for a single DNA polymerase -- to watch individual bases being incorporated in real time. The reads are long, exceeding ten thousand bases, making PacBio the tool of choice for repetitive or structurally complex genomes.
DeTurk illustrates the point with a vivid comparison. PacBio was used to sequence a bacterium called Clostridium autoethanogenum, whose genome contains repeats, nine copies of a single gene, and insertions from bacterial virus genomes: basically the genomic equivalent of a Thomas Pynchon novel.
Nanopore sequencing, meanwhile, dispenses with synthesis entirely. A strand of DNA passes through a protein pore while an electric current runs through it. Each base disrupts the current differently, and the sequence is read from those disruptions. The MinION, ONT's first commercial product, shipped in 2015 for just a thousand dollars -- a handheld sequencer that would have been science fiction a decade earlier.
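The readout logic reduces to matching measured current levels back to bases. The sketch below is purely illustrative -- real nanopore base callers read overlapping k-mers with neural networks, and the picoamp values here are invented -- but it shows the principle of decoding a sequence from current disruptions:

```python
# Toy nanopore base caller: each base perturbs the ionic current through
# the pore to a characteristic level, and the sequence is recovered by
# matching measured levels back to bases. Current levels are fictitious.
LEVEL = {"A": 80.0, "C": 95.0, "G": 110.0, "T": 125.0}  # picoamps, invented

def call_bases(currents) -> str:
    """Assign each current measurement to the base with the nearest
    characteristic level."""
    def nearest(i):
        return min(LEVEL, key=lambda b: abs(LEVEL[b] - i))
    return "".join(nearest(i) for i in currents)

print(call_bases([81.2, 124.0, 96.5, 109.1]))  # -> "ATCG"
```

The appeal is obvious from the code: no synthesis, no imaging, no amplification -- just an electrical signal, which is what makes a thousand-dollar handheld device possible.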
Early nanopore accuracy was poor, around 85 to 90 percent per base in 2017. DeTurk notes that recent improvements have pushed accuracy above 99 percent, though it is worth observing that Illumina achieved that threshold years earlier. The accuracy gap has narrowed dramatically, but for clinical applications where every base call matters, the methods are not yet interchangeable.
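Those accuracy figures are easier to compare on the Phred scale, the standard logarithmic measure of base-call quality, where Q = -10 log10(p_error). A short calculation (mine, not the article's) shows how large the early gap really was:

```python
import math

def phred(accuracy: float) -> float:
    """Phred quality score for a per-base accuracy: Q = -10 * log10(p_error)."""
    return -10 * math.log10(1 - accuracy)

for acc in (0.85, 0.90, 0.99, 0.999):
    print(f"{acc:.1%} accurate -> Q{phred(acc):.0f}")
```

Early nanopore reads at 85-90 percent accuracy sat around Q8-Q10, while Illumina's 99-percent-plus calls were Q20 and better -- an order-of-magnitude difference in error rate, since every ten Phred points is a tenfold drop in errors.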
The Hundred-Dollar Genome
The article's final sections track the race toward ever-cheaper sequencing. Ultima Genomics claims to have hit the hundred-dollar target with its UG100 machine. Element Biosciences expects to match it with the VITARI system in 2026. But DeTurk flags an important caveat: the $100 price tag advertised by these companies covers only the consumables used by the machine itself, excluding labor, data analysis, and other costs.
This is an honest disclosure that many genomics press releases omit. The consumable cost of sequencing has plummeted, but the total cost of turning raw reads into medically or scientifically useful information has not fallen nearly as fast. Bioinformatics, quality control, and clinical interpretation remain expensive. The hundred-dollar genome is a real achievement, but it is also partly a marketing construction.
What the Article Does Well
DeTurk's strength is narrative structure. Each of the five sequencing methods gets its own self-contained history, from initial insight through commercialization and eventual displacement or dominance. The article never loses sight of the people behind the chemistry -- Sanger's quiet persistence, Nyren's years of underfunding, Klenerman and Balasubramanian's pivot from single-molecule detection to colony amplification. The visual aids, created by illustrator Ella Watkins-Dulaney, complement the text effectively.
The writing is precise without being dense. DeTurk explains polymerase chain reaction (PCR), gel electrophoresis, and fluorophore labeling in terms accessible to a general science audience, without condescending to specialists. That balance is difficult to achieve, and the article sustains it across roughly four thousand words.
Innovation in DNA sequencing will surely continue, but these five techniques have already transformed a feat that was impossible just fifty years ago into something that can be done overnight.
Bottom Line
DeTurk has written the kind of explainer that justifies Asimov Press's existence: historically grounded, technically accurate, and genuinely educational. The visual guide format works because sequencing chemistry is inherently spatial -- strands extending, terminators blocking, light flashing through waveguides -- and the illustrations translate those processes into something a reader can follow without a biochemistry degree.
The article could push harder on the gap between sequencing capability and biological understanding. Reading a genome is now trivial; interpreting it remains profoundly difficult. That tension between technological triumph and scientific humility is the real story of the last fifty years in genomics, and DeTurk gestures at it without fully engaging. But as a history of how we got from radioactive gels to pocket-sized sequencers, this is authoritative and well told.