Cherry picking
Based on Wikipedia: Cherry picking
In the fifth century BCE, the atheist philosopher Diagoras of Melos stood before a crowd of superstitious onlookers and pointed to a wall of votive gifts. These offerings had been placed by sailors who had survived shipwrecks, their lives spared, they believed, only because they had prayed to the gods before the storms broke. The crowd saw the paintings of rescue and the statues of gratitude as proof of divine intervention. Diagoras, however, saw only half the story. He asked a simple, devastating question: Where are the offerings from those who prayed and drowned? The dead could not hang paintings on the wall. They could not carve statues. By displaying only the survivors, the believers were constructing a reality that ignored the vast, silent majority of the drowned. This ancient observation is the genesis of a logical error that continues to distort our understanding of medicine, politics, climate science, and daily life today.
This practice is known as cherry picking.
To understand the mechanics of this fallacy, one must first visualize the orchard. A cherry picker enters a grove with a basket. Their goal is to fill it with the most perfect specimens: the ripest, the reddest, the healthiest cherries. They naturally avoid the bruised, the green, and the rotting fruit hanging from the lower branches or hidden in the shade. When the picker presents their basket to an observer, the observer sees a basket full of perfection. It is a logical leap, but a dangerous one, for the observer to conclude that the entire tree, or indeed the entire orchard, is in such pristine condition. The sample is not representative; it is curated. The act of selecting data points that confirm a pre-existing hypothesis while ignoring the mountain of data that contradicts it is the essence of cherry picking. It can be a clumsy, unintentional error born of confirmation bias, or it can be a deliberate, calculated strategy of deception.
The Architecture of Deception
The term itself is deceptively simple, rooted in the agricultural reality of harvesting. Yet, the implications are profound. When a person engages in cherry picking, they are not merely omitting information; they are actively suppressing evidence that would lead to a complete picture. This is why the practice carries such a negative connotation. It is the intellectual equivalent of a magic trick where the magician shows you the card they want you to see while palming the rest. In the context of argumentation, this is often referred to as "card stacking" or "stacking the deck," a term borrowed from the world of gambling and stage magic. A magician might shuffle a deck of cards, but if the deck has been pre-arranged so that a specific hand is dealt to a specific player, the outcome is predetermined. The appearance of randomness is an illusion. Similarly, a cherry picker arranges the facts of an argument so that the conclusion seems inevitable, even if the reality is far more chaotic and nuanced.
This phenomenon is not limited to the fringe; it is a hallmark of pseudo-science and a primary tool of denialism. Consider the history of tobacco marketing. For decades, tobacco companies did not deny the existence of studies linking smoking to cancer. Instead, they cherry-picked the few studies that showed no link, or they highlighted the limitations of specific studies while ignoring the overwhelming consensus of the broader scientific community. They presented a mosaic of "uncertainty" by selecting only the tiles that fit their picture, while sweeping the thousands of tiles showing a clear, causal link into the shadows. The same tactic is visible in climate change denial, where skeptics will point to a single month of record-breaking cold temperatures or a specific year of low warming to refute decades of global temperature data. They are pointing to the single green cherry in a basket of rot, insisting that the green one proves the tree is healthy.
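The window-picking tactic described above can be made concrete with a small simulation. This is a minimal sketch with invented numbers, not real climate data: it generates forty years of a steady warming trend plus random noise, then searches for the short window whose slope looks most like cooling.

```python
import random

random.seed(42)

# Hypothetical data: +0.02 degrees/year of underlying warming,
# plus random year-to-year noise. Not a real temperature record.
years = list(range(1980, 2020))
temps = [0.02 * (y - 1980) + random.gauss(0, 0.1) for y in years]

def trend(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# The full record recovers the real warming trend.
full = trend(years, temps)

# A cherry picker scans every 8-year window and reports only
# the one with the most negative (coolest-looking) slope.
cherry = min(
    trend(years[i:i + 8], temps[i:i + 8])
    for i in range(len(years) - 8)
)

print(f"full-record trend:   {full:+.4f} deg/yr")
print(f"cherry-picked trend: {cherry:+.4f} deg/yr")
```

The full-record slope is reliably positive, while the hand-picked window can suggest the opposite: same orchard, different basket.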
The psychological mechanism at play here is often confirmation bias, the most common of the fallacies of selective attention. Humans have an innate desire to believe that their pre-existing views are correct. When we encounter information that supports our worldview, we accept it with little scrutiny. When we encounter information that challenges it, we scrutinize it with a microscope. The cherry picker exploits this bias. They do not need to convince you that the opposing evidence is false; they only need to convince you that the opposing evidence does not exist. They rely on the fact that most people will not go out and harvest the entire orchard for themselves. They will trust the basket presented to them.
The Tragedy of the Survivor
The most insidious form of cherry picking is known as survivorship bias, a concept that Diagoras of Melos intuitively grasped but which was only formally analyzed centuries later. This occurs when we focus on the people or things that "survived" some process and inadvertently overlook those that did not, typically because of their lack of visibility. The story of the shipwrecks is the archetype: we see the survivors' prayers and assume the gods answered them, forgetting the thousands of prayers that went unanswered because those who prayed lay at the bottom of the ocean.
In the modern era, this bias distorts our understanding of success. We study the habits of billionaire entrepreneurs, the diets of centenarians, or the workout routines of Olympic athletes. We conclude that their specific habits are the cause of their success. But we rarely study the thousands of people who followed the exact same habits and failed. We do not see the failed startups that followed the same business plans as the unicorns, or the athletes who trained just as hard but never made the team. By ignoring the "non-survivors," we create a false narrative of causality. We believe that if we just pick the right cherry, we will be successful, not realizing that the cherry we picked was just the one that happened to survive a lottery of chance.
This bias was starkly illustrated during World War II. The military asked mathematician Abraham Wald to recommend where to add armor to bombers. The data showed that the planes returning from missions had bullet holes concentrated in the fuselage and wings, but very few in the engines. The military analysts initially concluded that they should reinforce the areas with the most holes. Wald, however, realized they were making a classic cherry-picking error. They were only looking at the planes that survived. The planes that were hit in the engines did not come back. The absence of bullet holes in the engines of the returning planes was not evidence of the engine's durability; it was evidence that a hit there was fatal. The "data" the military was using was incomplete because it excluded the most critical cases: the destroyed aircraft. Wald's recommendation was to armor the engines, the very places where the data showed no damage. He understood that the silence of the data was as loud as the noise.
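Wald's insight can be sketched in a few lines of code. The numbers here are invented for illustration, not historical figures: hits are spread evenly across three sections of the aircraft, but engine hits are assumed far more likely to be fatal, so the returning sample systematically under-reports them.

```python
import random

random.seed(0)

# Assumed lethality rates per hit location (hypothetical, for illustration).
SECTIONS = ["fuselage", "wings", "engine"]
LETHALITY = {"fuselage": 0.1, "wings": 0.1, "engine": 0.8}

all_hits = {s: 0 for s in SECTIONS}        # what actually happened
returned_hits = {s: 0 for s in SECTIONS}   # what the analysts could see

for _ in range(10_000):
    section = random.choice(SECTIONS)       # each plane is hit somewhere
    all_hits[section] += 1
    if random.random() > LETHALITY[section]:  # plane survives and returns
        returned_hits[section] += 1

# In reality every section is hit about equally often...
print("all planes:     ", all_hits)
# ...but the surviving sample shows few engine hits, precisely
# because engine hits destroyed the plane before it could return.
print("returned planes:", returned_hits)
```

The returning planes show far fewer engine hits than fuselage or wing hits, even though all three sections were struck equally often: the scarcity in the visible data marks the deadliest spot, not the safest one.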
The Weaponization of Quotes and Data
While survivorship bias operates on the level of data sets, another form of cherry picking operates on the level of language: quote mining. This is a favorite technique of debaters and propagandists. It involves selecting a specific sentence or phrase from a speaker or writer that supports a position, while stripping it of the context that moderates or completely reverses its meaning. The facts within the quote may be technically true, but the argument they are used to support is a lie.
Imagine a scientist writing a paper that states, "The data is inconclusive, and while there is a slight correlation, we cannot rule out other factors." A cherry picker might take the phrase "there is a slight correlation" and use it to claim that the scientist has proven a link. They ignore the surrounding text that explicitly denies the strength or certainty of that link. This is particularly effective in the public sphere because context cannot be checked in real time. A complex, nuanced argument requires time and effort to unpack, but a cherry-picked soundbite is instant, catchy, and easily remembered. Once a quote is mined and broadcast, it sticks. Even when corrected, the original distortion often lingers in the public consciousness, leading to a widespread misrepresentation of the groups or individuals targeted.
The French philosopher Michel de Montaigne recognized this human tendency in his essays on prophecies in the late 16th century. He wrote with biting wit about those who study almanacs and point to the few times a prediction came true as proof of the seer's power. "I see some who are mightily given to study and comment upon their almanacs, and produce them to us as an authority when anything has fallen out pat," Montaigne observed. He noted that it is statistically impossible for a liar or a fool not to stumble upon a truth once in a while amidst an infinite number of lies: "Nobody records their flimflams and false prognostics, forasmuch as they are infinite and common; but if they chop upon one truth, that carries a mighty report, as being rare, incredible, and prodigious." Montaigne understood that the human mind is wired to remember the hits and forget the misses. The cherry picker relies on this memory gap.
The Cost of One-Sidedness
The consequences of this fallacy extend far beyond academic arguments. In medicine, the selective use of evidence can cost lives. A 2002 review of antidepressant efficacy trials revealed a disturbing pattern of cherry picking. Researchers analyzed 31 different clinical trials and found that the exclusion criteria used to determine who could participate in the studies were so strict that the patients in the trials represented only a minority of those treated in real-world clinical practice. By excluding patients with complex profiles—those with multiple conditions or different demographic characteristics—the studies created a sanitized version of reality where the drugs appeared more effective and had fewer side effects than they actually did. Generalizing these results to the broader patient population therefore lacked empirical support. The cherry-picked sample did not reflect the messy reality of human biology, yet the conclusions were used to guide treatment for millions.
This is what happens when rigorous science is abandoned for the sake of a desired outcome. Rigorous science demands that we look at all the evidence, not just the favorable data. It requires controlling for variables, using blinded observations to minimize bias, and employing internally consistent logic. P-hacking, a practice where researchers manipulate data analysis until they find a statistically significant result, is a modern, technical form of cherry picking. It is the digital equivalent of picking through the orchard until you find a single perfect cherry, then claiming the whole tree is perfect.
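The logic of p-hacking can be demonstrated with a short simulation. This sketch runs a thousand comparisons of pure noise, where no real effect exists anywhere, and counts how many cross a conventional significance threshold by chance (using |t| > 2 as a rough stand-in for p < 0.05). The p-hacker's move is then simply to publish only those lucky slices.

```python
import random
import statistics as stats

random.seed(1)

def t_like(a, b):
    """Crude two-sample t statistic (pooled variance, equal group sizes)."""
    n = len(a)
    sp = ((stats.variance(a) + stats.variance(b)) / 2) ** 0.5
    return (stats.mean(a) - stats.mean(b)) / (sp * (2 / n) ** 0.5)

# 1000 independent comparisons of pure noise: there is no real
# difference between "group a" and "group b" in any of them.
false_positives = 0
for _ in range(1000):
    a = [random.gauss(0, 1) for _ in range(30)]
    b = [random.gauss(0, 1) for _ in range(30)]
    if abs(t_like(a, b)) > 2:   # roughly the p < 0.05 threshold
        false_positives += 1

# About 5% of null comparisons cross the threshold by luck alone.
# Report only those, and noise looks like discovery.
print(f"'significant' results from pure noise: {false_positives} / 1000")
```

Roughly one comparison in twenty "succeeds" despite there being nothing to find, which is exactly why slicing data until something turns significant, then reporting only that slice, is cherry picking in statistical dress.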
Philosophy professor Peter Suber has argued that the one-sidedness fallacy does not necessarily make an argument invalid or unsound in a logical sense. The premises may be true. The conclusion may even be true. The fallacy lies in the persuasion. It consists of convincing the reader that we have said enough to justify a judgment when, in fact, we have only presented half the story. "If we have been one-sided, though, then we haven't yet said enough to justify a judgment," Suber wrote. "The arguments on the other side may be stronger than our own. We won't know until we examine them."
Suber's insight cuts to the heart of why this practice is so dangerous in public discourse. We often think that one-sidedness is desirable if the goal is simply to win an argument. If the goal is persuasion, then manipulating the evidence seems like a valid strategy. But this is a short-sighted victory. If your argument is one-sided, you are vulnerable. You are unprepared for the counter-argument because you have spent your energy ignoring it. In a courtroom, a lawyer who presents only one side of the evidence is likely to be blindsided by a strong counter-argument they did not anticipate. The lesson, Suber suggests, is to cultivate two-sidedness in our thinking. We must actively seek out the evidence that contradicts our position. We must resist the urge to truncate our own understanding. A job that requires you to ignore half the facts is a job that requires you to be stupid.
The Propaganda of the Stacked Deck
In the realm of politics and advocacy, this technique is elevated to an art form known as card stacking. This propaganda technique seeks to manipulate audience perception by emphasizing one side of an issue and repressing the other. It is commonly used by political candidates to discredit opponents and elevate themselves. The technique is simple: highlight the impressive picture of the candidate's achievements while omitting their failures. Focus on the words "travel" and "adventure" on an enlistment poster while placing the words "enlist for war" in tiny, barely legible print at the bottom. The goal is to create a reality where the audience believes they have seen the whole picture, when they have only seen a curated fragment.
The power of card stacking lies in its ability to exploit the cognitive load of the audience. Most people do not have the time or the energy to fact-check every claim they hear. They rely on heuristics, mental shortcuts that help them make quick decisions. A one-sided argument that is presented with confidence and repetition feels true. The absence of counter-evidence is interpreted as the absence of a counter-argument. This is why media bias is so potent. When a news outlet consistently reports on a specific set of facts while ignoring others, they are engaging in a slow-motion act of cherry picking that shapes the worldview of their audience.
The origin of the term "stacking the deck" reminds us of the magician's trick. The magician presents a deck that appears to be randomly shuffled, but every card is in a specific, predetermined order. The magician controls the outcome. In the same way, a propagandist controls the flow of information. They decide which facts are dealt to the public and which are kept in the pocket. Wherever a broad spectrum of information exists, appearances can be influenced by highlighting some facts and ignoring others. This is not just a tool of political candidates; it is used by advocacy groups, corporations, and even individuals in their daily lives to protect their egos and advance their agendas.
The Path to Integrity
The antidote to cherry picking is not just skepticism, but a disciplined commitment to intellectual honesty. It requires the humility to admit that our initial hypothesis might be wrong. It requires the discipline to seek out the "rotten cherries" in the orchard. When we encounter a claim, we must ask: What evidence is being presented? What evidence is missing? Who is the source, and what is their incentive to hide certain facts? We must look for the silence in the data, the missing voices in the conversation, and the unexamined assumptions in the argument.
In an age of information overload, where algorithms feed us content that reinforces our existing beliefs, the temptation to cherry pick is stronger than ever. We are constantly presented with curated feeds that show us only what we want to see. The orchard has been replaced by the filter bubble, and the picker is an algorithm designed to maximize engagement by confirming our biases. Breaking free from this requires a conscious effort to step outside the basket. We must actively seek out sources that challenge our views. We must read the studies that contradict our conclusions. We must listen to the arguments we find most annoying.
The story of Diagoras of Melos is not just an ancient anecdote; it is a timeless warning. It reminds us that the truth is rarely found in the survivors, the winners, or the perfect samples. The truth is found in the silence, in the drowned, in the failures, and in the messy, contradictory data that we are so eager to ignore. To understand the world as it is, we must be willing to look at the whole tree, not just the basket. We must be willing to accept that the orchard is full of both ripe and rotting fruit, and that our understanding of the harvest depends on our willingness to see it all.
The next time you hear a statistic that seems too perfect, a quote that seems too damning, or a story that seems too simple, pause. Ask yourself: What is being left out? Who is not speaking? Where are the votive gifts of the dead? The answer to those questions will often reveal the true shape of the argument, and perhaps, the true nature of the reality we are trying to understand. The cherry picker offers a basket of perfection, but the honest observer knows that perfection is often a lie. The truth is in the full harvest, with all its flaws, its contradictions, and its complexity. It is only by embracing that complexity that we can hope to arrive at a judgment that is not just persuasive, but true.