Freddie deBoer cuts through the noise of NFL draft season by exposing a dangerous intellectual trap: the belief that aggregating flawed opinions creates truth. While the sports media obsesses over whether a team "reached" based on a consensus board, deBoer argues that this metric is less a compass for talent and more a mirror of groupthink, where the "wisdom of crowds" collapses under the weight of shared biases and correlated data.
The Index Fund Illusion
DeBoer begins by dismantling the popular analogy that treats the consensus draft board like a safe stock market index fund. He writes, "Think of the consensus draft board as the index fund of the NFL Draft... you're not betting on any single scout's hot take; you're buying the whole market's collective opinion." On the surface, this seems prudent. Data suggests that averaging hundreds of analyst rankings does indeed predict where players will be selected better than any single individual could. However, deBoer quickly pivots to the fatal flaw in this logic: the market itself is broken.
He argues that treating the consensus board as an oracle for player success ignores the reality that the NFL draft is a "system that a) is inherently noisy and b) has a low basal hit rate." Unlike the stock market, which generally trends upward over time, the "market" of NFL draftees is a place where the median outcome is mediocre at best. As deBoer notes, "Refusing to deviate from a benchmark that's wrong roughly half the time isn't an act of discipline... you're just guaranteeing that you replicate the median outcome in a system where the median outcome is mediocre." This is a crucial distinction. In finance, an index fund works because the underlying assets have positive expected value. In the draft, each underlying asset is closer to an unfavorable coin flip. By the Law of Large Numbers, the average of many predictions will stabilize, but it stabilizes around a mean that is itself mediocre, and that mean says little about the specific outcome that matters: on-field performance.
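The contrast between the two "markets" can be made concrete with a quick simulation. The numbers below are illustrative assumptions, not figures from the piece, except the roughly 40% hit rate deBoer cites for first-round quarterbacks; the point is that averaging stabilizes both markets, but only one stabilizes around a positive outcome.

```python
import random

random.seed(42)
N = 100_000  # draws per "market"

# Stock-like asset: noisy, but with a positive expected return.
# (The 7% mean / 15% volatility figures are illustrative assumptions.)
stock_avg = sum(random.gauss(0.07, 0.15) for _ in range(N)) / N

# Draft-like asset: a Bernoulli hit at p = 0.40, the rough first-round
# quarterback hit rate the article cites.
draft_avg = sum(random.random() < 0.40 for _ in range(N)) / N

print(f"average stock return: {stock_avg:+.3f}")  # stabilizes near +0.07
print(f"draft hit rate:       {draft_avg:.3f}")   # stabilizes near 0.40
```

Both averages converge; the difference is what they converge to. Indexing works in the first market because the mean itself is positive. In the second, replicating the mean means replicating an outcome in which the majority of picks fail.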
"The consensus board is a prediction about how NFL teams will select prospects, which is a separate question from which prospects will succeed."
This separation of "selection order" from "player success" is the article's most vital insight. The consensus board tells you what the market thinks will happen, not what will happen. DeBoer points out that even the best data fails here: "First round quarterbacks (the most important picks of all) hit at roughly 35–45%... regardless of where you look, a majority fail to become winning starters." When the baseline for success is so low, adhering slavishly to the consensus doesn't protect a team; it guarantees the team will never outperform the league average. A counterargument might suggest that deviating from the consensus increases risk, but deBoer counters that in a low-hit-rate environment, the only way to find a star is to take a calculated swing, not to follow the herd.
The Collapse of the Wisdom of Crowds
The piece then turns its critical eye toward the theoretical underpinnings of the consensus model, specifically the idea of the "wisdom of crowds." DeBoer references James Surowiecki's 2004 book, noting that the theory relies on one non-negotiable condition: independence. "For the crowd to be wise, its estimates have to be drawn from genuinely separate information, separate reasoning processes, separate intuitions," deBoer writes. He argues that the modern draft ecosystem has destroyed this independence.
Instead of independent scouts working in silos, analysts are locked in a feedback loop, reading each other's mock drafts and reacting to the same combine metrics. DeBoer observes, "When estimates are correlated, when everyone is reading each other's mock drafts... aggregation no longer cancels error; it amplifies whatever shared bias the cluster carries." This is where the concept of groupthink becomes a structural failure rather than just a psychological quirk. The consensus board doesn't represent a hundred independent opinions; it represents a single opinion repeated a hundred times with slight variations.
This correlation hollows out Bayesian updating: each new mock draft or metric entering the system is neither independent nor particularly informative, so conditioning on it barely moves the estimate. DeBoer highlights that much of the data driving these rankings, such as combine scores for running backs, "correlate poorly with production despite driving major draft-capital decisions." The result is a system drowning in information but starving for insight. He concludes that "a hundred correlated boards is not a hundred independent samples but a bunch of closely-related projections derived from a lot of the same information."
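The correlated-estimates failure has a clean statistical shape: the variance of an average of n estimates with pairwise correlation ρ is σ²(ρ + (1 − ρ)/n), which floors at ρσ² no matter how many boards you aggregate. A minimal simulation (the scout counts and noise levels below are illustrative assumptions, not data from the piece) shows independent errors washing out under aggregation while a shared bias survives it:

```python
import random
import statistics

random.seed(0)

def crowd_error(n_scouts, shared_sd, private_sd, trials=5_000):
    """Standard deviation of the crowd-average error for one prospect.
    Each scout's error is one shared draw (the groupthink component
    every scout inherits) plus that scout's own private noise."""
    avgs = []
    for _ in range(trials):
        shared = random.gauss(0, shared_sd)  # same draw for every scout
        avg = sum(shared + random.gauss(0, private_sd)
                  for _ in range(n_scouts)) / n_scouts
        avgs.append(avg)
    return statistics.pstdev(avgs)

# Independent scouts: aggregation cancels error, shrinking like 1/sqrt(n).
print(crowd_error(1,   0.0, 1.0))   # ~1.00
print(crowd_error(100, 0.0, 1.0))   # ~0.10

# Correlated scouts: the shared bias never averages out.
print(crowd_error(1,   0.8, 1.0))   # ~1.28
print(crowd_error(100, 0.8, 1.0))   # ~0.81, floored at the shared-bias level
```

With independent errors, the n in the denominator does the work; once the shared component appears, it dominates and adding more boards stops buying accuracy. That is deBoer's "single opinion repeated a hundred times" in statistical form.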
"Telling investors not to pick and choose individual stocks is wise because the market generally and predictably goes up over time... But giving the same advice to NFL teams picking players is a much less sensible proposition, because the overall 'market' of NFL draftees has seen much less success over time."
DeBoer's critique of the "index fund" mentality is particularly sharp here. He suggests that if a general manager truly believes they have information the market lacks, sticking to the consensus is not just wrong; it's a failure of the job description. "If you feel very confident that you have information the other teams don't, you should take your swing," he asserts. The media's tendency to roast teams for "reaching" on a player who fell in the consensus rankings is, in his view, a misunderstanding of how value is created in an inefficient market.
Bottom Line
DeBoer's strongest contribution is exposing the statistical fallacy of treating the NFL draft like a stable financial market, where the "wisdom of crowds" is actually just the amplification of shared biases. The piece's vulnerability lies in its dismissal of the risk involved in deviating from the consensus; while the median outcome is mediocre, the cost of a bad "swing" pick can be catastrophic for a franchise's salary cap and competitiveness. However, the argument successfully reframes the debate: the goal shouldn't be to match the consensus board, but to beat the noise floor of a system where success is the exception, not the rule.