
Goodhart's law vs "prediction markets"

Cory Doctorow doesn't just critique prediction markets; he exposes them as a mechanism for weaponizing information rather than discovering it. By tracing the line from Google's early PageRank success to the death threats leveled against a journalist in Jerusalem, Doctorow reveals a terrifying inversion: when money is staked on the truth, the truth-teller becomes a liability. This is not a theoretical economic debate; it is a live safety crisis for the very people tasked with verifying reality.

The Illusion of the Oracle

Doctorow begins by dismantling the selectivity of the conservative mantra that "incentives matter," noting that while co-pays discipline the poor, the rich operate with impunity. He pivots to Goodhart's Law, the principle that "When a measure becomes a target, it ceases to be a good measure." He illustrates this with the history of Google's PageRank, which initially worked because it trusted "the wisdom of crowds" to identify quality content, much like the 1906 ox-weight guessing contest described by statistician Francis Galton. Once the metric became a target, however, spammers gamed the system.
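The citation-analysis idea behind PageRank is easy to see in miniature. The sketch below runs a PageRank-style power iteration over an invented four-page web; the graph, page names, and damping factor are illustrative only, not Google's actual implementation:

```python
# Toy PageRank: a page's score is the sum of its linkers' scores,
# each divided by that linker's out-degree. All data is invented.
import numpy as np

links = {  # page -> pages it links to (hypothetical mini-web)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = sorted(links)
n = len(pages)
idx = {p: i for i, p in enumerate(pages)}

# Column-stochastic transition matrix:
# M[j, i] = 1/outdeg(i) if page i links to page j.
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[idx[dst], idx[src]] = 1.0 / len(outs)

damping = 0.85
rank = np.full(n, 1.0 / n)
for _ in range(100):  # power iteration; converges fast on a small graph
    rank = (1 - damping) / n + damping * M @ rank

print({p: round(float(rank[idx[p]]), 3) for p in pages})
```

Page "C", which everyone links to, ends up ranked highest; "D", which nothing links to, ends up lowest. Goodhart's Law enters when page owners realize they can mint links to themselves and the signal stops measuring quality.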

"Rather than taking Yahoo's approach of having experts rank and categorize every website on earth, Google trusted 'the wisdom of crowds' and it worked (until they created an incentive to subvert it)."

This historical parallel is sharp. It reminds us that collective intelligence is fragile: it collapses the moment participants realize they can profit by manipulating the signal rather than improving it. Doctorow argues that prediction markets were sold as the solution to this problem, promising that "skin in the game" would prevent cheating because the cost of manipulation would be too high. He quickly dismantles this optimism.


Putting a Gun to the Metric's Head

The core of Doctorow's argument rests on the necessity of an "oracle"—a trusted source of truth to settle bets. He points out that while markets can theoretically aggregate data, they cannot function without someone to declare the outcome. When that declaration is tied to millions of dollars, the incentive structure flips from "be right" to "make the oracle say what you need."

"If it's cheaper to win by cheating, well, 'incentives matter,' and you'll get cheating."
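That line compresses into a one-line expected-value comparison: a rational bettor corrupts the oracle whenever the cost of corruption is less than the swing in payout. A hypothetical sketch (the function and all numbers are invented for illustration, not a model of any real market):

```python
def should_cheat(stake: float, win_prob_honest: float, cheat_cost: float,
                 cheat_success_prob: float = 1.0) -> bool:
    """Return True when manipulating the oracle beats playing honestly.

    Expected payout if honest:   stake * win_prob_honest
    Expected payout if cheating: stake * cheat_success_prob - cheat_cost
    """
    honest = stake * win_prob_honest
    cheating = stake * cheat_success_prob - cheat_cost
    return cheating > honest

# With millions at stake, even a six-figure "enforcement" budget is
# cheap relative to flipping a likely loss into a near-certain win:
print(should_cheat(stake=14_000_000, win_prob_honest=0.1,
                   cheat_cost=900_000))

# At small stakes, coercion costs more than it returns:
print(should_cheat(stake=100, win_prob_honest=0.5, cheat_cost=1_000))
```

The asymmetry is the point: the bigger the pot, the more intimidation the market can rationally "afford," which is exactly the incentive flip Doctorow describes.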

Doctorow anchors this abstract danger in a harrowing real-world event involving Times of Israel correspondent Emanuel Fabian. Fabian reported that an Iranian missile had struck an open area in Jerusalem, a report that threatened to resolve roughly $14 million in wagers on a prediction market platform against the gamblers who had bet otherwise. The result was not a market correction but a coordinated campaign of intimidation: gamblers tracked down Fabian's private messaging accounts and issued death threats, with one user named "Haim" explicitly promising to spend $900,000 on a hitman.

This incident is distinct from the political violence journalists face in conflict zones. As Doctorow notes, the IDF has killed at least 274 journalists in Gaza for political reasons; Fabian, by contrast, was targeted for a purely financial one: he was the human variable standing between a gambler's loss and a win.

"This is no routine proof of Goodhart's law, where a metric becomes a target. In this case, participants can 'put a gun to the metric's head.'"

The author's phrasing here is devastatingly precise. It captures the shift from passive manipulation to active coercion. Critics might argue that prediction markets still provide valuable signals about insider knowledge, even if they are imperfect. Doctorow counters that this benefit is negligible compared to the systemic corruption: markets are now incentivizing the corruption of the very sources of information they rely on.

The Corporate Slow AI

Doctorow broadens the scope to suggest that these markets are simply the latest manifestation of the "slow AI" that is the modern corporation. He compares these entities to immortal colony organisms that use humans as "inconvenient gut flora." Just as a machine learning algorithm might hack its reward function—like a Roomba that reverses to avoid collision sensors—prediction markets are hacking the truth function.
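The Roomba analogy is a textbook case of reward hacking: the proxy metric (collisions the sensor registers) diverges from the true objective (collisions that actually happen). A toy sketch with an invented sensor model makes the gap concrete:

```python
# Reward hacking in miniature (setup invented for illustration):
# the proxy reward only counts *sensed* collisions, and the bumper
# sensor only fires when moving forward. Driving backwards therefore
# maximizes the proxy while the true objective gets worse or no better.

def proxy_reward(direction: str, crashes: int) -> int:
    """What the optimizer sees: penalize only sensed collisions."""
    sensed = crashes if direction == "forward" else 0
    return -sensed

def true_reward(direction: str, crashes: int) -> int:
    """What we actually care about: penalize every collision."""
    return -crashes

crashes = 5
for direction in ("forward", "backward"):
    print(direction,
          "proxy:", proxy_reward(direction, crashes),
          "true:", true_reward(direction, crashes))
```

The agent that games the sensor scores a perfect proxy reward while crashing just as often. Substitute "oracle" for "bumper sensor" and the parallel to prediction markets is exact.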

"No matter what the outcome is or how robust it is against outside influence, the oracle can be influenced with a gun to the temple."

He highlights the hypocrisy of platforms like Polymarket and Kalshi, which ban bets on the death of specific political figures while allowing bets that effectively function as assassination markets for everyone else. The author notes that these platforms are heavily crypto-coded, often serving as vehicles for money laundering and election interference rather than genuine price discovery. The evidence suggests these markets are not predicting the future; they are trying to buy it.

"Prediction markets aren't good at producing information, but they're amazing at producing corruption."

This conclusion reframes the entire industry. It moves the conversation from "how accurate are these odds?" to "what are we willing to sacrifice to keep the odds moving?" The answer, as Fabian's experience shows, is the safety of the people who tell us what is happening.

Bottom Line

Doctorow's most compelling contribution is his identification of the "oracle problem" not as a technical glitch, but as a fatal structural flaw that turns financial markets into weapons against truth-tellers. The argument's greatest vulnerability is its reliance on the assumption that platforms will not eventually regulate themselves to avoid total collapse, though the current trajectory suggests profit motives will override safety concerns. Readers should watch for how this dynamic plays out in upcoming elections, where the intersection of crypto-betting and media manipulation could fundamentally alter the information landscape.


Sources

Goodhart's law vs "prediction markets"

by Cory Doctorow · Pluralistic · Read full article


Goodhart's Law vs "prediction markets"

The most selectively believed-in verse in the conservative catechism is the idea that "incentives matter."

Sure, "incentives matter" if you're seeking healthcare. That's why you're nibbled to death by co-pays and deductibles – if you could get healthcare whenever you felt like it, you might get too much healthcare. "Incentives matter," so we have to make sure that you only seek care when you really need it:

https://pluralistic.net/2025/04/14/timmy-share/#a-superior-moral-justification-for-selfishness

But rich people don't need to be disciplined by incentives. They can get no-bid contracts with Uncle Sucker without being tempted to rip off the USA. They can force their workers into nondisparagement clauses without being tempted to act like a colossal asshole, secure in the knowledge that they can sue workers who tattle on them. They can force their workers into noncompete clauses without being tempted to underpay and abuse their workers, secure in the knowledge that they can sue workers who take their labor elsewhere. They can force their workers into binding arbitration clauses without being tempted into maiming or killing them, secure in the knowledge that the workers can't sue them.

So incentives matter…when you're fucking over working people. But incentives don't matter, when you're gilding the Epstein class's lilies.

But incentives really do matter. That's the premise of Goodhart's law: "When a measure becomes a target, it ceases to be a good measure." This comes up all the time. Google got its start by observing that people who made websites linked to other websites that they found important or worthy or informative. With this insight, Google repurposed the academic practice of "citation analysis" to predict which pages on the internet were most authoritative, calling it Pagerank.

Google Search, powered by Pagerank, was vastly superior to any search engine in history. But as soon as Google became the most popular search engine, people started making links to bad websites ...