
This Paradox Splits Smart People 50/50

By Derek Muller, Veritasium

A strange problem has been dividing people for decades. It appears at philosophy conferences, in economics departments, and in casual conversations between friends who suddenly become enemies. The setup sounds simple: you walk into a room and find two boxes on a table. One contains $1,000 — you're certain about that. The other is a mystery box. A supercomputer that has never been wrong tells you it predicted whether you'd take just the mystery box or both boxes before you entered the room. If it predicted you'd choose only the mystery box, it placed $1 million inside. If it predicted you'd grab both boxes, it left the mystery box empty.

This is Newcomb's Paradox, named for the physicist William Newcomb, and it's one of the few problems that splits professional philosophers roughly down the middle.

Your choice reveals something fundamental about how you think about decision-making.

Two camps exist. Those who take only the mystery box argue that since the computer was correct almost every time, there's overwhelming evidence it predicted correctly this time too. Taking just the mystery box means walking away with $1 million. Those who take both boxes argue that whatever choice you make now doesn't change what happened before you entered the room — the money is already either in there or not. Taking both boxes guarantees at least $1,000 and potentially much more.

The first group uses what's called evidential decision theory. Because the computer's accuracy has been demonstrated across thousands of cases, they treat their choice as strong evidence about whether a million dollars sits waiting in the mystery box. The expected-utility calculation favors one-boxing (taking only the mystery box) whenever the computer's accuracy exceeds just over 50% — precisely 50.05% with these payoffs. Given the track record described, that's clearly the case.
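The expected-utility comparison is short enough to check directly. A minimal sketch, using the payoff figures from the setup (the exact crossover point is implied by those figures rather than stated in the article):

```python
def one_box_ev(p):
    """Expected winnings from taking only the mystery box,
    given the predictor is correct with probability p."""
    # With probability p the predictor foresaw one-boxing and put in $1M.
    return p * 1_000_000

def two_box_ev(p):
    """Expected winnings from taking both boxes."""
    # The $1,000 is guaranteed; the million is present only when the
    # predictor wrongly foresaw one-boxing (probability 1 - p).
    return 1_000 + (1 - p) * 1_000_000

# The strategies break even at p = 0.5005; above that, one-boxing wins.
for p in (0.50, 0.5005, 0.99):
    print(f"p={p}: one-box ${one_box_ev(p):,.0f}, two-box ${two_box_ev(p):,.0f}")
```

At 50% accuracy two-boxing is still ahead by its guaranteed $1,000; at anything resembling the track record in the story, one-boxing wins by a wide margin.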

The second group uses causal decision theory. They reason that nothing about what happens now can retroactively change what was already predicted. The boxes were set up before you learned about the problem. Your current decision doesn't influence the past. Taking both boxes always gives you at least $1,000 plus whatever might be in the mystery box — making it the dominant strategy regardless of what the computer predicted.
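The dominance argument can be written out as a payoff table. A small sketch (the dollar figures come from the setup; the table layout is mine):

```python
# Total winnings for each (state of the mystery box, choice) pair.
payoffs = {
    ("empty",   "one-box"):         0,
    ("empty",   "two-box"):     1_000,
    ("million", "one-box"): 1_000_000,
    ("million", "two-box"): 1_001_000,
}

# Whatever the box already contains, two-boxing earns exactly $1,000 more.
for state in ("empty", "million"):
    gap = payoffs[(state, "two-box")] - payoffs[(state, "one-box")]
    print(f"mystery box {state}: two-boxing gains ${gap:,}")
```

That row-by-row comparison is what "dominant strategy" means here: two-boxing is better in both possible states of the world, so long as your choice cannot change which state you are in.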

Here's where it gets uncomfortable: if a perfect predictor exists, does that mean free will doesn't exist? If the computer's prediction was already fixed before you made any choice, then nothing you do in the moment changes anything. The future was already determined.

Some philosophers argue this reveals that free will is actually an illusion, one we live by as though it were real anyway. Others push back, saying the paradox assumes a near-perfect predictor, which doesn't exist in practice. If such a being existed with 100% accuracy, the debate would arguably be settled in favor of taking only the mystery box. But that's an unrealistic assumption.
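The objection about imperfect predictors can be probed numerically. The sketch below assumes a hypothetical 90% accurate predictor (that figure is mine, not from the article) and compares committed one-boxers against committed two-boxers over many rounds:

```python
import random

def play(choice, accuracy, rng):
    """One round against an imperfect predictor: it guesses the player's
    choice correctly with the given probability, then fills the boxes."""
    other = "two-box" if choice == "one-box" else "one-box"
    predicted = choice if rng.random() < accuracy else other
    mystery = 1_000_000 if predicted == "one-box" else 0
    return mystery if choice == "one-box" else mystery + 1_000

rng = random.Random(0)  # fixed seed so the run is reproducible
trials = 100_000
one = sum(play("one-box", 0.9, rng) for _ in range(trials)) / trials
two = sum(play("two-box", 0.9, rng) for _ in range(trials)) / trials
# One-boxers average near $900,000; two-boxers near $101,000.
print(f"one-box mean ${one:,.0f}, two-box mean ${two:,.0f}")
```

Even well short of perfection, the population that one-boxes ends up far richer, which is exactly the tension the causal camp has to live with.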

The strongest version of this argument comes from Allan Gibbard and William Harper's 1978 paper. They argue the rational choice is to take both boxes, while conceding that one-boxers predictably end up richer. On this view the game is simply rigged against causal rationality: it rewards those who leave the guaranteed $1,000 on the table.

Both approaches are mathematically sound, which is what makes this so maddening.

The paradox doesn't have a clean answer because it exposes a genuine split in how people understand causation and evidence. Some see their choices as influencing future outcomes. Others see their choices as powerless against predictions already made. Neither group is wrong — they're just applying different decision theories to the same problem.

What Newcomb's Paradox reveals is that rationality isn't a single path. Two perfectly rational people can look at identical information and walk away with opposing answers. The problem isn't that one side is smarter than the other. It's that they fundamentally disagree about whether their decisions can change what's already been predicted.

Transcript

There is a problem that I can't bring up without starting a fight. >> No. What? >> It just seems so obvious to me.

>> Now I'm all screwed up, man. >> It has infiltrated every single Veritasium meeting in the last 2 months. >> It's trivial. >> I didn't think you would fall for this side.

>> Just makes sense. >> Let's go. >> Creepy. >> And I even argued with Derek about it.

There's no way. You're trying to convince me. I don't care. >> So, here's the setup.

You walk into a room and there's a supercomputer and two boxes on the table. One box is open and it's got $1,000 in it. There's no trick. You know it's $1,000.

The other box is a mystery box. You can't see inside. You also know that this supercomputer is very good at predicting people. It has correctly predicted the choices of thousands of people in the exact problem you're about to face.

Now, you don't know what that problem is yet, but you do know that it has been correct almost every time. The supercomputer says you can either take both boxes, that is, the mystery box and the $1,000, or you can just take the mystery box. So, what's in that mystery box? Well, the supercomputer tells you that before you walked into the room, it made a prediction about your choice.

If the supercomputer predicted you would just take the mystery box and you'd leave the $1,000 on the table, well, then it put a million dollars into the mystery box. But if the supercomputer predicted that you would take both boxes, then it put nothing in the mystery box. The supercomputer made its prediction before you knew about the problem. And it has already set up the boxes.

It's not trying to trick you. It's not trying to deprive you of any money. Its only goal is to make the correct prediction. So what do you do?

Do you take both boxes or do you just take the mystery box? Don't worry about how the supercomputer is making its prediction. Instead of a computer, you could think of it as a super intelligent alien, a cunning demon, or even a team of the world's best psychologists. It really doesn't matter who or what is making the prediction.

All you need to know is that ...