Public discussion of Pascal’s wager and its consequences has been a disaster for the human race.
Pascal’s wager, for those who don’t know, is the notion that one should act as if the religion one has the highest credence in is true, because believing the right religion might get you into heaven; believing the right religion thus has infinite expected value. Pascal’s mugging, in contrast, is the idea that you should give someone your wallet if he promises to give you infinite utility in exchange, because any chance he’s telling the truth has infinite expected value.
What has had bad consequences is not really Pascal’s wager itself so much as people’s takeaways from it. I think it might be reasonable to wager on the religion with the highest probability, depending on how likely one thinks that religion is and how valuable it is to affirm that religion. I don’t take the wager because I don’t think that affirming a major religion is actually the best way to bring about infinite value, but I don’t think it’s crazy for people with different credences to take the wager. I wouldn’t accept the mugging for similar reasons.
Now, it’s bad enough that people’s takeaway from the wager is usually that they can simply ignore low probabilities—a view I think is definitely false. Rather than more soberly considering the actual utility of wagering on a religion and giving a mugger your wallet, people react by abandoning expected utility theory, despite the amazingly strong arguments in its favor. But even that I can live with.
The truly irritating takeaway that some people have from Pascal’s wager is that one can simply ignore inconvenient conclusions so long as big numbers are involved.
When I posted on LessWrong about the case for caring about insects, lots of people rushed to declare it a Pascal’s mugging. I made the argument that even if you have a 10% credence in insects feeling intense pain, this still means that ~100% of expected intense pain in the world is experienced by insects. Huge numbers of people treated that argument as a Pascal’s mugging.
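The arithmetic behind that ~100% figure is worth making explicit. A minimal back-of-envelope sketch follows; the insect count and 10% credence come from the post, while the human population figure and the generous assumption that every human counts as an intense-pain experiencer with certainty are my own placeholders:

```python
# Back-of-envelope expected-value calculation behind the insect-pain claim.
# Assumed placeholder figures: ~10^18 insects (cited in the post),
# ~8 * 10^9 humans, and humans granted certainty of intense pain
# to make the comparison as conservative as possible.

n_insects = 1e18        # rough global insect count (from the post)
n_humans = 8e9          # rough global human population (assumption)
p_insect_pain = 0.10    # 10% credence that insects feel intense pain
p_human_pain = 1.0      # grant humans certainty, to be conservative

expected_insect = p_insect_pain * n_insects  # expected insect sufferers
expected_human = p_human_pain * n_humans     # expected human sufferers

share = expected_insect / (expected_insect + expected_human)
print(f"{share:.6%}")  # insects' share of expected intense pain: ~99.99999%
```

Even after discounting insects by a factor of ten and counting every human, the expected insect sufferers outnumber the expected human sufferers by roughly ten million to one, which is why the expected share rounds to ~100%.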
Sorry, no. That is ridiculous.
The takeaway of Pascal’s mugging—if that is the right takeaway, which I reject—is that you get to ignore really low probabilities. You don’t have to reorient your life toward maximizing the 1-in-10-trillion odds of infinite value. Really low probabilities get to be ignored.
But it cannot be that any risk below 50% is ignored! It can’t be that the lesson of Pascal’s mugging is that Russian roulette is fine, so long as the gun has more than 10 chambers. It can’t be that the lesson is that a 1/10 chance of horrors beyond comprehension, orders of magnitude worse in terms of raw agony than all human suffering so far in history, is not a big deal. Not every argument where big numbers are involved is a Pascal’s mugging.
With Pascal’s mugging, the utility is theoretical—you don’t know if it exists. We know that there are ~10^18 insects. We should have a non-trivial credence that when they thrash around and starve themselves to death to avoid having to ingest capsaicin, they are in intense pain. It’s not a Pascal’s mugging to suggest that you shouldn’t boil lobsters, because the odds are at least 1% that they experience suffering at least 10% as great as you would if you were boiled alive.
Now, maybe it would be a rightful Pascal’s mugging if you were astronomically certain—99.99%+—that insects were not intensely conscious. But why in the world would you be certain about that? Consciousness is mysterious. We don’t know much about it. Animals certainly display very dramatic behavior in response to external harm. Why, aside from vibes based on how they look, would anyone have this kind of amazing confidence? Especially when serious attempts to estimate their sentience tend to conclude that, in expectation, it’s within an order of magnitude or two of the intensity of ours.
I remember also once hearing someone suggest that we shouldn’t work on AI risk because it was a Pascal’s wager. Any individual person has a low chance of preventing AI extinction. But surely it can’t be that no one should work on important problems because their odds of making a difference are low. There are serious questions regarding how one should treat low risks, but it won’t do to simply ignore problems where you personally are unlikely to make a difference—even if collectively people have a serious impact.
People are also much likelier to declare conclusions Pascal’s wagers when they have inconvenient implications. I’ve never heard anyone call it a wager to say that the Ukraine war is bad partly because it poses a risk of nuclear war. It’s only when a conclusion requires that someone change their life in some way that it becomes a wager—even if the allegedly discountable risk is something like 10%.
Anytime you find yourself dismissing a risk that’s above, say, 1/10,000 because it’s a Pascal’s mugging, you should seriously reconsider your life decisions.
I think expected utility theory is sound. I’m ashamed to admit I just choose to selectively ignore it when tiny probabilities of great value suggest I do something really unintuitive. I do this because I’m unreasonable.
For example, I find Pascal’s Wager very compelling and think I should profess my belief in a god on the off chance he exists and cares about what I think. Why not? But I’d feel like such a phony, and would be so badly misrepresenting myself that I just can’t. So I admit, by not accepting god on spec, I’m simply being unreasonable.
But I think fanatics (people who say they adhere to tiny probability/great value EUT) also selectively ignore it. They just won’t admit it.
For instance, if you believe there’s even a slim chance that all people go to heaven when they die (a perfectly good place of infinite happiness), and you concede that life on earth is brutish and imperfect, then it is in humanity’s overwhelming best interest, from an expected value standpoint, to painlessly exterminate ourselves.
I admit this is a strong argument from an EUT perspective. But I choose to be unreasonable and reject it. I think most fanatics reject it, too (they’re still with us, after all), but they won’t admit to their unreasonableness in doing so.
Lol liking this before reading it because the whole post could just be this title; it’s such a hilarious move people make.