The Self-Indication Assumption's Narrative
SIA says you should think, based on your existence, that there are more people. How do we make sense of that?
How much gold can you hold in an elephant's ear?
When it's noon on the moon, then what time is it here?
If you could count for a year, would you get to infinity
Or somewhere in that vicinity?
—Tom Lehrer, “That’s Mathematics”
The self-indication assumption (SIA) is the view in anthropics on which, given that you exist, you should think it likelier that there are more people like you. A notable case that differentiates SIA from various alternatives was presented by Joe Carlsmith:
God’s extreme coin toss: You wake up alone in a white room. There’s a message written on the wall: “I, God, tossed a fair coin. If it came up heads, I created one person in a room like this. If it came up tails, I created a million people, also in rooms like this.” What should your credence be that the coin landed heads?
SIAers think that, in this case, you should think the odds are a million to one that the coin came up tails, for in such a world it’s likelier that you, in particular, would come to exist. I’ve argued elsewhere that SIA is the correct way to reason about anthropics and that the alternatives have various grotesquely counterintuitive consequences. But there’s a big difference between seeing that SIA is right and seeing why it’s right. I’ve elsewhere written that SIA is just being a Bayesian about your existence, but that explanation was much too quick. So here I’ll describe in more detail how SIA thinks about probabilistic reasoning.
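To make the arithmetic explicit, here is a minimal sketch of the SIA update in Carlsmith’s case. The person counts come from the case itself; the rest is ordinary Bayesian conditioning, with each hypothesis weighted by how many people it creates who might be you.

```python
# SIA update for God's extreme coin toss.
# Prior: fair coin, so P(heads) = P(tails) = 1/2.
# SIA weights each hypothesis by the number of people it creates,
# since each extra person is one more chance that you, in particular, exist.

prior = {"heads": 0.5, "tails": 0.5}
people = {"heads": 1, "tails": 1_000_000}

# Unnormalized posterior: prior times the number of people who might be you.
unnormalized = {h: prior[h] * people[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: unnormalized[h] / total for h in unnormalized}

print(posterior["tails"] / posterior["heads"])  # 1000000.0: a million to one
print(posterior)  # heads is about 1e-06, tails about 0.999999
```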
Here’s one intuitive explanation of it: I know I have some particular set of experiences. If there were fewer people, then while there might be someone having the experiences I’m having, it’s less likely that my particular experiences would be had by me. If you think that, beyond facts about what is being experienced, there’s a further fact about which version of those experiences is being experienced and who it is that’s having them, then this makes quite a bit of sense. If there are more people, it’s likelier that any particular person would have some particular experience. Prior to looking at the world, I don’t know which of those qualitatively identical experiencers I am, so I should split my credence across the ones I might be. Then, upon finding out that I exist, I get evidence that more of them exist, because that makes it likelier that whichever one I am, in particular, would exist.
Here’s another way of thinking about it. A Bayesian reasons in the following way. First, she asks what her credence in the various hypotheses would be before considering the fact she’s updating on. Then she updates on that fact, by seeing how many times more likely it is on one hypothesis than on the others. If we apply this to one’s existence, we get the following. Prior to considering your existence, you have no reason to think any theory more likely, so you split your credence across the possible people with your experiences that you might be. Then you realize that you exist, so you update in favor of theories on which a greater share of possible people exist. But this just is SIA: prior to looking at the evidence, you’re not sure which of the possible people with your experiences you are, so you should have an equal prior in being each of them.
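Here is a toy version of that two-step reasoning. The numbers (ten possible people, two hypotheses, equal priors) are purely illustrative assumptions, not anything from the cases above:

```python
# The two-step reasoning above, in miniature. Suppose there are N = 10
# possible people with your exact experiences, and two hypotheses:
#   H1 says all 10 of them exist; H2 says only 3 of them do.

N = 10
exists_under = {"H1": 10, "H2": 3}
prior = {"H1": 0.5, "H2": 0.5}  # no reason yet to favor either theory

# Step 1: split credence evenly over the N possible people you might be,
# so P(I am person i) = 1/N for each i.
# Step 2: update on "I exist". Whichever possible person you are, the
# chance that that person exists under H is (people existing under H) / N.
likelihood = {h: exists_under[h] / N for h in exists_under}

unnormalized = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: unnormalized[h] / total for h in unnormalized}

print(posterior)  # H1 : H2 ends up 10 : 3, the SIA ratio of head counts
```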
The mistake SSAers (defenders of the rival self-sampling assumption, discussed more below) make is in assessing priors based on the evidence that you have. They reason as if they’re randomly selected from actual people when assigning priors. But that’s cheating and evil and duplicitous and an anti-Bayesian heresy condemned at the Council of Constantinople in 381 AD (oh Patrick). Your priors can’t be based on the evidence—they must be assigned before looking at the evidence, and so they can’t be responsive to facts about which people the world actually contains.
Here’s one way of thinking about it that makes good sense if you believe in souls—which I do—and that I think can be broadly incorporated into different stories of personal identity, though it gets trickier. You’re some particular soul. The theory that, say, 100 souls are created is 100 times as likely to result in your soul being created as the theory that just one is created. So from the fact that I exist, I should think there are more souls.
Here’s yet another way of thinking about it. It’s easy to see the broad story behind the following view of anthropics, which we might call the bare experience updating view. On this view, if you have some experience, you should treat it as if you received the following memo: “someone had these experiences,” followed by a detailed description of your experiences. From the fact that you’re having some experiences, you thus get evidence for theories that make it likelier that someone would have those experiences.
This theory, however, is fatally flawed. It violates a constraint called the conservation of evidence, according to which you shouldn’t be able to predict in advance that your credence in some proposition will go up after seeing a piece of evidence: in expectation, your credence should stay exactly where it is. So, for example, it was wrong for people trying witches to treat both sinking and swimming as evidence that the accused was a witch, because it meant that, in expectation, their credence in her witchhood was guaranteed to go up.
The view violates this constraint in a straightforward way. Suppose I’m considering two hypotheses, both of which predict that there are two people in a room. The first predicts that both people will see a red strip of tape when they look under the table, while the second predicts that one of the two will see a red strip and the other a blue strip.
Now suppose the people in the room haven’t looked at the tape yet. On the bare experience updating view, they should expect their credence in the second hypothesis to go up. If a person sees a blue strip, the second hypothesis is confirmed, since only it predicts that anyone sees blue. If she sees a red strip, her credence won’t change, because both theories predict someone seeing a red strip. So she should expect her credence in the second hypothesis to rise.
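A quick expected-credence computation makes the violation concrete. This is a sketch of the tape case, assuming the two hypotheses start at equal credence and that, under the second hypothesis, you’re equally likely to be the red-seer or the blue-seer:

```python
# Conservation of evidence, checked for the bare experience updating view
# in the tape case. Two hypotheses with equal prior credence:
#   H1: both people see a red strip.
#   H2: one person sees red, the other sees blue.
# On the bare view, your evidence is just "someone sees <this color>".

prior_h2 = 0.5

# If you see blue: only H2 predicts that anyone sees blue, so credence in
# H2 jumps to 1. If you see red: both hypotheses predict that someone sees
# red, so on this view you learn nothing and credence in H2 stays at 1/2.
posterior_if_blue = 1.0
posterior_if_red = 0.5

# Chance you see blue before looking: H2 must be true (prob 1/2) and you
# must be the blue-seer rather than the red-seer (prob 1/2).
p_blue = prior_h2 * 0.5   # 0.25
p_red = 1 - p_blue        # 0.75

expected_posterior = p_blue * posterior_if_blue + p_red * posterior_if_red
print(expected_posterior)  # 0.625 > 0.5: credence in H2 predictably rises
```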
To get around this, we should modify the view. Rather than treating the evidence as “someone has such-and-such experiences,” treat it as “the person who had the first of these experiences then had the second, and the third, and so on.”
But from here, there’s a very plausible way to extend the theory. Rather than just thinking that a greater share of possible experiences will be had by someone, think that a greater absolute number of experiences are had. This avoids implausible discontinuities, wherein one’s credence becomes very sensitive to minuscule changes in the content of the experience.
My claim is that the story told about why SIA makes sense is very much like the story about why the view just sketched out makes sense. The existence of more experiences in total increases the odds of any particular experience being had.
Here’s another way to think about it. Think of yourself like a random raspberry in modal space (as I say to myself three times every day in the mirror—I’m good enough, I’m smart enough, and I’m like a random raspberry in modal space). Imagine that there’s some raspberry that is such that if it exists you’d know about it. This is a fairly ordinary raspberry, out of the Beth 2 possible raspberries.
Upon finding out that the raspberry exists, you get evidence for theories on which there are more raspberries. For if there are more raspberries, it’s likelier that any particular one would exist. I claim that you are like a raspberry—if there are more people, it’s likelier that any particular one would exist, and if you do exist, you’d be the first to know. The following principle is plausible:
if x is a non-extraordinary member of class y, and x is such that if it exists you’d know about it, wherever it exists, and you do in fact know that x exists, then you have reason, ceteris paribus, to think that more members of class y exist.
A non-extraordinary member of a class is a member of a class that is reasonably standard and non-special. So God wouldn’t count as a non-extraordinary agent, but, say, a random guy named John, or you, or I would.
Let’s apply this principle. If there’s some possible raspberry which is such that if it exists you’ll know about it, wherever it is, and you do know about it, then all else equal you have reason to think there are more raspberries. And if we apply the principle to selves, and assume that you’re not special, we have reason to think that there are more people.
All of these explanations may seem a bit weird. They don’t to me, but I can see how they might to someone else. The problem for opponents of SIA is that, while these explanations might be a bit weird, there is no similarly plausible story in support of any rival theory of anthropics.
For example, you might adopt a view on which you don’t update on your existence at all. But that’s clearly crazy—it implies that the fact that you exist gives you no evidence that your parents didn’t use effective contraception in conceiving you (among other implausible things).
SSA, the self-sampling assumption, says that you should reason as if you’re randomly selected from the beings in your reference class, where your reference class is some group of beings similar enough to you. But there’s no principled basis for the reference class; SSAers basically just make it up to accommodate our anthropic intuitions. So simply as a story about what is going on in anthropic reasoning, prior to looking at specific cases, SIA is far more plausible than all its rivals.
I’m also suspicious because alternatives to SIA don’t really treat your existence as something you update on. They don’t treat the fact that you exist as a new piece of evidence that raises the likelihood of some theories over others. But this seems like an arbitrary exception to the general Bayesian maxim: update on everything and consider the total evidence.
Here’s one last way of putting the basic intuition that if there are more people, it’s likelier that any particular person would exist. Imagine that there are 5 possible people: A, B, C, D, and E. There are two theories you’re deciding between: the first says all 5 people will be created, and the second says only one of them, chosen at random, will be created. On non-anthropic grounds, the theories are equally likely. SIA prescribes that upon finding out that you exist, you should think the first theory is 5 times as likely as the second.
But this seems like the right way to go! On the second hypothesis, the odds that A would exist are 1/5, so upon realizing they exist, A should think the first hypothesis is 5 times as likely as the second. If you treat the fact that you exist as an ordinary piece of evidence, then theories that predict fewer people make it less likely that any particular person would exist, and are to that extent worse. If you keep running the experiment, 5/6 of the worlds that contain person A are ones where the first hypothesis came true.
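That 5/6 figure can be checked directly. Here is a small simulation sketch of “keep running the experiment,” under the assumptions of the example (a fair choice between the two hypotheses, and under the second a uniformly random choice of which person to create):

```python
import random

# Run the experiment from the example many times: a fair choice between
#   H1: all five of A, B, C, D, E are created;
#   H2: exactly one of the five, chosen at random, is created.
# Then restrict attention to runs in which person A exists and ask how
# often H1 was the hypothesis behind them.

trials = 1_000_000
a_runs = 0
h1_runs = 0

for _ in range(trials):
    h1 = random.random() < 0.5                  # equal non-anthropic priors
    a_exists = h1 or random.randrange(5) == 0   # H1 always creates A;
                                                # H2 creates A 1/5 of the time
    if a_exists:
        a_runs += 1
        h1_runs += h1

print(h1_runs / a_runs)  # about 0.833, i.e. 5/6 of A-containing worlds
```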
>I know I have some particular set of experiences. If there were fewer people, then while there might be someone having the experiences I’m having, it’s less likely that my particular experiences would be had by me.
Isn't this false in general, and thus in need of greater qualification? Suppose God flips a coin: if heads, he creates a thousand people and randomly assigns ten a red jacket and the rest blue jackets; if tails, he just creates one person with a red jacket. I notice my jacket is red. This is obviously going to be evidence for the tails outcome (since observing blue would be definitive evidence for the heads outcome), in spite of the fact that fewer people with my experience of seeing a red jacket exist on tails.
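To make the numbers in this reply explicit, here is a sketch of the jacket case, assuming that if multiple people exist you’re equally likely to be any of them. It shows both the within-world update the reply has in mind and what happens once the SIA-style existence update from the post is included:

```python
# The jacket case from the reply above. A fair coin is flipped:
#   Heads: 1000 people are created, 10 with red jackets, 990 with blue.
#   Tails: 1 person is created, with a red jacket.
# You find that your jacket is red.

prior = {"heads": 0.5, "tails": 0.5}
people = {"heads": 1000, "tails": 1}
# Chance that a randomly chosen person in each world wears red:
p_red = {"heads": 10 / 1000, "tails": 1.0}

# Without any existence update (reasoning as a random actual person):
# red favors tails by a factor of 100, as the reply suggests.
odds = {h: prior[h] * p_red[h] for h in prior}
print(odds["tails"] / odds["heads"])  # 100.0

# With the SIA existence update (first weight each world by how many
# people it contains): ten red-jacketed people exist on heads versus one
# on tails, so the final posterior still favors heads ten to one.
sia_odds = {h: prior[h] * people[h] * p_red[h] for h in prior}
print(sia_odds["heads"] / sia_odds["tails"])  # 10.0
```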
But your existence is a necessary fact. It is metaphysically impossible for you not to exist, because the act of calculating the probability of your own existence comes only after your own existence has been established.
This is entirely distinguishable from contraception, because contraception doesn’t affect the probability of whether you exist; it affects the probability of your being a flesh and blood human whose body was created by sexual reproduction. The chances of you specifically existing in some form are always 100%.
The appeal to your own experiences is also not relevant. If you exist, maybe you will have some arbitrary set of experiences, but since you’re guaranteed to exist, having some arbitrary set of experiences doesn’t predict anything. The chances of your having them are 100% either way.