Discussion about this post

TheKoopaKing

One problem with sentientism as you've defined it is that it's in principle impossible to figure out which beings are sentient besides yourself. Sentient beings are supposed to have a phenomenal experience that's depsychologized - no amount of studying their brain, reactions, dispositions, etc. is ever going to produce a "Eureka!" moment where you've discovered that they are in fact sentient, because there's no conceptual link between phenomenal states and physical states. This means that epistemically, some version of phenomenal solipsism ends up being most likely to be true, since there's no way in principle to sort other putatively sentient beings into zombies (ones without phenomenal consciousness) and truly sentient beings (ones with phenomenal consciousness).

Also, making an inductive or inference-to-the-best-explanation appeal - that creatures with physical states like yours have phenomenal consciousness because you have phenomenal consciousness - would just be falling into a psychologizing bias: your psychologically identical zombie twin would have no phenomenal states, so there's in principle no similarity you can point to between yourself and other physically similar beings that ensures you both have phenomenal consciousness (besides phenomenal consciousness itself, which accepting the possibility of p-zombies rules out as having any knowable physical supervenience base).

Fwiw I think moral status should be granted to individuals on a behaviorist and pragmatic basis - if a being can cry, scream, fight back, produce reports of being in pain, or produce any other behavior that we find morally important, then they have moral status. It's also the morally safer attitude to take in case the phenomenal-consciousness-first approach narrows the moral domain down to one (yourself). (And even if it doesn't, I believe you have a prior commitment to consciousness being non-vague, meaning it "turns on" exactly at some point or another. This would obviously be a major concern for the phenomenal-consciousness-first approach - when does this happen, and how can you be sure it doesn't happen before the point you've identified?)

If any of these objections go through, the phenomenal-consciousness-first approach is going to end up falling back on something like the above behaviorist principle. Also, unless you think it's fine to be killed while you're sleeping (and thus having no phenomenal experience), you already make pragmatic appeals to other principles that constitute a being's moral worth.

Daniel Filan

It doesn't strike me as crazy to think that one relevant notion of welfare applies to things that have preferences, grounded out as tendencies to choose one thing over another. Under this sort of definition, it seems like non-sentient things could have welfare.

