
One problem with sentientism as you've defined it is that it's in principle impossible to figure out which beings are sentient besides yourself. Sentient beings are supposed to have a phenomenal experience that's depsychologized: no amount of studying their brains, reactions, dispositions, etc. is ever going to produce a "Eureka!" moment where you discover that they are in fact sentient, because there's no conceptual link between phenomenal states and physical states. This means that, epistemically, some version of phenomenal solipsism ends up being most likely to be true, since there's no way in principle to sort other putatively sentient beings (zombies) from truly sentient beings (ones with phenomenal consciousness).

Also, making an inductive or inference-to-the-best-explanation appeal (that creatures with physical states like yours have phenomenal consciousness because you have phenomenal consciousness) would just be falling into psychologizing bias: your psychologically identical zombie twin would have no phenomenal states, so there's in principle no similarity you can point to between yourself and other physically similar beings that ensures you both have phenomenal consciousness (besides phenomenal consciousness itself, which, if p-zombies are possible, has no knowable physical supervenience base).

Fwiw, I think moral status should be granted to individuals on a behaviorist and pragmatic basis: if a being can cry, scream, fight back, produce reports of being in pain, or produce any other behavior that we find morally important, then they have moral status. It's also the morally safer attitude to take, in case the phenomenal-consciousness-first approach narrows the moral domain down to one (yourself). (And even if it doesn't, I believe you have a prior commitment to consciousness being non-vague, meaning it "turns on" exactly at some point or another. That should be a major concern for the phenomenal-consciousness-first approach: when does this happen, and how can you be sure it doesn't happen before the point you've identified?)

If any of these objections go through, the phenomenal-consciousness-first approach is going to end up falling back on something like the above behaviorist principle. Also, unless you think it's fine for you to be killed while you're sleeping (and thus having no phenomenal experience), you already make pragmatic appeals to other principles that constitute a being's moral worth.


It doesn't strike me as crazy to think that one relevant notion of welfare applies to anything that has preferences, grounded out as tendencies to choose one thing over another. Under this sort of definition, it seems like non-sentient things could have welfare.


Is it intuitive that sentience is required for moral value? It seems to me that dumping large quantities of toxic waste on a planet full of plants would plausibly be immoral, unless there were strong justifying reasons (e.g. if we don't do it, the waste will harm a large number of sentient beings). Perhaps one might think that sentient beings matter MORE than non-sentient ones, but it strikes me as intuitive that non-sentient living things have some degree of moral worth.

Also, it's false that "in order to avoid sentientism, one has to have an utterly bizarre view of welfare." Perfectionist accounts don't imply sentientism, and they're both much older and much better than any of the three you listed :) There's also a fairly robust literature defending perfectionism; it's a minority view, but it's not like nobody prominent endorses it.

Also, about the "what if we found out babies and disabled people weren't technically human" example, it seems like one could say that given how human-like they are (such that we've been fooled into thinking they were human up until now), it's probable that whatever kind they belong to has the same moral properties as Homo sapiens. Thus, we should at least err on the side of caution and treat them as if they have full moral worth.

Of course, I know you're arguing against those who try to limit the moral sphere to human beings (which is obviously wrong), whereas I'm making the opposite point (that it ought to be expanded further). I just thought it was worth commenting.


Were you inspired by vegan bivalve discourse?
