> Suppose that you know that in one hour, you’ll be a consequentialist

I would drive myself to the hospital and kill myself to donate my organs. This avoids both the 5 deaths and the infinitely worse consequence of me being a utilitarian.

More broadly, I flatly reject any hypothetical that requires assuming such radical shifts in personal mental states. It’s either incompatible with free will, which moots morality, or it requires mind-control shenanigans, which are plainly subject to different rules (and likely impossible).

> moral person wants you to do the wrong thing

You’re sneaking in an equivocation of terms. They are not making a moral judgment; they are evaluating world-states.

Would a perfectly moral person want you to make the world worse? That is the price of principle.

> Paralysis

No matter how much utilitarians fantasize about infinite future lives, you cannot violate the rights of someone who does not exist…

author

//I would drive myself to the hospital and kill myself to donate my organs. This avoids both the 5 deaths and the infinitely worse consequence of me being a utilitarian.//

This isn't an option.

//More broadly, I flatly reject any hypothetical that requires assuming such radical shifts in personal mental states. It’s either incompatible with free will, which moots morality, or it requires mind-control shenanigans, which are plainly subject to different rules (and likely impossible).//

But people do actually change their minds. There's nothing impossible about a device which predicts that you will change your mind before you actually do change your mind.

//You’re sneaking in equivocation of terms. They are not making a moral judgment, they are evaluating world-states.//

Preferring X is the same as preferring a world where X occurs. Upon finding out that they live in a world where you killed the person and harvested her organs, they are happy and relieved. Were this not the case, you could desire that something happen and also desire not to be in a world where it happened--would you then desire that it happen somewhere outside the world?

On paralysis, of course you can violate the rights of someone who doesn't exist. Suppose I set a bomb that will detonate in 150 years. I will violate the rights of someone who doesn't currently exist.

//But people do actually change their minds. There's nothing impossible about a device which predicts that you will change your mind before you actually do change your mind.//

This is an empirical question that I am not competent to answer. My view is that if such a machine existed and was perfectly accurate, free will would be false, and morality would thus be a moot question.

//Were this not the case, you could desire that something happen and also desire not to be in a world where it happened--would you then desire that it happen somewhere outside the world?//

The world is great; the process of getting there is no bueno. The state of the world is also irrelevant: the only moral question is what acts one should do. Honestly, you can probably avoid the paradox entirely by saying that "ranking worlds" is meaningless from a moral perspective. I would have to think about that...

//On paralysis, of course you can violate the rights of someone who doesn't exist. Suppose I set a bomb that will detonate in 150 years. I will violate the rights of someone who doesn't currently exist.//

And your bomb may very well harm no one. But if it did "harm" someone by, say, blowing them up, then it harmed people who do in fact exist at the time of the explosion. Nonexistent beings cannot be blown up. Nor would it harm the children of those blown-up people, children who now will never exist.

Most of these examples assume forms of deontology where there is more than one absolute non-overridable principle. Hierarchical or lexical forms of deontology don't necessarily suffer from the same problems.

author

Literally none of them assume that.

Consider your first argument.

"It seems very obvious that they shouldn’t diffuse their own bomb—they should instead diffuse the two others. But this is troubling—on the deontologist’s account, this is hard to make sense of. When choosing between diffusing their own bomb or two others, they are directly making a choice between making things such that they violate two people’s rights or violate another person’s rights."

That is completely unconvincing to a deontologist who believes the right to life overrides property rights in all cases.

author

It’s not the right to property being invoked--it’s a choice between you killing one and two others each killing one. Deontologists generally say that you oughtn’t kill one to prevent two killings.

"...or violate another person’s rights."

author

They're violating the right to life, in this case.
