> Suppose that you know that in one hour, you’ll be a consequentialist

I would drive myself to the hospital and kill myself to donate my organs. This avoids both the 5 deaths and the infinitely worse consequence of me being a utilitarian.

More broadly, I flatly reject any hypothetical that requires assuming such radical shifts in personal mental states. It's either incompatible with free will, which moots morality, or it requires mind-control shenanigans, which are plainly subject to different rules (and impossible besides).

> moral person wants you to do the wrong thing

You’re sneaking in an equivocation of terms. They are not making a moral judgment; they are evaluating world-states.

Would a perfectly moral person want you to make the world worse? That is the price of principle.

> Paralysis

No matter how much utilitarians fantasize about infinite future lives, you cannot violate the rights of someone who does not exist…

Most of these examples assume forms of deontology where there is more than one absolute non-overridable principle. Hierarchical or lexical forms of deontology don't necessarily suffer from the same problems.
