Huemer is, once again, concerned about utilitarianism’s distribution of cookies. He writes: “h. Excess altruism
John has a tasty cookie, which he can either eat or give to Sue. John knows that he likes cookies slightly more than Sue, so he would get slightly more pleasure out of it. Nevertheless, he altruistically gives the cookie to Sue. According to utilitarianism, this is immoral.”
Not necessarily. It’s conceivable, and true in the real world, that people are often made happy by other people helping them. Offering to hold the door open for someone is good not just because they’re less able to hold doors open themselves, but also because it signals a nice gesture. Giving Sue a cookie is similar. So for utilitarianism to yield the wrong verdict, Huemer would have to stipulate either that Sue gets no happiness from being given the cookie independent of its taste, or that John likes the cookie slightly more than Sue likes the combined experience of its taste and of being given a gift by John.
Additionally, immorality usually refers to failing to care about others, so semantically it’s slightly strange to call John’s act immoral, but it’s certainly bad: John should have done otherwise. Presumably we’d all agree that it would be morally neutral to give Sue the cookie if John and Sue liked cookies equally. But if Huemer thinks it would also be morally neutral to give Sue the cookie when Sue enjoys it more, then John’s obligation wouldn’t change at all as Sue comes to like the cookie more. This is implausible. Surely one’s obligation to give someone a cookie is sensitive to how much they like cookies.
A plausible theory of good and bad action should not imply that everyone acting well leaves everyone worse off. Yet suppose we stipulate that Sue also has a tasty cookie, and that, mirroring Huemer’s setup, each of them likes their own cookie slightly more than the other would. If John gives Sue his cookie and Sue gives John hers, everyone is worse off. This is bad.
To raise the stakes, suppose that instead of cookies they’re offering each other agony reduction devices. Each device can either reduce its owner’s agony by 2 units or the other person’s agony by 1 unit. Both John and Sue start in a state of 100 billion units of agony, each holding 50 billion devices. Additionally, suppose that when one is brutally tortured one is in 10,000 units of agony. In this case, it seems clear that John and Sue should both use the devices on themselves. If each instead uses all their devices on the other, they both end up in a state of 50 billion units of agony, which is 5 million times worse than horrific torture. So that seems, erm, not great.
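The arithmetic here is easy to check with a quick sketch (the numbers are taken from the example above; the helper function is purely illustrative):

```python
# Illustrative check of the agony-reduction arithmetic in the example above.
# Stipulated setup: each person starts with 100 billion units of agony and
# holds 50 billion devices; a device removes 2 units from the user's own
# agony, or 1 unit from the other person's.

START_AGONY = 100_000_000_000
DEVICES = 50_000_000_000
TORTURE = 10_000  # stipulated units of agony during brutal torture

def remaining_agony(units_per_device: int) -> int:
    """Agony left after spending every device at the given rate of relief."""
    return START_AGONY - DEVICES * units_per_device

selfish = remaining_agony(2)     # both use every device on themselves
altruistic = remaining_agony(1)  # both use every device on the other person

print(selfish)                 # 0
print(altruistic)              # 50_000_000_000
print(altruistic // TORTURE)   # 5_000_000 times the agony of torture
```

Mutual self-use eliminates the agony entirely, while mutual altruism leaves each of them with 50 billion units, the 5-million-torture figure quoted above.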
Huemer’s intuition here doesn’t seem very deep. The notion that it would be immoral to be excessively altruistic sounds semantically weird, but if we replace “immoral” with “bad,” it seems perfectly reasonable.
We can see this with another argument. If we accept:

1. Correct moral theories should inform agents of what they should do in all cases.
2. In Huemer’s case, John can only give the cookie to himself or to Sue.
3. Thus, correct moral theories should inform John of whether to give the cookie to himself or to Sue.
4. The correct moral theory would not inform John that he should give the cookie to Sue.

Therefore, the correct moral theory would inform John that he should give the cookie to himself.
Well, that wraps up Huemer’s objections pretty well. The next post will conclude. Thanks everyone for reading this far.