Huemer's Paradox of Deontology's More Threatening Twin
Deontology might leave everyone desolate and miserable
Huemer has proposed a paradox of deontology. The basic idea is this: if you reduce one person's suffering by 2 units at the cost of inflicting 1 unit of suffering on another, deontology says that's wrong. However, if you do that twice, to two different people, the combination of those acts reduces everyone's suffering and is clearly good. But this good outcome comes from two wrong actions—and if you do two wrong things, each of which is wrong even conditional on the other, the result shouldn't be something right.
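The arithmetic here can be made concrete with a minimal sketch (the starting suffering levels and the two-person setup are illustrative assumptions, not part of Huemer's formulation):

```python
# Two people, A and B, each start at an arbitrary suffering level.
suffering = {"A": 10, "B": 10}

def transfer(beneficiary, victim, s):
    # Reduce the beneficiary's suffering by 2 units at the cost of
    # inflicting 1 unit on the victim -- the act deontology forbids.
    s[beneficiary] -= 2
    s[victim] += 1

transfer("A", "B", suffering)  # first "wrong" act
transfer("B", "A", suffering)  # second "wrong" act

# After both acts, each person's suffering has dropped by 1 unit:
print(suffering)  # {'A': 9, 'B': 9}
```

Each act is individually forbidden, yet the pair leaves everyone strictly better off, which is the tension the paradox exploits.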
This argument is, I think, quite tricky to escape. The idea is not that the combination of actions is obviously or necessarily good; instead, it's that the combination is no different from a single action that combines their effects, which is clearly good—because, when taking an action, it is irrelevant whether we count it as one action or two.
But I think there’s another argument in this vicinity that is even more threatening. This argument may be enough to refute deontology single-handedly. The basic argument shows that rights violations for trivial benefits are sometimes fine.
Take the following example of theft. Suppose there are 1,000 aliens, each of whom has a stone. Each alien can steal its neighbor's stone, any number of times, to decrease its own suffering very slightly. The stone itself produces very minimal benefits—the primary benefit comes from the act of stealing it.
The aliens are in unimaginable agony—experiencing, each second, more suffering than has existed in human history. Each time they steal a stone, their suffering decreases only slightly, so they have to steal it 100^100 times in order to drop their suffering to zero. It's very obvious, in this case, that all of the aliens should steal the stones 100^100 times. If they all do that, rather than being in unimaginable agony, they won't be badly off at all.
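The aggregate arithmetic can be sketched directly (treating each theft as reducing the thief's suffering by exactly 1 unit is an assumption chosen so that 100^100 thefts bring the suffering to zero, matching the setup above):

```python
# Illustrative stand-ins for the alien case: suffering starts at an
# astronomical level, and each theft shaves off only 1 unit.
initial_suffering = 100 ** 100   # "unimaginable agony"
reduction_per_theft = 1          # trivial per-theft benefit
thefts = 100 ** 100              # thefts each alien performs

# Python handles these big integers exactly, so we can compute the
# aggregate effect without looping 100**100 times.
final_suffering = initial_suffering - reduction_per_theft * thefts
print(final_suffering)  # 0: repeated trivial "wrongs" eliminate the agony
```

No single theft makes a noticeable difference, yet the full sequence takes the aliens from astronomical agony to no suffering at all.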
The following seem true.
1. If deontology is true, it is wrong to steal for the sake of minimal personal benefits.
2. If it is wrong to steal for the sake of minimal personal benefits, it is wrong to steal repeatedly where each theft considered individually is for the sake of minimal personal benefits.
3. In the alien case, it is not wrong to steal repeatedly where each theft considered individually is for the sake of minimal personal benefits.
4. Therefore, deontology is false.
1 is obvious enough. 2 is also obvious—if a single act of this kind is wrong, then repeating it many times should be, if anything, more wrong, not right. 3 was described above—if the aliens don't steal repeatedly, they will remain in a state where they each experience more suffering per second than has existed in all of history.
This also generalizes to any case of a rights violation that can be performed repeatedly (e.g., the aliens could grab each other's legs without the others' knowledge or consent).
> Take the following example of theft. Suppose there are 1,000 aliens, each of whom has a stone. Each alien can steal its neighbor's stone, any number of times, to decrease its own suffering very slightly. The stone itself produces very minimal benefits—the primary benefit comes from the act of stealing it.
Well, presumably all the aliens are fine with "theft," since it prevents them from being in unimaginable agony. If there's one crazy alien, then the remaining 999 can steal among themselves 100^100 times.
If the aliens are not fine with theft, then they apparently want to be infinitely tortured. Maybe for them, the goodness of torture is as obvious as the goodness of pleasure is to you. Leave them be.
My only objection is to
> Therefore, deontology is false
I don't think you can call deontology true or false. There are particular deontological systems of morality that produce perverse results (e.g., saying never steal under any circumstances, even if your children are starving). But you can absolutely construct a deontological system that allows you to steal!
IMO this is kind of the point of deontology (as opposed to utilitarianism). Deontology says you can't build your morality around maximizing a single variable. Instead, you build a pluralistic system of rules, intuitions, and so on, to help you navigate moral situations.