Suppose you’re a deontologist. You think you shouldn’t kill one person to prevent multiple other killings. However, you have a really bad temper. When you lose your temper, you kill people. You don’t endorse this killing upon reflection — you just do it.
You have an identical twin who is in the same boat (this is a metaphorical boat — you are not both literally in a boat; this is not a lifeboat case).
Both you and your twin will kill five other people over the course of your lives. A genie appears and cuts you a deal. He says that if you kill a person, he’ll make it so that your twin won’t kill anyone — he’ll give your twin a great anger management drug that will prevent him from killing anyone. He makes the same offer to your twin. What should each of you do?
Well, it seems the straightforward deontological answer is that you shouldn’t take the deal. After all, you shouldn’t kill one person to prevent five killings by someone else. The fact that your twin can prevent your murders shouldn’t change that.
But then you get the result that the correct morality still generates prisoners’ dilemmas. Prisoners’ dilemmas typically arise because people are selfish; among perfectly benevolent agents, they shouldn’t arise. Yet here, two agents who each follow deontology perfectly end up in one: each refuses to kill, and both sets of five killings occur.
If you say that neither person should kill the one to prevent the other’s five killings, we can ratchet up the unintuitiveness. Suppose that the killing done to prevent the five other killings would be painless, while the killings done out of rage would be very painful. Then every person, including the “victim” of the preventive killing, would want you to kill them; after all, it would spare them a much more painful death and prevent the deaths of several others. Yet deontology holds that both people would act deeply wrongly by killing. Even though refusing leaves the moral character of both deontologists more sullied and everyone worse off, deontology still holds that they acted rightly in not killing the one. This is deeply implausible.
> Both you and your twin will kill five other people over the course of their lives.
This is false. I see no supporting argument for this premise.
> But then you get the result that the correct morality still generates prisoners’ dilemmas.
Makes no sense. You don’t define what such a dilemma is, and I don’t see one here in any event. You’re just saying it’s wrong to kill one to save five. The fact that you can also kill two people to save ten adds nothing. You two both cooperating doesn’t yield the best result; it still violates the rule against murder.
> Thus, every person, including the “victim” of the killing would want you to kill the person
I don’t know what victims you’re talking to, but I certainly wouldn’t want to die painlessly in that circumstance.
> Even though the moral character of both deontologists would be sullied more
I don’t know what moral character means. I do know that violating the rules is bad.