What is the most minor pain you would consider to affect a person's quality of life, unless you are an extremely unusual person? It's definitely going to be something that can be described as minor, even if it's not literally a dust speck. This is especially true because your argument requires these pains to count for literally zero; otherwise a sufficiently large number of them will overwhelm any larger pain, unless you believe there is some cut-off above which a pain is qualitatively so bad that even a huge number of pains just below the cut-off will not be worse than it. Non-linear effects will not help you, because they just increase the number of minor pains you need to make something worse than torture; they don't get rid of the basic issue that a sufficiently large number will still be worse than torture.
OK then, would you rather be raped in 100 years or deal with a weekly dust speck in the eye? In a thousand or a billion years? Proximity to today is not meant to be the point. The point is to put the choice in the hands of the same person who would experience the torture or the dust.
Infinity, by the way, is useful in math but not so useful in philosophy. It should be banned from these thought experiments because we can't think it.
I think in value comparisons, when you say something is infinite, it effectively means that even a minuscule probability of it will outweigh a certainty of something infinitely smaller; that's effectively what introducing infinity into the calculation means. Assuming you meant comparing being raped in a hundred years rather than being raped for a hundred years: then for six hours of it, somewhere in the range of 6 to 12 orders of magnitude more time experiencing dust specks would probably be enough to make the dust specks worse. The basic problem with your thought experiment is that it's actually a less counterintuitive restatement of the original bullet, so nobody willing to bite the original bullet would hesitate at the new one. The best argument for your position is that the dust specks would have literally zero effect on your quality of life, which I admit is not crazy, and of course moral uncertainty makes your position stronger.
All these arguments take as given that pain is a quantity, like area or mass, and that it is additive: if you break the pain into parts (as measured by time or temperature or whatever), then each part has an amount of pain, and these sum to the total of the original. There's an implicit assumption of a mathematical model here. Your dismissal of the convergent series also implicitly assumes a mathematical model: that the total pain of a group of people is the sum of their individual pains. The counterintuitive consequences suggest these models are wrong. Zeno made similar arguments two and a half thousand years ago, and nobody pays any attention to them any more (except perhaps philosophers).
Yes. I wouldn't even say it's comparing degrees of pain. A dust speck may elicit mild discomfort, or even just an awareness that it was there for an instant and now isn't. Torture is not only pain, but psychological impact: indeed, not all tortures are physically painful, at least in every instance.
I'd encourage anyone to whom the maths model seems reasonable to spend some time reading about real torture cases. This isn't abstract: it's something that is being done to people now, which resources exist to prevent, and which organisations work to challenge. Isolated dust specks aren't.
Good article. Another thing related to torture vs dust specks is the suffering of invertebrates (like insects and shrimp) vs vertebrates (like mammals).
Some people I have spoken to say that we shouldn't waste our money on the Shrimp Welfare Project because they believe that no amount of mild suffering can be worse than extreme suffering. Those people are completely wrong.
Firstly, humans have biases that cause us to undervalue invertebrate suffering (it's inconvenient to include them in our moral calculus, and they are small and weird-looking). The idea that invertebrates cannot experience intense suffering is probably wrong. Also, your arguments show that enough mild pains can add up to be worse than extreme pain.
A very impressive presentation. Thank you for sharing. Question: what kind of philosophical training did you receive in school? I wish I had the same. Kindest regards from a soon to be 73-year-old who never tires of learning and is grateful for presentations like these.
So I probably agree with the thrust of your argument but I think it has a few holes.
First of all, I don't think a continuous chain of imperceptibly lower suffering is mathematically possible, fwiw. I think you can probably still rescue your argument without the "imperceptible" modifier, though it's tricky.
Also Eliezer's original post was on 3^^^3 dust specks. Literal infinities break everything so not having a good answer to infinite ethics doesn't strike me as a major knock-down on any moral theory or intuition.
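Since 3^^^3 keeps coming up, here's a minimal sketch of the Knuth up-arrow notation it uses (the recursion is standard; the function name is my own):

```python
def up_arrow(a: int, n: int, b: int) -> int:
    """a (up-arrow^n) b: n=1 is exponentiation; higher n iterates the level below."""
    if n == 1:
        return a ** b
    result = 1
    for _ in range(b):
        result = up_arrow(a, n - 1, result)
    return result

assert up_arrow(3, 1, 3) == 27           # 3^3
assert up_arrow(3, 2, 3) == 3 ** 27      # 3^^3 = 3^(3^3) = 7,625,597,484,987
# 3^^^3 = 3^^(3^^3): a tower of 7.6 trillion 3s, far too large to compute,
# which is exactly the point of the thought experiment.
```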
I don't think this is correct. Suppose you are asked how much pain you're in on a 1-10 scale. If you start at 10, at some point the number goes from 10 to 9. Do you disagree?
Yeah but my guess is the first point where you say 9 is not perceptibly different from the point where you said 10. My guess is similarly that if sampled independently, there are points with intensity below the 9 threshold where you'd say 10 again.
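A toy model of that non-transitivity, assuming "perceptibly different" means exceeding a fixed just-noticeable difference (the JND value and step sizes here are made up):

```python
JND = 0.5  # assumed just-noticeable difference on a 0-10 pain scale

# A chain of intensities from 10.0 down to 5.1 in tiny steps.
intensities = [10 - 0.1 * i for i in range(50)]

# No adjacent pair is perceptibly different...
assert all(abs(a - b) < JND for a, b in zip(intensities, intensities[1:]))

# ...yet the endpoints clearly are: "imperceptibly different" chains
# can still connect clearly different states.
assert abs(intensities[0] - intensities[-1]) >= JND
```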
The example of the boiling water is interesting because if you keep going, then the water will simply stop being uncomfortable at all (you will reach 27 or something) and yes, the graph will reverse.
It's not that pain is never transitive or that the graphs always look like walls to infinity. It’s simply that there are no general graphs and the pain is sometimes intransitive.
Of course, there are many cases where those rules apply. But if you ask a parent to choose between killing his first or second born child, and otherwise (if no choice is made) you will kill both, then everyone might react differently. Some might actually choose one. Some might be unable to choose. Some people might prefer dying with their loved ones. Some might have survivor guilt. There is no answer to the question of “is more pain worse or not” because it’s not always linear and additive. 2 people suffering might be unpredictably better than one (unpredictably being the key word there).
The same person might even change their mind based on the choice they make. If a loved one is suffering horribly from a terminal disease, then many people will feel this way: While he’s alive, you should do everything you can to save him. But once he passed away, then you can be happy that he stopped suffering. This seems like a contradiction but it’s simply a case where trying to get the optimal outcome changes the optimal outcome.
At the end of the day, if you give humanity today a choice to either torture a baby (A) or have everyone get a speck in the eye (B), then we all know what's going to happen. People will feel horrible if A is chosen, and the world would be worse than if B is chosen, in which case people will rejoice at saving a baby for what is, in this case, a small cost.
Yeah at some point the water is pleasant so it's not bad. But so what? It's implausible that there are two painful states where one is slightly less painful than the other, but the first is worse than any number of the second.
Yes I agree most people think torture is worse than a dust speck. I think they're wrong!
I may have been unclear in my main point. I don’t think torture is any number of times worse than a speck. I don’t agree with the assumption (it is one) that utility can be a number on the real line (which will always make it comparable to others). That’s what the thought experiments show.
To BB, I know you disagree! But the point here is that Utilitarianism leads to a contradiction due to the above assumption. Because when everyone feels worse after the (supposedly) correct utilitarian choice, the same utilitarian reasoning implies that it wasn't the right choice.
You could always make your utilitarian function more complex, integrating more incentives, but at the end of the day you are just using intuitive morals as the baseline and not the Utilitarian paradigm.
I think that rather than this being proof that dust specks are worse than torture, it’s an illustration of the folly of trying to construct perfectly consistent logical principles that one can flawlessly apply in every situation.
I’ll admit that any principle I could endorse could probably lead me to accepting something absurd, but I’m not actually forced to follow them there. A little inconsistency is a small price to pay to not have to torture someone.
I think this is another example of the misuse of thought experiments. There aren't ways of putting dust in the eyes of a billion people whilst also deciding to torture someone. We're being asked to trade off things that aren't real. Real choices (do I swerve my car to the left or right, risking injuries versus death) are hard enough. Stretching beyond intuition doesn't help.
Never understood this argument, always seemed extremely fanatical.
I like cute dogs. I want there to be more cute dogs with happy lives in the universe. Is this mathematically justifiable under util? No. I just want it.
I would rather have a bunch of cute happy dogs existing than tile the same amount of space with hyper-efficient pleasure-generating-cells. Similarly, I really despise the idea of one person being literally tortured to avoid a much less significant pain inflicted on many others.
Must we all accept utilitarianism until we bite the bullet on claims like this? I certainly do not. Seems like a complete denial of your identity and complex preferences.
But what you described doesn't bear any resemblance to any of the arguments I gave in my piece. None of the premises were that utilitarianism is correct.
Does this sort of quantitative analysis of a moral question not imply total acceptance of utilitarianism (to an essentially fanatical degree)?
To be clear, I am a utilitarian in the way Eliezer put it - 3/4 of the way there. I think it's totally fine to be like: torture is horrible and no mental gymnastics will make me believe otherwise.
No, it doesn't. There are a lot of non-utilitarians who accept this result, as I said in the piece. Thinking "some number of dust specks is worse than one person being tortured" does not imply "the action you have most reason to take is the one that maximizes total utility."
You can't just assume any argument with a conclusion you don't like that is made by a utilitarian assumes utilitarianism as a premise.
I'm not too familiar with consensus definitions on philosophical terminology, so I won't really engage with that line of argument.
What I have an issue with is the confidence with which you hold these claims - "Here, I’ll explain why the judgment that a bunch of dust specks are worse than a torture is simply correct."
If I believe I should listen to my moral intuition, and one of the most core preferences within it is that no single individual (consciousness/soul/whatever) should be subject to torture to avoid dust specks in a bunch of people's eyes, then that is what I believe. Maybe this preference is overly precise and falls apart when you try to operationalize and scale it – but I'm not interested in doing that.
Your arguments are good and interesting, I just don't like the whimsical overconfidence of the title and some other sentences. It probably boosts your engagement, but maybe put a disclaimer for claims that are epistemologically imprecise / lack context / you wouldn't directly endorse in a serious discussion.
I reject 1 — mild pain is not bad. Even moderate pain is not bad. It has 0 moral weight. In many cases, mild or moderate pain can even be good: it can strengthen us, help us build resilience. This is of course much less likely to be the case with extreme pain, which is a categorically different thing, not just a difference in intensity.
Yes, the mild discomfort of the dust spec is serving the evolutionary purpose of avoiding damage to the eye.
The torture described here is inescapable pain, which doesn't serve the individual organism experiencing it at all.
I don't think it's intrinsically bad that it hurts to touch something too hot. I do think it's bad to be boiled alive, and that entirely deliberately by another thinking being, as described! Speaking of which, a relative of mine accidentally poured boiling water on herself recently after eye surgery: pretty bad, I feel very sympathetic. The pain at least served the purpose of telling her to stop ASAP, although it's still not good. But, it's healing (fortunately, could have been worse), and she is being extremely careful now, having realised she underestimated how much she needed to compensate for her vision. Would be preferable for it not to be such a painful and risky lesson, but the pain still served a purpose, even while the experience was much more intense and negative than, say, stepping in a bath a bit too hot for a moment.
In a utilitarian formula such as v = d*n*t (value equals degree times number times time), degree is not a linear scale; it is logarithmic. Moreover, the infliction of pain is also not linear. The statement that 200 degrees is worse than 199 is both incorrect and an oversimplification. Human skin cells begin being damaged at 111 degrees Fahrenheit if exposed for a long time (6 hours), but anything over 158 causes immediate damage. At this threshold, anything below the boiling point of water is mostly equally damaging, and above boiling it again becomes much more damaging. The point is that degrees of damage and pain are not linear. While the concept of v = d*n*t is simple, its application is quite complex. For this reason, utilitarianism is one of our useful tools for making ethical choices, but it should be validated by other ethical frameworks, including deontology, care ethics, and virtue ethics.
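As an illustration only, the thresholds above can be turned into a toy piecewise function; the output values are invented to show the shape of the non-linearity, not real burn data:

```python
def damage_rate(temp_f: float) -> float:
    """Toy relative rate of skin damage at a given water temperature (F)."""
    if temp_f < 111:       # below the threshold for damage even on long exposure
        return 0.0
    elif temp_f < 158:     # slow damage over hours of exposure
        return (temp_f - 111) / (158 - 111)
    elif temp_f <= 212:    # immediate damage; roughly a plateau up to boiling
        return 10.0
    else:                  # above boiling: much worse again
        return 100.0

# The jump across the 158F threshold dwarfs the jump from 199F to 200F,
# which is the "200 is worse than 199" oversimplification being objected to:
assert damage_rate(159) - damage_rate(157) > damage_rate(200) - damage_rate(199)
```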
It's funny: Torres is trying to portray Yudkowsky as some sort of ignorant rube, whom anyone who knows anything about philosophy would disagree with, when he says, "Suffice it to say that most philosophers would vehemently object to this conclusion." But by saying that, he's actually revealing much more about himself (or perhaps his intended audience). Anyone who's actually familiar with philosophy would know that Yudkowsky's position is one that philosophers take very seriously and are much more likely to hold than the general public is. So trying to dismiss it by implying that philosophers know better, or that there's an expert consensus against the view, implies either that Torres has no idea what other philosophers believe or that he expects his audience not to and is misleading them.
I agree the additive view has lots of good arguments in its favor — as someone who argues for bounded aggregation, I still have a very high credence (0.4?) in the additive view. For me, the argument that really worries me is the risk one. On risk, the way I try to salvage the bounded view is this: imagine you're entering a world and you don't know who you're going to be. There are Rayo's number of people, each of whom experiences a dust speck. There is also one person who experiences torture. So my chance of being the person who experiences torture is 1/(Rayo's number + 1), and my chance of a dust speck is near-certain. Yet I don't think I'd prefer entering this world over a 100% chance of a slightly worse dust speck.
I would!
You know, the joke answer here is that while it is specified that you don't know who you are going to be, it is not specified that your chances are as distributed as those native to the world you are entering.
It could be that the entity presenting you with the choice iterated over infinite people until it got a 1/Rayo probability coinflip, and then is giving you the choice.
Based!
Doesn't it seem weird that every being in the observable universe getting a dust speck in their eye would be less worth preventing than odds of torture so remote that they are lower than the lowest probabilities you could remotely conceive of -- so that you are much likelier to win the lottery Graham's number^Graham's number consecutive times than to prevent that torture?
Well, if I just think of things prudentially, I’d rather take a slight amount of additional pain with certainty rather than any chance of torture — torture seems *so bad*. But I agree this is kind of nuts and has very weird implications.
Totally disagree, and Ibrahim has set this up nicely. I have a general comment that I’ll share elsewhere, but you’re applying probability in the wrong place here. There’s a probability of 1 that someone will experience agonizing torture. I don’t want to live in a world where I know that is the case, regardless of whether that’s me or not. In comparison, I am not bothered at all by 100 specks of dust, and so certainly would not be bothered by the fact that everyone has one at some point.
Yes, this doesn't seem to reflect an accurate understanding of what severe pain is like, and torture has a further psychological element. I doubt the writer has actually experienced even truly severe, lasting pain.
In the real world, pretty much everyone will at some point get a dust speck and be totally unfazed. Probably more than once. We don't particularly do anything to prevent that in daily life, only in situations where there's a higher risk of damage, e.g. wearing protective goggles while working with material that throws off a lot of particles. The concern is still not about a moment of mild discomfort, if that (I don't even think a speck is always painful AT ALL).
We do have laws against torture. It's a serious human rights abuse.
We also have laws and procedures (although not enough) around medical negligence, which is the reason I was left disabled and in constant, significant nerve pain. Fortunately I have medication that helps. Without it, I would without question prefer assisted suicide, and I consider it a serious option as the pain worsens and the medication gets less effective. I've BEGGED people, if I'm not able to access it and get stuck in the worst pain state (where it's not possible to do much for myself, twitching uncontrollably), not to leave me to exist like that. And although it may feel torturous, and I have the knowledge that this happened because of someone's arrogant and careless actions, which does have a psychological impact, it's still not like someone standing over you the entire time and inflicting pain deliberately, which is what torture is.
Funnily enough, I don't spend any energy worrying about solitary dust specks (and of course, I've had that experience to understand it, and would care about safe working conditions, such as for electricians like my uncle, who did not always have that). I do spend some time warning about medical negligence and dangers within healthcare systems, though. And on a political campaign group that includes torture among its concerns.
I don't think you can quantify torture by comparing it to dust specks. It's not a matter of multiplying; they're not two experiences on a continuum, but different ones. And I do think this argument treats torture as though it's a fantasy hypothetical, not a real and serious concern, something that is actually done in the world today.
wellbeing comparisons between static universes seem ... a bit useless
actions have moral consequences, after all, simply because someone has chronic pain or there's more happy lizards, or there are bad guys on some island torturing each other for generations, ...
sure, we can ask which state we want to work towards, but that's also almost meaningless without a perfect oracle that can assign a future to our actions.
yes, if a bored god asks us which universe they should make next then it seems appropriate to ponder this.
otherwise high-certainty consequences dominate, and they are almost always local and short-term
There are other good risk-based arguments along with Huemer's. They're all Harsanyi derivatives. The basic thought is that -- behind a veil of ignorance -- the choice between dust specks and torture is a choice between (1) having each of an enormous number of people suffer a dust speck for sure, and (2) having each of those same people undergo a tiny risk of torture. For a tiny enough risk, they'd all be better off if you chose (2). So choose (2)!
https://www.jstor.org/stable/48799008?seq=1
https://www.cold-takes.com/defending-one-dimensional-ethics/
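The per-person expected-disutility comparison behind this style of argument can be sketched in a few lines (the population size and badness numbers below are illustrative assumptions, not taken from the post or the linked pieces):

```python
N = 10**12            # population (stand-in for 3^^^3 or Rayo's number)
SPECK = 1.0           # disutility of one certain dust speck
TORTURE = 10**9       # assumed disutility of the torture

# Option A: every person suffers a speck for sure.
expected_disutility_specks = SPECK                 # per person

# Option B: one person, picked at random, is tortured, so behind
# the veil each person faces a 1/N risk of being that person.
expected_disutility_torture = TORTURE / N          # per person

# Each person is better off under B whenever the risk is small enough,
# i.e. whenever TORTURE / N < SPECK:
assert expected_disutility_torture < expected_disutility_specks
```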
Yeah nice one.
Your (continued) conflation of discomfort, pain, torture, and suffering is mind-boggling. Just because these words can sometimes be used as synonyms stylistically doesn't mean they're synonymous conceptually.
But for present purposes the subtle distinctions between them don't matter at all!
They matter to the guy who will be tortured because of a conceptual error.
Could you supply an argument? Declaiming your position and then leaving is not philosophy.
Yes, I can: https://anovermuser.substack.com/p/the-dust-speck-vs-torture-argument
Let’s say you’re immortal.
Would you rather, for the next 10 years, every Saturday deal with one dust speck in your eye, or right now be brutally raped for the next six hours?
What’s the amount of years where you would prefer the anal rape?
I think I prefer the weekly dust speck, even for eternity. But different people, different strokes.
Real.
I find all this to be an irresponsible application of moral uncertainty. I don't think humans are suited to confidently answer questions like these on behalf of others, and fortunately we don't get many opportunities to do so.
It seems like a naive amalgamation of several unresolved areas of study (philosophy, consciousness, qualia of pain, math of infinities) to produce a recklessly confident answer.
I'm also choosing the weekly dust speck.
I think refusing to answer is itself an answer when it comes to a moral question. So I don't think that argument works, but you are correct that moral uncertainty is probably the best argument against biting this bullet. At a higher order, moral uncertainty could potentially make it a bad idea to act on this judgement.
I don't know the exact amount but I think there is an amount.
10 years is around 521 weeks. So that is 521 'brief dust speck experiences'.
6 hours is 21600 seconds.
521 'mildly irritating dust speck experiences' is nothing compared to 21600 seconds of brutal rape.
Brutal rape is several orders of magnitude worse than a dust speck (per second). Let's say around 10 million times (10^7). Let's say a dust speck is irritating for 3 seconds, so
21600*10^7/3=72 billion 'dust speck experiences'.
Since you experience 52 dust speck experiences a year, 72 billion/52 = 1.38 billion years.
So, it would have to exceed 1.38 billion years which is a very long time.
By the way, the universe is 13.8 billion years old, so that is a tenth of the universe's current age for the anal rape to be preferable.
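The arithmetic above can be sketched in a few lines (the 10^7 per-second badness ratio and the 3-second speck duration are the commenter's assumptions, not measured quantities):

```python
# Back-of-envelope version of the comment's calculation. All the
# weights are the commenter's stipulated assumptions.

SECONDS_OF_TORTURE = 6 * 60 * 60   # 6 hours -> 21600 seconds
BADNESS_RATIO = 10**7              # torture vs. dust speck, per second (assumed)
SPECK_DURATION_S = 3               # seconds of irritation per speck (assumed)
SPECKS_PER_YEAR = 52               # one speck per week

speck_equivalents = SECONDS_OF_TORTURE * BADNESS_RATIO / SPECK_DURATION_S
years = speck_equivalents / SPECKS_PER_YEAR

print(f"{speck_equivalents:.2e} speck experiences")  # 7.20e+10
print(f"{years:.2e} years of weekly specks")         # 1.38e+09
```

Which reproduces the 72 billion speck experiences and roughly 1.38 billion years claimed above.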
In my heart, I knew this was true, but thanks for the proof.
I think that doesn’t work, because being raped will affect your quality of life for the rest of your lifespan, since you still remember it happened. If you stipulate that your memory is erased, it works better. But in that case, I think your intuition that you’d prefer it for eternity is simply the bias that BB mentions we have regarding large numbers; there is no reason to think you would be immune to that bias, and it seems obvious that the bias is stupid.
But…isn’t that something BB’s argument doesn’t consider?
What do you mean by that? Do you mean the idea that the dust specks do not affect your quality of life at all? I think this is possible, but the specks are kind of a stand-in for a very minor pain, and minor pains definitely do exist and affect your quality of life, even if the dust specks in particular would not, so this is a bit beside the point. If you meant moral uncertainty, then he definitely doesn’t consider that, although to be fair, I think considering moral uncertainty too much when the probabilities are actually small can often lead to ridiculous conclusions, so it’s not the case that his position is obviously indefensible.
I don’t think we know that very minor pains affect our quality of life. I doubt if you remember with any specificity any of the hundreds of times you’ve gotten something in your eye for a second, suggesting it was an ephemeral experience. But anyone who has been tortured will not only remember it, but likely be traumatized by it. It’s possible either that severe pain is qualitatively different than mild pain, or that if it’s quantitatively greater, it increases in badness in a nonlinear way.
What is the most minor pain you would consider to affect a person's quality of life? Unless you are an extremely unusual person, it’s definitely going to be something that can be described as minor, even if it’s not literally a dust speck. This is especially true because your argument requires these pains to count for literally zero; otherwise a sufficiently large number of them will overwhelm any larger pain, unless you believe that there is some cut-off above which a pain is qualitatively so bad that even a huge number of pains just below the cut-off will not be worse than it. Non-linear effects will not help you: they just increase the number of minor pains you need to make something worse than torture, but they don’t get rid of the basic issue that a sufficiently large number will still be worse.
OK then, would you rather be raped in 100 years or deal with a weekly dust speck in the eye? In a thousand or a billion years? Proximity to today is not meant to be the point. The point is to put the choice in the hands of the same person who would experience the torture or the dust.
Infinity, by the way, is useful in math but not so useful in philosophy. It should be banned from these thought experiments because we can't think it.
https://benthams.substack.com/p/infinite-dust-specks-are-worse-than/comment/230749038
In my opinion it would take 1.38 billion years (i.e. 10% of the universe's current age) for the weekly dust specks to be worse.
I think in value comparison, when you say something is infinite, it effectively means that even a minuscule probability of it will outweigh a certainty of something infinitely smaller; that’s effectively what introducing infinity into the calculation means. Assuming you meant being raped in 100 years rather than being raped for a hundred years, then for six hours of rape, somewhere in the range of 6 to 12 orders of magnitude more time experiencing dust specks would probably be enough to make the dust specks worse. The basic problem with your thought experiment is that it’s actually a less counterintuitive restatement of the original bullet, so nobody willing to bite the original bullet would hesitate at the new one. The best argument for your position is that the dust specks would have literally zero effect on your quality of life, which I admit is not crazy, and of course moral uncertainty makes your position stronger.
All these arguments take as given that pain is a quantity, like area or mass, and that it is additive: if you break the pain into parts (as measured by time or temperature or whatever) then each part has an amount of pain that sums to the total of the original. There's an implicit assumption of a mathematical model here. Your dismissal of the convergent series also implicitly assumes a mathematical model: that the total pain of a group of people is the sum of their individual pains. The counterintuitive consequences suggest these models are wrong. Zeno made similar arguments two and a half thousand years ago and nobody pays any attention to them any more (except perhaps philosophers).
Yes. I wouldn't even say it's comparing degrees of pain. A dust speck may elicit mild discomfort, or even just an awareness that it was there an instant ago and now isn't. Torture is not only pain, but psychological impact: indeed, not all tortures are physically painful, at least not in every instance.
I'd encourage anyone to whom the maths model seems reasonable to spend some time reading about real torture cases. This isn't abstract: it's something that is being done to people now, which resources exist to prevent, and which organisations work to challenge. Isolated dust specks aren't.
Such a classic Matthew article
Good article. Another thing related to torture vs dust specks is the suffering of invertebrates (like insects and shrimp) vs vertebrates (like mammals).
Some people I have spoken to say that we shouldn't waste our money on the Shrimp Welfare Project because they believe that no amount of mild suffering can be worse than extreme suffering. Those people are completely wrong.
Firstly, humans have biases that cause us to undervalue invertebrate suffering (it's inconvenient to include them in our moral calculus, and they are small and weird-looking). The idea that invertebrates cannot experience intense suffering is probably wrong. Also, your arguments show that enough mild pains can add up to be worse than extreme pain.
A very impressive presentation. Thank you for sharing. Question: what kind of philosophical training did you receive in school? I wish I had the same. Kindest regards from a soon to be 73-year-old who never tires of learning and is grateful for presentations like these.
Depends on what you're looking for.
Conversations on philosophical topics geared to the lay person. One of my favorites is Philosophize This by Stephen West.
Everything by Michael Huemer is good for getting informed.
Thanks. Will check it out.
Undergrad degree + a lot of time reading on the internet.
Interesting. Which sources on the Internet do you use and recommend? Thanks in advance.
So I probably agree with the thrust of your argument but I think it has a few holes.
First of all, I don't think a continuous chain of imperceptibly lower suffering is mathematically possible, fwiw. I think you can probably still rescue your argument without the "imperceptible" modifier, though it's tricky.
See https://linch.substack.com/i/182589405/the-intermediate-value-theorem
Also Eliezer's original post was on 3^^^3 dust specks. Literal infinities break everything so not having a good answer to infinite ethics doesn't strike me as a major knock-down on any moral theory or intuition.
I mean, each time you lower some temperature by a millionth of a degree, it's imperceptible. But those add up to something perceptible.
I don't think this is correct. Suppose you are asked how much pain you're in on a 1-10 scale. If you start at 10, at some point the number goes from 10 to 9. Do you disagree?
Yeah but my guess is the first point where you say 9 is not perceptibly different from the point where you said 10. My guess is similarly that if sampled independently, there are points with intensity below the 9 threshold where you'd say 10 again.
The example of the boiling water is interesting because if you keep going, then the water will simply stop being uncomfortable at all (you will reach 27 or something) and yes, the graph will reverse.
It's not that pain is never transitive or that the graphs always look like walls to infinity. It’s simply that there are no general graphs and the pain is sometimes intransitive.
Of course, there are many cases where those rules apply. But if you ask a parent to choose between killing his first or second born child, and otherwise (if no choice is made) you will kill both, then everyone might react differently. Some might actually choose one. Some might be unable to choose. Some people might prefer dying with their loved ones. Some might have survivor guilt. There is no answer to the question of “is more pain worse or not” because it’s not always linear and additive. 2 people suffering might be unpredictably better than one (unpredictably being the key word there).
The same person might even change their mind based on the choice they make. If a loved one is suffering horribly from a terminal disease, then many people will feel this way: While he’s alive, you should do everything you can to save him. But once he passed away, then you can be happy that he stopped suffering. This seems like a contradiction but it’s simply a case where trying to get the optimal outcome changes the optimal outcome.
At the end of the day, if you give humanity today a choice to either torture a baby (A) or have everyone get a speck in the eye (B), then we all know what’s going to happen. People will feel horrible if A is chosen, and the world would be worse than if B is chosen, in which case people will rejoice at saving a baby for what is, in this case, a small cost.
But not in other cases.
Yeah at some point the water is pleasant so it's not bad. But so what? It's implausible that there are two painful states where one is slightly less painful than the other, but the first is worse than any number of the second.
Yes I agree most people think torture is worse than a dust speck. I think they're wrong!
In my opinion, torturing a baby is worse than 8 billion people getting a dust-speck each.
If it was a googol people each getting a dust speck, the dust specks would obviously be worse.
Torture is much much worse than a dust speck but not infinitely worse.
I may have been unclear in my main point. I don’t think torture is any number of times worse than a speck. I don’t agree with the assumption (it is one) that utility can be a number on the real line (which will always make it comparable to others). That’s what the thought experiments show.
To BB, I know you disagree! But the point here is that Utilitarianism leads to a contradiction due to the above assumption. Because when everyone feels worse after the (supposedly) correct utilitarian choice, then this implies the same reasoning (utilitarian) to conclude that it wasn’t the right choice.
You could always make your utilitarian function more complex and integrate more incentives, but at the end of the day, you are just using intuitive morals as the baseline and not the Utilitarian paradigm.
I think that rather than this being proof that dust specks are worse than torture, it’s an illustration of the folly of trying to construct perfectly consistent logical principles that one can flawlessly apply in every situation.
I’ll admit that any principle I could endorse could probably lead me to accepting something absurd, but I’m not actually forced to follow them there. A little inconsistency is a small price to pay to not have to torture someone.
But no part of the argument assumed that there is some easy way to construct universally applicable logical principles.
You shouldn't believe contradictory things because one of them has to be false.
I think this is another example of the misuse of thought experiments. There aren’t ways of putting dust in the eyes of a billion people whilst also deciding to torture someone. We’re being asked to tradeoff things that aren’t real. Real choices - do i swerve my car to the left or right risking injuries v death - are hard enough. Stretching beyond intuition doesn’t help.
Never understood this argument, always seemed extremely fanatical.
I like cute dogs. I want there to be more cute dogs with happy lives in the universe. Is this mathematically justifiable under util? No. I just want it.
I would rather have a bunch of cute happy dogs existing than tile the same amount of space with hyper-efficient pleasure-generating-cells. Similarly, I really despise the idea of one person being literally tortured to avoid a much less significant pain inflicted on many others.
Must we all accept utilitarianism until we bite the bullet on claims like this? I certainly do not. Seems like a complete denial of your identity and complex preferences.
But what you described doesn't bear any resemblance to any of the arguments I gave in my piece. None of the premises were that utilitarianism is correct.
Does this sort of quantitative analysis of a moral question not imply total acceptance of utilitarianism (to an essentially fanatical degree)?
To be clear, I am a utilitarian in the way Eliezer put it - 3/4 of the way there. I think it's totally fine to be like: torture is horrible and no mental gymnastics will make me believe otherwise.
No it doesn't. There are a lot of non-utilitarians who accept this result, as I said in the piece. Thinking "some number of dust specks is worse than one person being tortured" does not imply "the action you have most reason to take is the one that maximizes total utility."
You can't just assume any argument with a conclusion you don't like that is made by a utilitarian assumes utilitarianism as a premise.
I'm not too familiar with consensus definitions on philosophical terminology, so I won't really engage with that line of argument.
What I have an issue with is the confidence with which you hold these claims - "Here, I’ll explain why the judgment that a bunch of dust specks are worse than a torture is simply correct."
If I believe I should listen to my moral intuition, and one of the most core preferences within it is that no single individual (consciousness/soul/whatever) should be subject to torture to avoid dust specks in a bunch of people's eyes, then that is what I believe. Maybe this preference is overly precise and falls apart when you try to operationalize and scale it – but I'm not interested in doing that.
Your arguments are good and interesting, I just don't like the whimsical overconfidence of the title and some other sentences. It probably boosts your engagement, but maybe put a disclaimer for claims that are epistemologically imprecise / lack context / you wouldn't directly endorse in a serious discussion.
You had me at "I like cute dogs."
I reject 1 — mild pain is not bad. Even moderate pain is not bad. It has 0 moral weight. In many cases, mild or moderate pain can even be good: it can strengthen us and help us build resilience. This is of course much less likely to be the case with extreme pain, which is a categorically different thing, not just a difference in intensity.
Yes, the mild discomfort of the dust spec is serving the evolutionary purpose of avoiding damage to the eye.
The torture described here is inescapable pain, which doesn't serve the individual organism experiencing it at all.
I don't think it's intrinsically bad that it hurts to touch something too hot. I do think it's bad to be boiled alive, and that entirely deliberately by another thinking being, as described! Speaking of which, a relative of mine accidentally poured boiling water on herself recently after eye surgery: pretty bad, I feel very sympathetic. The pain at least served the purpose of telling her to stop ASAP, although it's still not good. But, it's healing (fortunately, could have been worse), and she is being extremely careful now, having realised she underestimated how much she needed to compensate for her vision. Would be preferable for it not to be such a painful and risky lesson, but the pain still served a purpose, even while the experience was much more intense and negative than, say, stepping in a bath a bit too hot for a moment.
In a utilitarian formula such as v=d*n*t (value equals degree times number times time) degree is not a linear scale, it is logarithmic. Moreover the infliction of pain is also not linear. The statement that 200 degrees is worse than 199 is both incorrect as well as an oversimplification. Human skin cells begin being damaged at 111 degrees Fahrenheit if exposed for a long time (6 hours), but anything over 158 causes immediate damage. At this threshold anything below the boiling point of water is mostly equally damaging and above boiling again becomes much more damaging. The point is that degrees of damage and pain are not linear. While the concept of v=d*n*t is simple, its application is quite complex. For this reason, utilitarianism is one of our useful tools for making ethical choices, but should be validated by other ethical frameworks including deontology, care ethics, and virtue ethics.
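The comment's v = d*n*t with a logarithmic degree term can be sketched like this (the natural-log mapping is an illustrative assumption, not a calibrated pain scale):

```python
import math

def disvalue(degree_f, n_people, seconds):
    # v = d * n * t, with the degree term mapped logarithmically
    # (illustrative assumption, not a calibrated pain scale)
    return math.log(degree_f) * n_people * seconds

# 200 F vs 199 F, one person, one second: the difference in disvalue
# is tiny, matching the comment's point that degree is not linear.
delta = disvalue(200, 1, 1) - disvalue(199, 1, 1)
print(round(delta, 5))  # 0.00501
```

Under a linear degree term the same one-degree step would count the same everywhere on the scale, which is exactly what the comment is objecting to.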