Discussion about this post

St. Jerome Powell

Interesting post: the generalized, finitary Pascal’s wager.

Isn’t it at least plausible (and indeed, widely argued to be the most likely case) that on a long-termist view the expected impact of the entire AI safety movement so far has been gobsmackingly negative, insofar as the movement seems to be causally responsible for the existence of OpenAI and Anthropic? Similarly, it’s at least plausible that bio-safety research programs lead to, for instance, lab leaks of horrifically deadly pathogens, as has indeed happened repeatedly. I think the consequentialist longtermism skeptic can much more easily argue that it’s genuinely hard to calculate the expected sign of the impact than that the probability is small enough to ignore. Nor is the expected utility argument in itself convincing to a non-consequentialist, which makes it a bit strange that you don’t even mention this axiom. It’s certainly generally virtuous to do things that help lots of people, even if they’re far away, but I’m profoundly skeptical whether it’s virtuous to shoot while drunk at a target which your fuzzy eyes tell you is *likely* to be an assailant but *might* be your own daughter.

Past people have often taken our lives in their hands in ways that in retrospect we find culpably ignorant, even if sincerely well-intentioned. I would suggest Lenin as an illustrative example here.

I think it is virtuous to avoid taking actions whose impact, though it may be large, is extremely difficult to predict in sign. There are always more robust actions available, and one can pray that a later person will be able to act from a position of more clarity. It is certainly possible to argue that you are actually the crux of history, and that if you don’t act now, you can’t reasonably expect anyone else to be able to. But I think that’s very rarely true. The Manhattan Project scientists are probably about the only people I’d epistemically grant this to, and even there it’s not entirely clear to me that there would be atomic bombs today if the US had held off then.

By the way, surely you’re regularly praying or attending church by now? It would be hard to even try to take this post seriously if you can’t take it even that seriously for yourself.

Fish-Humble-Plural-Sweeten

I think your positions on abortion and longtermism are inconsistent. Even granting that abortion does not kill a person, it is still the case that it *prevents* a future person from coming into existence. But according to longtermism, this is just as bad as ending a life right now! If "temporal location" does not matter for moral worth, then it doesn't make sense to judge abortion by the current lack of personhood of the fetus rather than by the personhood of the human that would be born. In other words, abortion reduces the number of human lives that will be lived by 1, which should make it equivalent to homicide for longtermists (I'm not a longtermist myself, to be clear).

37 more comments...
