27 Comments

I found this to be a decently written article, though a tad on the robotic side. I think this might have something to do with the left-brain, utilitarian mindset that seems to underlie EA. On that note, I think some of Rollins's points were funny, and it would have been more effective to reply to the jokes with jokes.

As for your rhetorical examples of things that aren't cults, I think a good case can be made that they're all cults, some more benign than others. I, for instance, am more of a fan of the cult of chess than the cult of St. Olaf. Also, I would classify most of the vegans I know as religious, for sure. Though I myself have been religiously keto at times, so I try not to judge too harshly.

I do have some questions, though. On egoism vs. altruism, isn't selflessness part of the definition of altruism? Maybe you didn't include it, but I didn't see Rollins say "perfectly selfless", just selfless, which seems fair to me.

"treat you as doing something seriously morally wrong. But they don’t treat you as evil."

I thought this was a definition of evil?

“Effective altruism is the project of trying to find the best ways of helping others, and putting them into practice.”

If I think the best way of helping others would be to do the opposite of pretty much everything EA recommends, could I still be an effective altruist?

author

//As for your rhetorical examples of things that aren't cults, I think a good case can be made that they're all cults, some more benign than others. I, for instance, am more of a fan of the cult of chess than the cult of St. Olaf. Also, I would classify most of the vegans I know as religious, for sure. Though I myself have been religiously keto at times, so I try not to judge too harshly.//

If basically all social movements, as well as chess players, are cults, then the word "cult" is benign, and it's no longer a smear word. The point of Rollins's article was to criticize EA as dangerous--by labeling it a cult. But we agree that chess players aren't dangerous, qua chess players. Thus, there has to be some reason why EA is more sinister than chess players.

//"treat you as doing something seriously morally wrong. But they don’t treat you as evil."

I thought this was a definition of evil?//

There's a difference between being evil and doing evil. If a person does something they don't know to be immoral, they're doing evil, but they don't know it. Thus, they're not being evil.

//“Effective altruism is the project of trying to find the best ways of helping others, and putting them into practice.”

If I think the best way of helping others would be to do the opposite of pretty much everything EA recommends, could I still be an effective altruist?//

This is, I think, a good place to make a distinction that I'll flesh out in an article soon. There's the philosophy of EA, which says we should try to help others effectively, and then there are the practical ways that most actually existing EAs try to help others. We might call people on board with the philosophy philosophical EAs and ones on board with the practical actions practical EAs. Thus, this would be a rejection of practical EA, but not philosophical EA.


Thanks for your responses.

//Thus, there has to be some reason why EA is more sinister than chess players.//

I think this might have to do with the cogitation-become-altruism flavor of EA, rather than the plain old virtue. It tends to lead to some pretty historic L's, IMO. Like SBF, who is an EA, right?

//Thus, this would be a rejection of practical EA, but not philosophical EA.//

Maybe in the article you could expand the definition, then, because as written it's pretty open-ended, and any number of moral philosophies could fit into it, though I doubt most or very many of them would classify as EA.

author

//I think this might have to do with the cogitation-become-altruism flavor of EA, rather than the plain old virtue. It tends to lead to some pretty historic L's, IMO. Like SBF, who is an EA, right?//

SBF is an EA, but he's been routinely condemned by basically all EAs, and while the things he did might have been in some way motivated by EA, they weren't things that the EA framework would endorse. Thus, I think a decent analogy is with Nixon. Nixon was a Republican and did the Watergate spying to help the Republicans, but Watergate isn't a good reason not to be a Republican.

//Maybe in the article you could expand the definition, then, because as written it's pretty open-ended, and any number of moral philosophies could fit into it, though I doubt most or very many of them would classify as EA.//

Yeah, I'll write an article talking about this at some point. Though I think Richard's definition is pretty good.


"There's a difference between being evil and doing evil. If a person does something they don't know to be immoral, they're doing evil, but they don't know it. Thus, they're not being evil."

So there I am, carving up a steak. You patiently explain to me why eating that steak is morally wrong due to the suffering of feedlot cattle, the damage to the climate from bovine butt-methane emissions, and the selfishness of processing grain calories through cow stomachs rather than equitably distributing those plant calories to the world's hungry.

You can now confidently assume that I know eating meat is immoral, according to the frame that has been explained to me.

I look you in the eye and take another big bite of that juicy steak.

Am I evil yet?

author

I think you are doing something wrong. I don't think there are precise facts about whether anyone is evil, but I'd say that you're roughly equivalent to the many people throughout human history who have supported quite horrific social practices while knowing they're wrong (e.g. subjugation of women, slavery, and Nazism).


Ok, now we're getting somewhere. Presuming that this is a common view in EA (a majority view, perhaps?), what steps is EA willing to take to prevent this wrong, assuming the ideology gains the political power necessary to implement its policy preferences?

author

Well, what EA tends to do is sometimes push for ballot initiatives that reduce animal cruelty on factory farms (e.g. ones by the Humane League that give pregnant pigs enough room to turn around).


I'm generally in favor of food animals having a good life, whatever that happens to mean for a particular species.


Are you familiar with Joel Salatin? I only ask because I think who he is and what he does has done more for animal welfare than any EA advocate.

Jan 29, 2023·edited Jan 29, 2023

I'd be satisfied with the same answer you'd give if you change the scenario to 1800 and selling a slave at auction, putting yourself in the judger's position.

In other words, I couldn't give a damn about someone's preferred use of the rhetorical word "evil", as distinct from the sense 'clearly morally wrong'.

I claim that causing horrific suffering for trivial sensory gratification doesn't become less morally wrong because it's currently sanctioned by society. I don't know whether or not it makes it less "evil", because I'm not sure at all how to pin down such a rhetorical word that almost always casts much more heat than light.


Except that keeping slaves isn't implied by human biology. Consumption of animal protein absolutely is, we are obligate omnivores.

There's also the rather ugly anthropocentrism implicit in the position that only organisms with nervous systems are capable of suffering, and that therefore your hands are morally clean just because you only eat leaves. Life feeds on life, and all life suffers. There's no way out of that.


Sounds like you're on a mission to get me to apply the word "evil". But no, I'll still just say that you're profoundly misguided, attempting to rationalize actions that are clearly very morally wrong.

- Why would it be "anthropocentric" to take the highly intuitive position that suffering is a feature of brains and nervous systems? It's just phenomenal conservatism, based on the same ground-level intuitions you use when you respond differently to a kid kicking the head of a puppy versus kicking an apple. Are you implying that I might be ignoring sentient AI?

- I neither believe that my "hands are clean", nor am I trying to make them so, nor do I think it's even possible for anyone. That kinda comes with the territory of consequentialism, I think. What I'm doing by advocating veganism is focusing on the actions that seem to have the worst harm-to-benefit ratios, more than I focus on actions with less extreme harm-to-benefit ratios.

Why would you think I'm somehow trying to evade the fact that conscious life involves suffering? Are you trying to evade that fact when you oppose child abuse or terrorism or whatever else you agree with me in opposing? The impossible perfectionism of existence you imagine there is no more true of veganism than of any other moral issue. It's not about avoiding all harm that comes along with life; it's about stopping doing the worst -- and most easily avoided -- harms.


These are excellent points.

Why are people so scared of the word cult, anyhow? Same root as culture, after all. I for one am a proud and unapologetic adherent of the cult of the iron temple ... which places me in eternal opposition to the cult of the vegetable diet, who are the sworn enemies of healthy androgen levels and sick gains.

I'm not just being ironic. In many cases the interests of different cults are intractably opposed. Not all games are non-zero sum; sometimes there must be winners and losers. If I am to continue eating gargantuan quantities of high-quality animal protein as befits the hominid apex predator, it follows that the plant-based diet fanatics must lose out in their quest to turn humans into placid herbivores. And vice versa.

So, how exactly does EA define the good? What is the utility calculation based on? Max pleasure/min pain for max population for max time? A planet of a trillion wireheads with AI-generated porn being fed into their optic nerves while soft robots massage their genitalia, being sustained on heroin-laced IV drips, stacked and packed in subterranean catacombs powered by geothermal taps so that the surface may remain pristine wilderness? What is the goal here?

Does the suffering of gazelles being eaten by lions factor in? What about the suffering of hungry lions?

author

So, different EAs will give different answers to this, and most don't need to give one. I mean, pretty much all EAs will think that happiness and lives saved are generally good things, and suffering is worth averting. I think generally they won't take a strong stance on what the ultimate end state of consciousness will be. Now, I am a hedonic utilitarian--though I obviously think there are better future worlds than the heroin and porn world.

As for the point about cults, I think it would be fine if we, as a society, destigmatized the word cult, and just used it to refer to some types of movements. But as long as the word is stigmatized, it is wrong to claim that a perfectly innocent organization is a cult, without explaining that you're using an idiosyncratic definition.

Jan 29, 2023 · Liked by Bentham's Bulldog

Right. This is basically an example of the "worst argument in the world".

https://www.lesswrong.com/posts/yCWPkLi8wJvewPbEp/the-noncentral-fallacy-the-worst-argument-in-the-world


I found this post confusing. Is it Rollins, or Collins? Or are those two different people?

It also wasn't clear if you're primarily critiquing Rollins (Collins?), or Zvi, since it seems like most of the really egregious statements that the former takes issue with you're distancing yourself from. Is Zvi not representative of EA? If not, are you? Why should credibility be assigned to your take on EA and not Zvi's? Those are genuine questions, as I'm not deeply familiar with EA or its major proponents.

As a final note, a lot of people - myself included - would consider vegans and animal rights activists to be very culty. Not all, to be sure, but many are evangelical, pushy, and intolerant. As for Catholicism, it may not be a cult now, but the early church absolutely was, pretty much by definition. Is the implication with that comparison that EA ultimately aims to become something akin to a major world religion? And if so, given that it's obviously in the gestational stages, wouldn't that make it a kind of cult by definition?

author

It is Rollins. I (rather embarrassingly) misspelled it twice--I've now fixed it.

I think that Zvi--in his article submitted for a contest--often misrepresented EA. For the parts of the definition, I was trying to provide defenses of them; for example, explaining why many who are considered by themselves and others to be EAs do not do the things that Zvi says are intrinsic to EA. Also, Zvi was providing a series of things that are associated with EA in some way, not a definition, so it's misleading to just assume EA is the conjunction of those things and then smear it as a cult.

If the claim is that EA is a cult, but we're using "cult" to mean some super broad thing that includes all religions and most social movements, then this seems to devalue what people actually mean by cult. Generally, cults are harmful and sinister, while I don't think that animal rights people are--I've argued this more here: https://benthams.substack.com/p/factory-farming-delenda-est. I compared it to Catholicism just to establish that you can have a movement that defers to itself sometimes without being a cult--which is true of modern Catholicism. It's unclear why trusting the epistemic standards of one's movement makes something a cult.


I believe Rollins's point about the culty odor, let's call it, of EA wasn't tied to any one thing. It's more of a holistic evaluation, based not on a single fragile thread of logic but on many threads woven together. There's the propensity towards dietary preachiness; there's the evangelicalism; and most worryingly of all there's the rather cold, soulless rationalism of its utilitarian ethos, which seems to take the position that EA knows what's best because EA is about doing what's best according to the criteria established within the framework of EA. As Harrison noted, that's classic left-brain systematizing, which is a useful tool, but when unleavened by contextual or holistic thinking it tends to produce a certain blindness. Historically, this has often led to unintentional atrocities carried out with the best of intentions.

The communists thought they had an objective, scientific framework with which to improve humanity as well. All they produced were ugly buildings and piles of skulls.

author

But one thing that EAs are very big on is not violating important constraints, because doing so tends to have disastrous outcomes. If you look at the things that EA does, they're not bad--they're things like distributing anti-malarial bednets, which prevents hundreds of thousands of people from dying. Now, as for the point about dietary preachiness, I think that's a good thing, because eating meat is seriously wrong. https://benthams.substack.com/p/factory-farming-delenda-est


By distributing anti-malarial bednets and saving hundreds of thousands of lives, the population increases. Quite specifically, the population of countries that already cannot feed themselves. What are the implications of this? Is it possible that by meddling in this fashion the downstream effects might be much worse than any positives?

So far EA is just sounding like a fancy intellectualization of the same foreign aid type activities that have been beloved of white people since before Charles Dickens first skewered telescopic philanthropy. The track record of foreign aid is generally terrible. Nations that have adapted well to modernity - Japan, for example - are the ones that have lifted themselves out of material poverty, not the ones that became dependent upon charity. Suffering is the teacher of humanity, but the student must be allowed to learn the lessons on his own.

author

EA is very heavily evidence-backed. Even if you think foreign aid in general is bad, there's very good evidence from oodles of randomized controlled trials that bednet distribution works.


I didn't argue that it doesn't work, for the narrow aim for which it is intended. The critique was not at the tactical level. I'm questioning the moral basis of your strategic doctrine.

Should human cultures not be allowed to develop at their own pace, in the manner they choose, rather than having do-gooders impose their goals upon them, however altruistically? What happens, for example, when the bed net shipments cease? Would they not have been better off learning to make them themselves, with their own local resources?


As to the psychopathic angle, utilitarianism in general, and EA in particular, require an arms-length detachment from the people in one's immediate surroundings in order to weigh pleasures vs. pains impersonally across the board. Should I spend my extra money on something that would bring joy to my own children, or do I make an impersonal donation to some international charity that promises to ease the suffering of people in faraway lands whom I will never meet? It is not natural or psychologically healthy to give your own children's needs and wants the same impersonal weight as those of abstract others, who exist only as statistics on a spreadsheet. I know the feminist care ethicists made this critique, though they went wrong and contravened human nature for different reasons, but their critique of utilitarianism still stands. A well-intentioned but misguided person who goes for utilitarianism would not be psychopathic, but would be a perfect mark for those who are (e.g., SBF).

For bad actors, utilitarianism does the hard part of convincing well-meaning people to detach themselves from the well-being of the real-life flesh-and-blood people around them and to instead use cold, impersonal statistics to guide their moral reasoning (stats that can be manipulated by bad actors and donations that can be misappropriated). That's the perfect foundation for malicious actors to use to promote, say, lockdowns and mask mandates for children over some abstract "greater good" for society at large. "Yeah, it'll harm your children, but my utilitarian calculus says it's moral!"

author

I think that a good utilitarian will generally care a lot about their family. There are instrumental reasons to treat your family, for example, as intrinsically more important than others. https://benthams.substack.com/p/hedonism-and-objective-list-theory To be an EA, you don't have to be a utilitarian. If we look at the real world track record of utilitarians, it is quite good. https://rychappell.substack.com/p/how-useful-is-utilitarianism


I'm not persuaded that Collins said anything foolish. I do think you did a good job demonstrating that Zvi's positions aren't necessarily representative of EA. It's hard not to be a cult these days; I don't think it's the terrible insult you seem to think it is.
