Bentham's Newsletter

Rollins' Reckless Slander


Claims that EA is a cult are foolish

Bentham's bulldog
Jan 28

Jay Rollins has written a truly bizarre specimen of an article. It alleges, among other things, that EA is a cult; it does this on the basis of flimsy, wholly unjustified assumptions combined with a startling and profound range of confusions. This is a new and rather odd breed of criticism of EA. The old breed of effective altruism criticism was generally just to point at random things EAs have written and sputter—how dare they write that thing that sounds bad out of context? The outrage!

This breed was interesting, albeit totally bankrupt—much like criticizing the Democratic party by digging up random bad-sounding, out-of-context statements by obscure Democratic politicians. But the old strategy required a sort of art—one had to be somewhat informed about a lot of things EAs had said to be able to slander them effectively. The new breed seems even stranger and in some ways worse—more perverse. Instead of quote mining for things that sound bad, the new breed merely summarizes EA poorly, before lying about it repeatedly, claiming it supports things it is diametrically opposed to. I’ve already replied to one of these articles—given their rapid proliferation, someone needs to do the unpleasant work of refuting them systematically.

Jay Rollins follows the new breed of EA criticism: misrepresenting the movement and alleging it’s horrible in various ways. Given that Rollins’ entire article is one grand, dramatic exercise in erecting straw men before declaring them Scientologists or similar cult members, let’s get clear on what EA is. I’ll quote Richard’s excellent summary of the topic.

According to CEA’s official definition, “Effective altruism is the project of trying to find the best ways of helping others, and putting them into practice.”

The way I put it: effective altruism is about trying to help others as effectively as you can with whatever (non-trivial) resources of time or money you're willing to put towards that altruistic project.

There’s then an EA movement/community of people who self-identify as engaged with this project, and who would like to encourage others to join them. I support this, as I think a community can often achieve things that isolated individuals cannot. But if you don’t like the EA community for any reason, there’s no inherent barrier to pursuing your own effective altruism independently. (Many people may implicitly be doing this already, without necessarily thinking of themselves as “effective altruists” at all.)

So, EA is a community of people trying to act on the idea of effective altruism—doing good as effectively as possible. That seems good, especially when the movement has saved over a hundred thousand lives. However, Rollins seems to think that EA is a cult.

How does he argue it’s a cult? Well, he goes through a very confused list by someone else of lots of things that correlate a bit with EA, declares them necessary for EA, and then declares them indicative of cultishness. The argument is sufficiently full of holes that an elephant could pass through it unscathed. Rollins quotes 21 points from Zvi Mowshowitz. I’ll go through each of them and explain why they’re either not required for EA or not objectionable.

  1. Utilitarianism. Alternatives are considered at best to be mistakes.

Rob Henderson’s take on utilitarianism is that the Venn diagram of college utilitarians and psychopaths is basically a circle. I’d go further. Utilitarianism is a philosophy with a specific use case: leadership decisions. If you are not representing your tribe, you have no business deploying utilitarian ethics, which consist of moral calculations about maximizing utility for groups. People who engage in utilitarianism for personal reasons have taken academic philosophy classes, and thus bear watching; they have formal education in how to make a special case of themselves.

One might expect that someone arguing that effective altruism is a cult because it’s too utilitarian would argue both that EA is objectionably utilitarian and that being objectionably utilitarian makes one a cult. But Rollins provides no argument for the first claim—beyond pointing out that Zvi Mowshowitz hinted at it once—and the second claim is ridiculous and supported by nothing.

Now, seeing as our author has, rather bizarrely, just linked generically to Henderson’s newsletter, I have no idea which article he is referring to. When one does not cite one’s sources properly, it is hard to know which sources to address. But the utilitarians who are EAs—which is clearly the relevant reference class—are obviously not psychopaths; psychopaths don’t dedicate significant portions of their lives to helping others. Psychopaths are not big donors to curing malaria overseas, for example.

One might additionally expect the author to provide a reason to think that utilitarianism—an ethical view that’s been around for hundreds of years with many adherents—is a bad theory; at least, bad for things beyond leadership decisions. That expectation would be disappointed. Rollins is apparently above such mundane and frivolous tasks as arguing against moral theories—he can just smear a theory’s adherents as cultlike while providing no reason to judge his assessment accurate.

Additionally, there’s no reason to think EA is intimately utilitarian. As Richard notes, one only has to think that it’s good to make the world a better place and that more good is better than less to be an effective altruist. They don’t have to think any of the controversial things that utilitarians tend to think.

  1. Importance of Suffering. Suffering is The Bad. Happiness/pleasure is The Good.

Suffering builds character. If you want to bubble-wrap the world, there is a fundamental gap between your world-view and mine that probably isn’t getting bridged.

This is a standard objection to—or, more accurately, confusion about—utilitarianism, generally given by uninformed freshman undergraduates. Suffering is intrinsically bad, even if it sometimes brings about good things. It is instrumentally good in that it sometimes produces other good things, but it is, by itself, bad.

Once again, this is irrelevant to basically all that EA does, unless you literally hold the view that factory farms and death from malaria are good because the suffering they involve is productive. That view, however, would be crazy.

  1. Quantification. Emphasis on that which can be seen and measured.

I’m a fan of quantification. I also sincerely believe that there’s a wolf-god that incarnates within his followers from time to time, and that there are other gods who do the same. There’s a time and a place for each.

I’m confused as to what’s being said—perhaps it’s some witty remark. Given that I don’t know what the objection is supposed to be, if there is one, I’ll just leave a link to my article replying to the objection to utilitarianism that says calculation is impossible.

The next several things are all things that Rollins agrees with. Now, I don’t think most of them are inherent to effective altruism, but they’re not offered as objections, so I have no reason to respond.

  1. Altruism. The best way to do good yourself is to act selflessly to do good.

I am suspicious of anyone who says or implies they’re selfless. I think the overwhelming majority of the things people do are ultimately for their own benefit, even if that benefit is not obvious to observers, and I’m fine with that. I think you should not be intentionally antisocial, but if you don’t want to sing “Kumbaya,” more power to you.

Being altruistic doesn’t require saying you’re selfless. Indeed, I’m certainly not totally selfless, nor is anyone else on earth (plausibly). Nevertheless, I think one should try to do good things. Only on planet Rollins does saying people should do good mean “I’m perfectly selfless.”

  1. Obligation. We owe the future quite a lot, arguably everything.

One of my favorite lines from a Best of Craigslist post entitled Advice to Young Men From an Old Man is “You don’t owe the vast majority of people shit.” Don’t tell me I owe anyone anything. I know who I owe, and how much I owe them. I also know what I’m owed.

This is another case where I think Zvi went badly awry. EA doesn’t require saying we owe the future a lot—just that it’s very important to help the future. I’m something like a scalar utilitarian, so I don’t really believe in obligations—though it’s more complicated than that; I sort of believe in them—and I’m still an EA. Rollins gives no argument against this beyond quoting a line asserting that you don’t owe anyone anything, so there’s nothing to respond to.

I imagine Rollins walking past a drowning child who he could save at no personal cost. He declares “I don’t owe you shit!” Then he walks away and the child drowns.

Also, one can be an effective altruist and do the majority of EA stuff that focuses on animal welfare and global health.

  1. Selflessness. You shouldn’t value yourself, locals or family more than others.

This was the one where I considered compound blasphemies accompanied by a raised middle finger instead of an actual answer. But if I must take that statement seriously, that’s only one ethical framework. Another is any variation on concentric circles that enclose one’s loved ones, community, region/municipality, and/or nation.

Another mistake from Zvi. No one really thinks that it’s wrong to care about your family. Some people think maybe perfect robots would care about all people equally, but no one in EA—seriously, give me one person who is—is loudly condemning people for helping their families when they could help others marginally more. The claim of EA is just that we should do a lot to help others, not that we should value them as much as our family and friends.

  1. Self-Recommending. Belief in the movement and methods themselves.

This is starting to sound explicitly culty. I don’t do blind belief in an organization, regardless of the organization. And I evaluate method as part of my day job. If your methods are good, fine. I don’t know that I think EA’s heuristics are good, though. They don’t reflect reality as I understand it in all cases.

Apparently if you think your movement has good methods for figuring things out, it’s a cult. Thus, if you, as a Catholic, trust mainstream Catholic doctrine on issues you haven’t explicitly investigated, then you’re in a cult. But I also don’t think you need to generally defer to EA to be part of it—though if you don’t, then you’ll probably be less sympathetic to some things EA is doing. For example, if you don’t think that much of EA is very good, you should be a bit less sympathetic to EA movement promotion.

  1. Evangelicalism. Belief that it is good to convert others and add resources to EA.

That is a cult, by any definition with which I’m familiar.

If a Democrat wants there to be more Democrats, that’s a cult.

If a Republican wants there to be more Republicans, that’s a cult.

If a Christian wants there to be more Christians, that’s a cult.

If a chess player wants more people to play chess, that’s a cult.

  1. Reputation. EA should optimize largely for EA’s reputation.

I assume by the first “EA” The Zvi means “Effective Altruists,” not “Effective Altruism.” I don’t optimize for anyone’s reputation but my own. That’s not how it works. I stand by my word and my actions. I can’t stand by anyone else’s; that’s not a reasonable expectation of anyone.

I don’t think this is a core part of EA at all. Lots of EAs seem willing to say controversial things without caring much about hurting movement optics. And while Rollins may not optimize for anyone else’s reputation, this says nothing at all about anything important. If you know that some controversial statement will start a scandal that will be bad for the world and turn people off an important social movement, you probably shouldn’t say it.

  1. Modesty. Non-neglected topics can be safely ignored, often consensus trusted.

“The collective decides what its members can discuss.”

I remember reading a very bizarre article a while ago by one Amanda Marcotte. She was trying to smear Scott Aaronson after he opened up about various troubles he’d had—see here for the complete story. To do this, she would quote a very reasonable-sounding statement and then provide an outlandish translation. For example:

You can call that my personal psychological problem if you want, but it was strongly reinforced by everything I picked up from my environment: to take one example, the sexual-assault prevention workshops we had to attend regularly as undergrads, with their endless lists of all the forms of human interaction that “might be” sexual harassment or assault, and their refusal, ever, to specify anything that definitely wouldn’t be sexual harassment or assault. I left each of those workshops with enough fresh paranoia and self-hatred to last me through another year.

She then translated this passage to the following:

Translation: I was too busy JAQ-ing off, throwing tantrums, and making sure the chip on my shoulder was felt by everyone in the room to be bothered to do something like listen.

Or another:

On the contrary: I found reams of text about how even the most ordinary male/female interactions are filled with “microaggressions,” and how even the most “enlightened” males—especially the most “enlightened” males, in fact—are filled with hidden entitlement and privilege and a propensity to sexual violence that could burst forth at any moment.

Translation: Unwilling to actually do the work required to address my social anxiety—much less actually improve my game—I decided that it would be easier to indulge a conspiracy theory where all the women in the world, led by evil feminists, are teaching each other not to fuck me. Because bitches, yo.

This is sort of how I feel about the Rollins example. The original claim is that if there are lots of people working on a problem, then working on it is probably not very effective, at the margins. Rollins translates this to essentially “the EA thought police tells you what to think.”

I also don’t think that EAs have to embrace the consensus. There are tons of posts on the EA forum criticizing lots of things in EA. You just have to be broadly on board with improving the world to be an EA—at least, philosophically.

  1. Judgment. Not living up to this list is morally bad. Also sort of like murder.

If your social group has precepts not generally accepted by society, deviation from which makes you a moral outcast, its structure is religious by nature.

This is false; a common theme discussed by EAs is that none of us do the best thing—none of us always act rightly. Still, we should try to do what we can.

This also means that vegans are “religious by nature.”

  1. Veganism. If you are not vegan many EAs treat you as non-serious (or even evil).

If your social group has dietary rules not generally accepted by society, deviation from which puts you in a lower social or moral tier, you are a member of either a religion, or more rarely, a sex cult. I cannot think of any exceptions to this rule.

I think lots of EAs will treat non-vegans as doing something seriously morally wrong. But they don’t treat them as evil—I haven’t known that to happen. Besides, most EAs aren’t vegan. By this standard, animal rights activists would also be a cult.

  1. Totalization. Things outside the framework are considered to have no value.

Y’all are a cult.

WHAT?? Why in the world did Zvi write that? Nice weather is outside the framework, but it obviously has value. This claim is just ridiculous.

Anyway, EA isn’t a cult. And if it is, it’s certainly not for the reasons that Rollins says.
