35 Comments

Endorsing this over FdB's comments, which are just churlish. Like his recent arguments against concerns over AI risk, his piece seems to rest on a bunch of logical fallacies stemming mainly from him just not liking the people involved. I also noticed the point where he simply assumes his own deontological moral intuitions are correct, with no attempt to justify why they are better. I don't know what's got into him lately.


This is so comprehensively non-responsive to my post that there's nothing I can say about it, other than that you need to develop much better reading comprehension.

What's good about EA is the parts that are typical; what's atypical about EA are the parts that are not good. The weirdo shit drags the whole philosophy down, but pieces like this exemplify the incredible sensitivity towards any criticism of the bizarre elements of that culture, which in turn prevents EA from getting better.


Well said. I think what gets lost in this very-online drama is that most people have never heard of EA, and have never seriously considered the possibility that some charities are more effective than others. If EA becomes more popular, I can't see how it would make the average American donation less effective!


Whenever a writer seems good in general but turns out to be bad whenever they write about a topic you're knowledgeable in, that's a sign they might not actually be good at writing; you just don't notice because it's never about something you know well. Not to say Freddie deBoer is a consistently bad writer, but it's worth giving some thought.


Maybe a productive way to frame the debate is this: "Relative to a counterfactual world without EA, how much extra good has EA done?" As I read FdB, his answer is "not very much," while Scott Alexander argues it's hundreds of thousands of lives. Alas, counterfactuals are tough.

For the EA/utilitarianism connection, perhaps this post may be of interest: https://inexactscience.substack.com/p/the-case-for-narrow-utilitarianism


Point one: there is no amount of charitable exertion in the world which buys you either the right to congratulate yourself in public, or exemption from scrutiny.

Point two: you seem to be saying that other people either don't give to charity like you do, or give to fluffy animal charities, because they are less intelligent than you. As it happens, I give money to treat blindness in children in Africa; you do shrimp. You think that makes you interesting, and cleverer than me. I don't agree.

Point three: "We want to make AI safe, who could possibly be against that?" is a sleight of hand. EA is concerned with fringe, ooh-look-at-us-being-weird-and-nerdy issues, which I am glad to see got virtually no air time at all at Bletchley last month. You are not saving the world.

Point four, and one I am sorry to have to make. There's no doubt and no denial, just a rather belated apology, about Bostrom's history of racism. There's Shear: "The Nazis were very evil, but I'd rather the actual literal Nazis take over the world forever than flip a coin on the end of all value." The denial these points evoke from EAers when they are pointed out on social media suggests a pattern here, and not a good one. If EA isn't creepy, it's certainly creep-adjacent. I see no reason, absent audited accounts, to believe any unsupported assertion that EAers in fact reliably give more of their money than anyone else. Bill Gates is not an EAer, despite desperate claims that he sort of is spiritually without really knowing it. Sure, there's money sloshing around the EA world, but SBF...

If you want to give to serious charity, good for you, but please, less of the "Thank God I am not as other men are."


DeBoer's book on education is not "great," nor is it well reasoned or honest. It is dishonest and evasive, and its bleak nihilism about student achievement is belied by many actually functioning schools. DeBoer defends the most dysfunctional government schools for reasons he can perhaps explain, but which seem to come down to his fundamentalist faith in the socialist idea: government schools have to be the best possible schools because, er, well, because if they aren't, despite the vast sums they receive, then it must be that there are private-sector alternatives that would actually work better, and that leads us into capitalism. And we can't have that, can we?

DeBoer is at his best as a social critic of recent rhetoric. When he has to deal with facts, he ends up floundering around and can't really function because facts don't support his hardline socialist ideals.


Freddie is wrong 80% of the time and annoying 100% of the time.


Not a complaint, but this note has been constantly at the top of my notes feed for weeks. Is it just me?


Freddie makes his living being wrong. But he’s wrong in such a thoroughgoing and frankly impressive fashion that he does the work of 100 normal nutjobs, or maybe more. He might be his own nutjob sector, and you have to respect that.

Nutjobs are close to my heart (there but for the grace of God and all that) and we unquestionably need them, but I’m pretty sure I’m done subsidizing Freddie.


EA makes two claims.

1) Within a given goal/value system, you should use utilitarianism more.

This is non-controversial but also not new. As SSC notes the Gates Foundation and other non-profits already did this.

2) Utilitarianism can be used to determine what your goals/value system should be.

That is what I think people hate, for all kinds of different reasons.

Look, if you want to get hardcore utilitarian then “increase the population of low iq Africans by buying bed nets” is a fucking dumb long term utilitarian strategy.

If your goal is "save lives in the short term," then bed nets are the best way to do that, and go for it, buddy. Just don't tell me you have a monopoly on truth.

I think 99.9999% of good in the world was done by selfish people for selfish reasons. Please just fucking re-invest your money for maximum ROI, that is the best way to create the most utils for most people most of the time.


These EA defenses seem to talk past the entire point. It's not that doing good is weird, counterintuitive, or obscure that's the problem. It's that they claim to have discovered the secret of the universe (namely: how to maximize the Good), and then are offended that you are not credulous about it. No, it's not that doing good the EA way is weird; it's that it is simply not good at all, and the rest of us somehow understand that. It must be what talking to communists was like in the 1910s.
