Great article. I also hang out with rationalists sometimes, and I am too frustrated by their absurd overconfidence in the "LessWrong canon". One could write similar articles on various topics (like decision theory), but, honestly, this just makes me sad about the whole premise of trying to think better.
And it's not like the other group trying to think better, analytic philosophers, is faring well. Maybe the sad truth is that humans are really bad at philosophy, and there is just no reliable way to fix that.
> Rationalists seem to have total confidence in physicalism about consciousness, despite the many objections to it. I like Rationalists a lot. They are much better at acquiring true beliefs than most normal people. But unfortunately, they are much worse at acquiring true beliefs than they think they are. This often results in absurd overconfidence in very tenuous views—views for which there are not decisive arguments on either side.
As a rationalist, this is a bit distressing to hear. Not because there's anything incorrect about what you've said, but because it *shouldn't* be correct. After this you describe a Yudkowsky who seems to have come up with a hypothesis via his internal intuition, and then failed to update his views according to available evidence. Yudkowsky's intuition is usually pretty good, but he himself wrote:
> The third virtue [of rationality] is lightness. Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can. Do this the instant you realize what you are resisting, the instant you can see from which quarter the winds of evidence are blowing against you. Be faithless to your cause and betray it to a stronger enemy. If you regard evidence as a constraint and seek to free yourself, you sell yourself into the chains of your whims. For you cannot make a true map of a city by sitting in your bedroom with your eyes shut and drawing lines upon paper according to impulse.
Interestingly I feel the same as you regarding qualia/consciousness with respect to Yudkowsky: I'm inclined toward moral realism, and Yudkowsky's view seems incorrect even just based on how humans work. If I stub my toe really hard, I believe I am experiencing qualia in that moment, but not that I'm experiencing "reflective self-awareness" in that moment. I don't understand how "reflective self-awareness" is supposed to create this qualia, such that without it there is no qualia. I similarly disagree with EY about the danger of AGI, where his reasoning that the very first AGI will probably kill us all seems to have skipped some logically necessary steps.
I don't see a problem with *rationalism* per se, because it seems to me that Yudkowsky laid out some very good principles. It seems to me that the main problem going on in rationalism is that some rationalists including Yudkowsky himself don't always seem to follow those principles. It may also be the case that there are some additional principles that are an important part of being rational, but which aren't included in Yudkowsky's Sequences, and we should expect this to be the case because reality is extremely complex relative to human mental capability, and therefore it would be surprising if any single person were the font of all truth. And indeed, my understanding of the twelfth and final virtue of rationality is that "there are some additional principles of rationality I haven't laid out here, and I myself don't know what they are, but they're important too".
Fully agree!
Reminds me of this blog post by a philosophy professor, where he basically says that Yudkowsky, in the paper he had submitted, is totally sure that he is right about decision theory, despite completely misunderstanding basic points.
https://www.umsu.de/blog/2018/688
Both sides of the debate seem to mostly assume that consciousness is one thing, but it's also possible that "consciousness" is an ambiguous, gerrymandered word. If phenomenal consciousness and reflexive self-awareness are two things that don't have to co-occur, there is a further issue about which is more ethically relevant.
And physicalism has nothing to do with it.
I found this an interesting way to explore ideas from the philosophy of mind, and I've saved your piece on physicalism to read later. As you touched on neuroscience, particularly in relation to consciousness, I'd recommend The World Behind the World by Erik Hoel. Erik is a popular writer here (for good reason), so you've likely come across the book already, but it makes some important points related to the pre-paradigmatic state of neuroscience and the extreme limits of the neuroscientific literature that are relevant to this essay.
Totally sidestepping all the substance in your article: I have never seen anyone write FaceBook instead of simply Facebook before.
Will fix!
I'm with you on about 90% of your points.
But I disagree with your view that diminished consciousness coincides with high-focus activity. Here I'm with Yudkowsky: when we're highly focused, we probably trade off processing power by shutting down some channels. I know that when I'm highly focused on one item, I lose outside processing, such as keeping track of time. I used to have cron jobs send messages to my pager to tell me when to go to lunch and dinner.
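A minimal sketch of what such a crontab could look like (the exact times, the `mail` command, and the email-to-pager gateway address are assumptions, not details from the comment):

```shell
# crontab entries: fire a reminder at fixed meal times on weekdays (Mon-Fri).
# Assumes delivery via an email-to-pager gateway reachable with mail(1).
0 12 * * 1-5 echo "Go to lunch"  | mail -s "Lunch reminder"  pager@example.com
0 18 * * 1-5 echo "Go to dinner" | mail -s "Dinner reminder" pager@example.com
```

The five leading fields are minute, hour, day-of-month, month, and day-of-week, so `0 12 * * 1-5` means noon on weekdays.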
Maybe our consciousness is somewhat diminished. But there are some states where we aren't self-modeling but are still conscious. Some of our most intensely conscious states--e.g. orgasm--occur when much of the brain's self-modeling is shut down.
I would say the opposite. Orgasm and the like are when we're highly focused and oblivious to the outside world. Look at rutting deer: they'll run smack into the side of your car chasing does in estrus.
We're less aware of external things, but that's because of the vividness of our current experience.
You are right when you say you might be misunderstanding Eliezer's argument. It doesn't appear that you engage with his actual core point, and perhaps cannot see it. (His actual point is correct and profound; it's behind e.g. his "you implicitly imagine" in the linked discussion. I notice often that folks quibble with low level aspects of his various arguments without appearing to notice the real point.)
What do you think his argument is that I am not adequately grasping? I imagine that pigs feel pain, for instance, because there's lots of evidence that they do.
"[W]hat you think of as the qualia of an emotion is actually the impact of the cognitive algorithm upon the complicated person listening to it, and not just the emotion [cognitive algorithm] itself."
You ignore this and then write at length proving that pigs have cognitive algorithms. That was never in doubt, and we might wonder why you would suppose that a brilliant fellow like Eliezer would be ignorant of this.
That's not an argument; that's just an assertion.
I was speaking of his point, not searching for something you would recognize as an 'argument'. What would you prefer to call it when someone smarter and more insightful than yourself invites you to consider that the way you are looking at a problem is flawed, rather than 'argue' within your frame? Whatever that is, that's what he's doing. Go back and read the whole thing again with an open mind instead of looking for little items to refute, and you might understand.
> Something has gone deeply wrong when you conclude with almost total certainty that a position disagreed with by almost all experts, with no evidence for it, that produces genuinely crazy results and conflicts with lots of empirical evidence is true.
Bentham's Bulldog, I read your substack because you are a deep thinker and it's always interesting to understand how you think. But, to be fair, this could just as easily describe your views on animals.
No. For one, I give reasons in support of the view that it's wrong to eat animals. For another, it is the plurality view among normative ethicists.
I'm not sure who considers normative ethicists authoritative.
Who are supposed to be the relevant experts on the morality of factory farming, if not normative ethicists???
And ethicists at large? Anyone can cherry-pick a group.
It's the plurality view both among normative ethicists and meta-ethicists. To call this cherry-picking just shows that you are not familiar with even the basics of the literature on moral philosophy.
No, it couldn't. That it's wrong to eat animals is the plurality view among normative ethicists.