Discussion about this post

TheKoopaKing:

>It just seems intuitively clear that an experience—say, a phenomenal presentation of neon green—is distinct from any physical state of your brain/body.

It's also "intuitively" clear to me that videos playing on my computer screen are distinct from registers moving electrons billions of times a second in my CPU. But literally everything else in the field of computer science stands against this "intuition." Likewise everything else in the cognitive sciences stands against this intuition, and I don't think there's any conceptual impossibility in designing a cognitive system that will think certain parts of its inner computation are distinct from other (or the same) parts of its inner computation.

>it seems utterly self-evident to the light of reason—that being round is not the same thing as being red

This is really just a statement about how you're prone to use the concepts "round" and "red." In reality, when we zoom in, we don't discover "roundness" in any particles; we discover probabilistic interactions between wave-particles that don't exactly have a size or shape. Rather, approximate sizes and shapes are constructed out of various physical parameters of the interacting wave-particles we fire at them. Meaning, there is no undifferentiated thing "x" such that we can just say "x" has the property of being "round." And when we zoom in, things aren't actually painted over with colors such that we can say an undifferentiated thing "y" has the property of being "red." There are much more complex stories that need to be told to develop an accurate model of properties like "roundness" and "redness," but the (meta)physics will probably not fall out of manifest determinations based on a priori arguments or human concepts.

>These arguments start with an epistemic gap between the physical and the phenomenal

Knowledge and concepts are agent-relative, so epistemic gaps aren't metaphysically meaningful. There used to be an epistemic gap, for many people, between the morning star and the evening star (and there still is for people who have never heard this example), but both are actually Venus. But wait: if we accept epistemic gap arguments, then they're not both actually Venus. And wait: why do so many people say that people "used" to think they're not identical? If epistemic gaps entailed metaphysical gaps, this could never happen.
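The morning star / evening star situation has an exact analogue in programming, sketched below (all variable names are mine, chosen for illustration): two names that a reader might take to denote two things can alias one and the same object.

```python
# Aliasing sketch: two names, one object.
venus = {"kind": "planet", "position": "second from the Sun"}

morning_star = venus  # the bright object observed at dawn
evening_star = venus  # the bright object observed at dusk

# Epistemically, someone who only ever sees the two names might posit
# two objects; at runtime there is exactly one.
print(morning_star is evening_star)  # True
```

The "epistemic gap" between the two names tells you nothing about how many objects exist; that's settled by the implementation, not by what a reader of the names can or can't infer.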

>There does not seem to be a similar intelligible connection between physical brain activity and the occurrence of an experience, or the fact that we have a reddish rather than a greenish experience.

This is asserted with no argument and attributes the "seeming" to an underspecified group of people. I imagine that it doesn't even seem like this to most philosophers, let alone to people working in the cognitive sciences, let alone to the people of the world or to all people throughout history.

>why doesn’t that processing go on “in the dark”?

Why can the real-time clock on your motherboard keep accurate time even when the computer is unplugged (it runs off a small onboard battery), but you can't play Fortnite with the computer unplugged? Because of various design and implementation details that are super complex.

>It’s conceivable that someone has physical profile P without being conscious, or while having experiences different from Matt’s actual experience—e.g., color inverted experiences.

It's also conceivable to play Fortnite without having it powered by a CPU - just imagine the movements on the monitor without a CPU. Have I just made a breakthrough discovery in the theory of computation?

>Or consider a China-body functional isomorph of Matt, where the people of China send radio signals to one another in ways that functionally duplicate the signaling patterns in Matt’s brain. Regardless of whether such systems would actually be conscious, it’s obviously coherently conceivable that such functional isomorphs exist without consciousness.

It's also conceivable to most people that you could never play Fortnite on a simple Turing machine - but Turing machines just are universal computers.

>But pain is essentially a feeling, so if there is a possible scenario where a brain state occurs without an associated feeling of pain, this is simply a possible scenario in which the brain state occurs without pain.

Winning the Battle Royale in Fortnite is essentially a network request between the last surviving player and the Fortnite server. Imagine a network request between the last surviving player and the Fortnite server (and make sure you do it ideally, imagining all the network infrastructure, the classes in the code, and the physical particles moving around). After that imagining, did you also realize that the player would win the Battle Royale? Why not? Probably because you (and I) have no clue about any of the implementation details and how they relate to manifest situations like "winning the Battle Royale in Fortnite."

>Mary doesn’t know all the facts about human color vision. In particular, she doesn’t know what it’s like to see red.

She would know, just as she would be able to tell whenever someone wins a Battle Royale in Fortnite if she knew all the computational facts; but these implementation details are opaque to us, so we can't imagine anything that would satisfy the prompt, due to our epistemic ignorance.

>One-one: Identity is a one-to-one relation, not a one-many relation. You can’t be numerically identical to two distinct people.

These stipulations of "identity" are far too impoverished to cover the whole of what is possible in psychology. They rule out by stipulation that anybody could experience multiple personality disorder, when this is an empirical question.

>But it seems crazy to hold this view about your own identity over time. Imagine you are about to undergo an operation that will replace some large-ish fraction of your brain with new materials. Someone will wake up and live a happy life after the operation. Will it be you?

It will depend on the implementation details. You can just add an identity-inverter to your existing cognitive structure that will intercept every personal identity claim and you will believe that you are not yourself.
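A toy sketch of the "identity-inverter" idea (every name here is hypothetical, invented purely for illustration): wrap the system's existing judgment function so that personal-identity claims, and only those, get their verdicts flipped.

```python
def base_judgment(claim: str) -> bool:
    """The unmodified system affirms its own identity claims."""
    return claim in {"I am me", "I am the person who went into surgery"}

def add_identity_inverter(judge):
    """Wrap a judgment function, flipping verdicts on identity claims."""
    def inverted(claim: str) -> bool:
        verdict = judge(claim)
        if claim.startswith("I am"):
            return not verdict  # intercept and invert
        return verdict          # everything else passes through untouched
    return inverted

judge = add_identity_inverter(base_judgment)
print(judge("I am me"))  # False: the modified system denies being itself
```

The point of the sketch: which identity verdicts a cognitive system produces is a function of its wiring, so "will it be you?" can't be settled without the implementation details.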

>If sense data theory is true, then physicalism is false.

Only if you rule out, a priori, the possibility of virtualization or complex abstractions. Consider: if your computer has a recycling bin icon on the desktop but no recycling bin exists on your hard drive, our theory of computation is wrong.

>the “special composition question,” namely: if you’ve got two or more things, what would you have to do to them to get them to compose a further thing?

Things are useful abstractions; they're not units you add or subtract absent your goals and interests. I don't even know what a fact of the matter would look like here: either you can abstract some things successfully to suit your goals and interests, or you can't. Much of philosophy is asking malformed and pointless questions.

>the concept of consciousness does not permit us to conceive of genuinely borderline cases of sentience, cases in which it is inherently indeterminate whether a creature is conscious: either a creature definitely is conscious or it is definitely not

Compare: this integer variable either has value 3 or it doesn't. This seems plausible until you're introduced to parallel computation, the `volatile` keyword (or its equivalents in other programming languages), out-of-sync caches, etc.
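A minimal sketch of the point in plain Python: rather than racing real threads (whose interleaving varies from run to run), it replays one unlucky interleaving of two non-atomic increments by hand, so the "lost update" is deterministic.

```python
# Two "threads" each intend to run: tmp = x; x = tmp + 1.
# Interleaved unluckily, with no lock and no atomicity:
x = 1

tmp_a = x        # thread A reads 1
tmp_b = x        # thread B also reads 1, before A has written back
x = tmp_a + 1    # thread A writes 2
x = tmp_b + 1    # thread B writes 2, clobbering A's increment

print(x)  # 2, not 3: one update was lost
```

On real hardware it gets worse: with per-core caches and no memory barrier, two observers can simultaneously disagree about whether the variable "has value 3 or it doesn't."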

>We know lots of stuff by rational intuition

Philosophers have a bad habit of positing rational faculties and divine senses. Why has no other field that studies cognition, especially the empirical ones, discovered these things? My bet is because they're just made up bullshit.

>the fact that our mental states are about things in the world, and have truth-conditions or satisfaction conditions

I don't think this is anything other than a grammatical reflection. Some people who speak English in our current time frame are comfortable saying that mental experiences are "about" things. What stands or falls with this?

>Some experiences, like excruciating pain and euphoric pleasure, have “final” value or disvalue (i.e., they are good or bad, not just instrumentally, but in themselves).

There's no such thing as noninstrumental goodness or badness; things are good or bad because they make agents feel pleasure or pain, just as nothing is intrinsically tasty or intrinsically tall without reference to agents or environmental factors.

>I claim that even if we imagined the situation in arbitrary physical detail, it would still seem that nothing very bad is going on.

You shouldn't claim things like this without empirical evidence.

>The mind is indivisible.

This again rules out things like multiple personality disorder a priori, when obviously this would be adjudicated empirically.

Overall: it makes a lot of empirical claims from the armchair that aren't substantiated with empirical evidence; it makes too many simplifying assumptions about our current state of knowledge of roundness and the mind, which leads to malformed questions being asked and to absurd answers being posited within the malformed frame; and it doesn't engage with the bulk of work performed on the mind in psychology, neuroscience, or computer science. Verdict: yep, it's analytic philosophy.

PhilosophyNut:

The world's best philosopher joins forces with the world's best philosophy blog! Thus begins the golden age.
