61 Comments
Arturo Macias's avatar

I will answer this in some detail, though of course for me more than 3 pages is malpractice :-)

Let it suffice for now to say that if “pain” is a penalty in the utility function of the neural network, you are obviously right.

A different question is whether there is a “self” to suffer that penalty. I can program a neural network with extreme aversion to this or that, but the consciousness of the structure depends on the network being complex enough to have a self able to suffer.

No amount of behavioral evidence would convince you that a Boston Dynamics dog suffers, and the reason is that you need more than behavior. As commented in the Shrimp post, a broad extension of the moral circle needs a broad theory of consciousness. If you are a naturalistic dualist and think that consciousness is epiphenomenal, the difficulty is big and metaphysical.

Bentham's Bulldog's avatar

Sure, but in every case we only figure out conscious states through behavior--we can't observe them directly. At some point, when a thing looks and acts as you'd expect it to if it were conscious, it's reasonable to assume it is.

Arturo Macias's avatar

No, because part of our intuition of consciousness is related to complexity. All the destruction avoidance and reproduction pursuit you describe is a result of being a product of evolution. You find similar examples in cell behavior. None of this tells us whether there is a self on the other side of that penalty.

Our intuition is that consciousness comes from complexity and information integration, and its intensity depends on those characteristics.

Now, the cockroach's neural network is about 0.1% the size of a human's… and complexity is often considered to be superadditive.

Having a penalty without a self implies there is no pain.

Bentham's Bulldog's avatar

Why think more complex brains (measured by neuron counts) experience significantly more intense experiences?

Arturo Macias's avatar

Why not individual eukaryotes? Why not electrons? If you think they are simple, try to solve the Schrödinger equation as they do.

Consciousness is noumenal. I am as much a panpsychist as you, but in the end you need a theory of consciousness, not a theory of life behavior. To me it is clear that information integration and representation create consciousness, and as information-integration machines we are massively bigger than insects.

But of course, consciousness is noumenal, and what we know about consciousness comes from pure extrapolation: this is easy for other humans, and already impossible for the bat.

Bentham's Bulldog's avatar

Because eukaryotes and electrons don't behave as if they're conscious--having flexible behavior to avoid harmful stimuli, for instance.

Arturo Macias's avatar

Nowadays there are lots of video games with a complex behavioural repertoire. Even algorithms that can speak like a politically correct professor.

NegatingSilence's avatar

I agree with this other commenter's line of reasoning, but to add: I also think that even when there is a subjective self, it's natural to suppose attenuation of the "vividness" of the conscious experience as the nervous system decreases in size and complexity, even if there are unknowns as to how this works.

One can see this even in one's own experience. Dulling of the nervous system (e.g., being drunk) or lacking emotional reflection upon the pain can already help a lot, even with the same brain and the same functional reactions.

JG's avatar

This is one of my favorite things you've ever written. Incredibly informative.

Re the relationship between neuron count and intensity: You raise good arguments, yet I'd still be surprised if people who study this stuff concluded a single neuron could feel pain. It seems like there is a relationship between neuron count and pain (as well as sentience more broadly), just not a linear one, and not one we understand well.

Marlon's avatar

> Over time, more and more creatures have been recognized as sentient, such that prior to the 1980s, it was widely believed that animals didn’t suffer at all.

:c

FLWAB's avatar

I made a post about this recently (https://flyinglionwithabook.substack.com/p/why-arent-animal-welfare-activists), but do you support restrictions on abortion after 16 weeks? A fetus will withdraw from being touched as early as 7 weeks. At 16 weeks, if you poke a needle into a fetus, he'll move away vigorously and have increased stress hormones, comparable to the kind of hormonal increase seen in children and adults who are in pain. Giving a 16-week-old fetus anesthesia stops both the reactive movements and the increases in hormones.

Dr. Gary L. George gave the following testimony to the Ohio Legislature about his experience with potential fetal pain:

"While doing my first ultrasound rotation, I observed my first “selective reduction” procedure. A woman had undergone IVF treatment for infertility. She was pregnant with triplets. She and her husband decided that they could only handle having twins and wanted to undergo a “selective reduction” of one of the triplets at about 14-18 weeks. I observed while the ultrasonographer scanned the three babies and provided live images so that the obstetrician could aim a long needle through the mom’s uterus into the chest of one of the baby’s hearts in order to make a lethal injection. As the sharp needle touched the baby’s chest, the baby immediately withdrew and started to rapidly move his arms and legs. The needle was unable to penetrate the chest. The mother started crying when she saw the horrific live images on the screen. Her husband told her not to look and the obstetrician instructed our tech to turn the screen away from the mother’s view to hide the reality of what was happening. The obstetrician made a second and third attempt on the same baby with the same immediate withdrawal and flailing about by the baby but was again unsuccessful. Clearly, the baby was fighting for its life. At that point, the obstetrician decided to try and target another one of the triplets. It was terrifying to see this small human fighting to stay alive. I felt physically ill. A wave of nausea swept over me and I thought I was going to vomit and left the room. I know from talking to the ultrasonographer that the obstetrician was eventually 'successful' in penetrating the chest and heart of one of the triplets. I also know that from that point on, I was no longer ambivalent about abortion. The baby that I saw that day felt pain and suffering. This was not just some automatic reflex. "

(https://www.legislature.ohio.gov/legislation/133/sb23/committee)

Given all this, shouldn't we assume that a human fetus can suffer at 16 weeks gestation, if not earlier? If so, would you support restricting abortions past that date? A little over 35,000 surgical abortions occurred in 2021 on fetuses past that developmental stage. The procedure involves either cutting or tearing the fetus to pieces, and then sucking those pieces down a narrow vacuum tube. No anesthesia is administered, even though the American Society of Anesthesiologists recommends fetal anesthesia for fetal surgeries at this same level of development. If they can suffer, then I imagine they suffer quite a lot from being ripped to pieces.

What's your opinion on passing laws to either ban abortion after 16 weeks or require anesthesia for the fetus during the procedure? As it stands, Utah is the only state that requires it (though since Dobbs it's become a moot point, since abortions after 18 weeks are now illegal in the state).

Bentham's Bulldog's avatar

I don't know the exact threshold but 16 weeks might be right. I'd have to look more into the science.

Arturo Macias's avatar

The fetus and shrimp thresholds shall be consistent! Good luck.

Marlon's avatar

"more animals are conscious rather than feer."

JQXVN's avatar

> Most shockingly, crayfish self administer amphetamine, which is utterly puzzling on the assumption that they aren’t conscious. Hard to imagine that a non-conscious creature would have a major preference for drugs.

It's not as shocking as you think: amphetamine 'hijacks' reinforcement conditioning, a process which uses dopaminergic neurons (where dopamine transmission is, sanding off a lot of complexity, the signal to do a thing again) even in the sea slug Aplysia, an organism boasting 20,000 neurons total. Amphetamine directly boosts dopamine signaling (by acting on dopamine transporters), and so is self-reinforcing via the most direct mechanism available. It turns out that the circuitry for learning about what's helpful and harmful in your environment is really old and fairly well conserved.

That said, some of the most reinforcing substances to humans don't particularly affect consciousness or even feel good (nicotine is a great example), and some of the most consciousness-altering aren't reinforcing at all (psychedelics). I'm not sure how much the presence of reinforcement conditioning in Aplysia or any other organism, on its own, changes my estimation of the likelihood that it's conscious.

Personality is another factor that might just ride along. It depends on both complexity and individual (within-species) variability, and when the latter is constrained, personality vanishes no matter how complex an organism's cognition or how robust its conscious experience is.

I think the evidence about nociception is most compelling and establishes at minimum something like a jointly sufficient condition for pain. If we have compelling reasons to think an organism is conscious, the presence of nociceptive circuits strongly suggests that organism feels pain. I also find the "how much can you fit in your consciousness" argument compelling vis-à-vis pain intensity. (It reminds me of one of my favorite philosopher anecdotes, although I can't remember who the philosopher in question was... the upshot is that, knowing both how much his cat liked sitting in front of the fireplace and how few the cat's pleasures were, he would let himself go cold, which bothered him less, so that the cat could keep warm in the spot closest to the fire.)

You don't cover a ton of neuroscience evidence here but I've had the same directional inference experience in my exposure. This was a nice synthesis of empirical and philosophical considerations BB, I like this and all the other shrimp work very much.

Bentham's Bulldog's avatar

Thanks!

It seems weird that a non-conscious creature would get addicted to drugs. There are no examples of clearly non-conscious creatures doing this, and we know that, in the normal case, addiction works by creating a desire to ingest the substance.

JQXVN's avatar

Getting addicted to drugs is just a special case of reinforcement learning, though, and because neurons work through chemical signaling, the ability to get addicted to drugs is practically a necessary property of any implementation of an RL algorithm in neurons, and RL algorithms can be quite simple. Desire (maybe more accurately, urges) is a prominent feature of addiction in humans because the associated drives are very powerful relative to other learned things (that's why addiction is so maladaptive), but it's far, far from the case that everything learned via RL is mediated by conscious experience. The face of addiction in humans is misleading for this purpose, and it's helpful to understand that, as mentioned, addiction to drugs is just a special case of addiction in general, which is a special case of maladaptation in a very general learning mechanism. (This actually clears up a lot of popular confusion about addiction, and about what people are trying to talk about when they talk about dopamine these days, but that's another overlong comment or two.)
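The claim that RL algorithms can be quite simple can be made concrete with a toy sketch (an editorial illustration, not from the thread; the actions, reward values, and parameters are all invented). An agent learning by a bare delta rule will come to prefer a hypothetical "drug" action simply because that action injects a large reward into the same value update used for everything else; nothing in the loop requires, or even represents, experience.

```python
import random

# Toy sketch: two actions, "food" and a hypothetical "drug" lever.
# The drug's only special property is an outsized reward signal;
# it is reinforced by exactly the same rule as anything else.
ACTIONS = ["food", "drug"]
REWARD = {"food": 1.0, "drug": 5.0}  # assumed values for illustration

def run(trials=2000, alpha=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    value = {a: 0.0 for a in ACTIONS}   # learned action-value estimates
    counts = {a: 0 for a in ACTIONS}
    for _ in range(trials):
        # epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore at random
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=value.get)
        r = REWARD[a]
        # delta rule: nudge the estimate toward the received reward
        value[a] += alpha * (r - value[a])
        counts[a] += 1
    return value, counts

values, counts = run()
# Once the drug's value estimate overtakes food's, the agent
# "compulsively" selects it for the rest of the run.
assert counts["drug"] > counts["food"]
```

The whole "addict" fits in a dictionary and one update line, which is the point: the capacity to get hooked falls out of any reward-driven learner, so its presence says little by itself about consciousness.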

Bentham's Bulldog's avatar

But the fact that they have reward learning algorithms that make them ingest substances is predicted if they're conscious, surprising if they're not.

JQXVN's avatar

Oh I see what you're thinking. Some responses are more stereotyped than others. But "ingest thing more" is a pretty predictable parameter to be able to modify if you're a thing that eats.

Bentham's Bulldog's avatar

Right, sure, you can always invoke the explanation that it responds behaviorally to rewards but isn't conscious. But P(responds behaviorally to rewards like drugs | consciousness) > P(responds behaviorally to rewards like drugs | ~consciousness).

JQXVN's avatar

I just think your intuitions about what must be the case for RL to occur are misplaced. I don't think it's worthless as evidence, but I find other behavioral and neurological evidence more persuasive.

Basically, the ability to get addicted to drugs is something I would expect in almost any organism that can learn to modify its behavior in response to environmental contingencies in more than a couple of ways. If you think that kind of behavioral flexibility is good evidence for consciousness, I don't think addiction capacity should take you much farther, and I also think there's better ground to stand on.

Arie's avatar

More generally I am skeptical of empirical investigations of consciousness, since every phenomenon can always be explained materially.

Bentham's Bulldog's avatar

This means we can never decisively prove some set of NCCs (neural correlates of consciousness), but we can still get decent evidence.

JQXVN's avatar

There's a worthy critique of the limits of the approach here, but I don't think it's fair to say I just explained a phenomenon away materialistically. I explained the material basis and then showed that that material basis is dissociable from the experiential phenomenon of interest. If the example had been opioids rather than amphetamines, I wouldn't have quibbled, because opioids activate pleasure circuitry, not (or only much more indirectly, in this case) reward-learning circuitry.

Anlam Kuyusu's avatar

> I’ve also argued that pain in simple creatures is likely pretty intense. If pain serves to teach creatures a lesson, then simpler creatures would need to feel more pain, and creatures with simpler cognition would have their entire consciousness occupied by pain. In addition, the behavioral evidence reviewed by Rethink Priorities seems to suggest that it’s likely that many animals experience a lot of pain—they react quite strongly for stimuli, not like a creature in a dull, barely-conscious haze.

One hell of a job your God is doing. I am sure suffering of insects leads to some great Soul-Building. Congrats to Her.

SolarxPvP's avatar

While I don’t find most of your criticisms of Eisemann prima facie convincing, a lot of this information about insects is both new and surprising to me. I took Huemer’s claims about insect suffering and citation of Eisemann in “Dialogues on Ethical Vegetarianism” at face value, and when I looked into it I was skeptical of people denying his claims (from the vocabulary being used it sounded like they took any kind of reaction to harmful stimulus as being “pain,” which would make microbes morally relevant). This is much better!

I mostly skimmed because this was a long post. I’ll have to read the whole thing later. Most surprising, though, is the idea that insects actually react to severe injuries, because I’ve paid attention to how insects react to losing limbs, and they don’t seem to care. I think dismissing this as an anecdote is like dismissing the claim that the sky is blue as an anecdote!

Bentham's Bulldog's avatar

When Mike Tyson gets hit in the head, he seems to ignore it. Does this mean he doesn't feel pain?

SolarxPvP's avatar

Though I do love the idea that bugs are all little clones of Mike Tyson. This is an entertaining and intellectually compelling narrative that Big Pest Control wouldn’t want you to know about

CB's avatar

This is hilarious! XD

SolarxPvP's avatar

Does Mike Tyson lose entire limbs in combat? Half of his body? Also, Mike Tyson is in a conscious combat scenario. If Tyson got sucker punched randomly, he would likely react differently, though perhaps not as strongly as someone who didn’t have normal combat experience. Bugs don’t have Mike Tyson’s training or adrenaline when they randomly experience extreme injuries.

Bentham's Bulldog's avatar

But you might have evolutionary pressures for insects to feel pain weirdly or ignore certain kinds of pain.

SolarxPvP's avatar

“Might” is still “might.”

Bentham's Bulldog's avatar

It's likely that if they have simple brains that can't focus on many things at once, they'd focus on the most salient thing (e.g. sex).

SolarxPvP's avatar

Maybe, but I don’t have the knowledge to say that or otherwise.

Sei's avatar

Is there a reason we are certain that, say, paramecia do not feel pain? They avoid noxious stimuli and are capable of associative learning. A lot of the definitions of pain revolve around having a nervous system, but if we discovered a creature that acted just like a human despite having some sort of strange distributed chemical information processing rather than a nervous system we'd probably say it feels pain.

Bentham's Bulldog's avatar

They have no brain or central nervous system to integrate the information from the various signals.

Sei's avatar

Is a brain or a central nervous system required to have subjective "experience"? Presumably associative learning means that information is being stored and recalled somewhere, whether it's chemical or electrical.

Woolery's avatar

What an excellent argument. Detailed, clear, non-inflammatory, rigorous. Very persuasive stuff. More like this please.

Venkateshan K's avatar

This is a very interesting post. I agree that the likelihood of several species, including insects and crustaceans, being sentient and capable of valenced experience appears to increase the more we investigate and discover their behavior. While there are plausible counter-arguments against this view (some of which have also been described in these comments), these might just end up being vestiges of an effort to cling to a hypothesis that is going out of favor.

Nonetheless, I am much less convinced by the pain-intensity argument you have put forth. Even if one accepts your line of argument that pain probably plays a more important role in a less intelligent species, or that the neural correlates of pain represent a far greater fraction of all neural processing in an animal with a lower overall neuron count, the subjective experience of pain depends on the nature of the phenomenal consciousness that emerges in the species. And while it is true that naive assumptions about how sophisticated consciousness is based on absolute neuronal count are probably incorrect, it is also unlikely that overall neural complexity is *entirely* irrelevant to consciousness either. We just don't know how a single unified notion of consciousness emerges (and, while we are at it, why assume it is even unified in a different species?), and the characteristics of that consciousness may well depend on the underlying complexity of the neuronal wiring structure, firing patterns, and degrees of freedom.

CB's avatar

Thanks for the post! This is very valuable.

I find the argument convincing: why would evolution not include something like pain in autonomous animals, and why would they behave so similarly to creatures in pain if they were not conscious?

Nietzsche's Stache's avatar

Why do your arguments preclude oysters and plants, btw? Your inductive generalization does increase the probability that they too will one day come under these categories. It doesn't seem to me that locomotion is necessarily linked to pain reception. In fact, Jains have historically made this philosophical argument about such organisms, and argued that we have an inherent bias toward animal-like pain.

Bentham's Bulldog's avatar

They don't have brains and the things that I describe here as specific evidence don't apply to them. When confronted with an inductive trend of underestimating X, you don't just immediately assume everything has X.

Nietzsche's Stache's avatar

I mean, then the question boils down to whether pain is phenomenologically necessitated by brains. Which, by past induction, should reduce your confidence about what kinds of physical structures are relevant to consciousness and sentience. Now, that reduction is probably very minimal because it requires a larger paradigmatic shift (which I think is the better way of phrasing it), since saying an insect is conscious still involves neuronal activity, while a plant would require a fundamentally different physical theory.

I would think the problem is a tad bigger for non physicalists, especially those who already believe in immaterial minds and theism. Like I said, Jainism has a viable model of universal consciousness that also includes non-animal sentience.

User's avatar

Comment deleted (Nov 20, 2024)

User's avatar

Comment deleted (Nov 20, 2024)
Bentham's Bulldog's avatar

If your point is just "maybe these creatures are behaviorally like creatures in pain but aren't conscious," that's definitely possible but a pretty bad explanation of why so many things align as if they were in pain.

Jacob3's avatar

Oh wow, it gets worse. Just saw the demented strawman.

"like trying to divine the contents of a program with vague ideas like: a REST API is involved in communication, programs communicate, therefore every program has a REST API in it. This is just nonsense."

Such a clear case of straw manning. So uncharitable & snarky too. Disgusting. 🤮🤮🤮

Can't even accurately represent Matthew's reasoning. The bare minimum. Such a shameful, and demented, caricature. Ideology & motivated reasoning rotting the brain.

Ape in the coat's avatar

Regardless of the snarkiness, where is the strawmanning here, though? What exactly does KK's analogy miss in Matthew's reasoning?

As far as I can tell, Matthew's reasoning is about noticing that we are conscious and behave in a particular way, then noticing that clearly non-conscious things do not behave in such a way, and then inferring that other things, whose consciousness status is debatable and which also behave in such a way, are indeed conscious.

And the REST API analogy seems to work in exactly the same way. We know that some programs use a REST API and behave in a particular way. We also know that some things that clearly do not use a REST API do not behave in this way. Therefore the same reasoning pushes us to the conclusion that everything that behaves in this way, and may or may not use a REST API, is likely to be using a REST API.

What am I missing?

Jacob3's avatar

Assuming you are approaching this in good faith: Matthew gave 5 different arguments. You can read the article to see what they are. None of them correspond to the reasoning pattern:

"like trying to divine the contents of a program with vague ideas like: a REST API is involved in communication, programs communicate, therefore every program has a REST API in it. This is just nonsense."

You can verify this yourself. Look for this reasoning pattern in Matthew's article. You'll never find any paragraph expressing this.

And, to be clear, properly shaming retarded strawmen is not "snarky".

Ape in the coat's avatar

> Matthew gave 5 different arguments. You can read the article to see what they are. None of them correspond to the reasoning pattern

The second argument is explicitly about that, and I'm surprised that you don't see it. Let's look at it together to figure it out.

> Second, these creatures act in most ways like they’re in pain. If an insect or shrimp is exposed to damage, they’re struggle with great effort and try to get away. Either they are in pain or they evolved a response that makes them behave like they’re in pain. But if a creature struggles and tries to get away, a natural inference is that they’re in pain.

In other words:

1) A huge class of creatures C, including insects and shrimps, try to get away from the source of harm

2) humans do it due to pain

3) therefore every creature from C, including insects and shrimps, feels pain

The argument seems to be isomorphic to the alleged strawman:

1) A huge class of programs C communicate

2) REST API programs do it due to REST API

3) therefore every program from C uses REST API

So, what am I missing?

User's avatar

Comment deleted (Nov 22, 2024)
Jacob3's avatar

Couple of ambiguous terms there. What's meant by "undifferentiated mess"?

Jacob3's avatar

Typically when you drop cheap zingers like “that’s nonsense!” you need some reasoning for why it’s BS. So do you actually have any argument that “Your ideas of pain and consciousness are undeveloped”?

@BenthamsBulldog People like @TheKoopaKing just offer emotional sperging & vague shit talking hoping smarter folks don’t notice they lack a critique. I’d either force a retraction or extract the premise-conclusion arg.

User's avatar

Comment deleted (Nov 22, 2024)
Jacob3's avatar

"Matthew doesn't explain what pain or consciousness concepts he is using in this post, so his ideas of them are simply undeveloped."

So the view is that if you don't explain what you mean, it is "undeveloped"?

By this standard, most of what you said is undeveloped. Your comment used words like "nonsense", "fundamentally confused", "concept", etc. I didn't see you provide any explanation of what "fundamental" means, for example.

Just have reasonable standards. Einstein's theory of relativity makes various truth claims. We wouldn't call Einstein's relativity "undeveloped" because it doesn't specify a theory of truth.