>> Equating reaction to stimuli with pain is too simplistic.

Just to clarify this: my comment says that animals react to stimuli and that pain is a stimulus that they use to avoid dangerous situations (or behaviours), not that reaction to stimuli entails the ability to feel pain.

So I'm asking: if (some) animals don't feel pain, then how do they know to avoid dangerous behaviours? What is the mechanism that keeps them from, say, jumping into the fire?

Does that help clarify what I mean?




> if (some) animals don't feel pain, then how do they know to avoid dangerous behaviours? What is the mechanism that keeps them from, say, jumping into the fire?

This is exactly why the robot analogy originally raised by Hendrikto is so powerful. Robots, neural nets, and the like illustrate that it is possible for an agent to avoid dangerous behaviours without being motivated by feelings.

I agree that this doesn't prove that insects don't have feelings - it only shows that an agent doesn't need to be conscious to exhibit these sorts of behaviours.

Where I disagree is with your claim that it has no bearing at all on the argument. It does bear on the argument, because it presents an alternative hypothesis that doesn't involve attributing mental life to insects. Very (very) roughly speaking, the alternative is something like this: "An insect brain is a neural network which, while not created by a conscious designer, was chiseled by evolution to respond to stimuli in such a way that dangerous behaviours are avoided - in much the same way that an Atari-playing artificial neural network trained by reinforcement learning avoids dangerous behaviours that would end the game. And just like the ANN playing Atari, no consciousness is required - just the right model parameters. Similarly, no conscious designer ever explicitly told the neural network what to do. The programmer just set up the right incentives, in the same way that nature set up the right incentives for the insect."
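To make the "right incentives" point concrete, here's a toy sketch (tabular Q-learning in Python, on a made-up one-state world; the action names and rewards are invented for illustration, not anyone's real experiment). The agent ends up avoiding the fire purely because a number gets updated - nothing in it feels anything:

    import random

    # Hypothetical actions and rewards; "jump_in_fire" is penalized.
    ACTIONS = ["stay_safe", "jump_in_fire"]
    REWARD = {"stay_safe": 0.0, "jump_in_fire": -1.0}

    q = {a: 0.0 for a in ACTIONS}  # the entire "brain": two numbers
    alpha, epsilon = 0.5, 0.1      # learning rate, exploration rate

    for episode in range(1000):
        # Epsilon-greedy: mostly exploit the learned values, sometimes explore.
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=q.get)
        # Standard Q-update, simplified to a one-step, single-state problem.
        q[a] += alpha * (REWARD[a] - q[a])

    print(q)  # q["jump_in_fire"] ends up near -1.0

After a few hundred episodes the greedy policy never jumps in the fire. That's the whole "avoidance behaviour", and there is obviously no pain anywhere in it.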

I can see the arguments on both sides, but I think the "insects are kinda like robots" hypothesis is more plausible because, in terms of the number of model parameters, an insect brain is far closer to a deep Q-learning network than to a human brain.
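For rough scale (ballpark, order-of-magnitude figures only): a fruit-fly brain is on the order of 10^5 neurons and 10^7 synapses, the Atari-playing DQN had on the order of 10^6 learned parameters, and a human brain is on the order of 10^11 neurons and 10^14 synapses. On that axis, the insect sits far closer to the network than to us.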


> I agree that this doesn't prove that insects don't have feelings - it shows that an agent doesn't need to be conscious to exhibit these sorts of behaviours.

This conclusion requires an understanding of what endows something with consciousness. (I won't use the word agent, because the relationship between consciousness and agency isn't clear. Agency may require consciousness, but consciousness needn't require agency.)

We do not know what part of the human brain is responsible for human consciousness. Anything we create may also, as a byproduct or an emergent property, create consciousness. Perhaps the only thing consciousness needs is an ability to sense and respond to stimuli.


If a doorbell is conscious then we've made "conscious" into a useless term.


That's like saying that if a doorbell is made of atoms, we've made "atoms" a useless term. A doorbell either has consciousness or it does not. (I might agree that if a doorbell has agency, we're making "agency" a useless term.)

Consciousness as we understand it is defined in an ineffable way: we each have individual experiences, which we call 'consciousness', and which roughly coincide with our subjective experience. As it stands, the term is fairly useless. Where we use it, we generally just assume that it applies or doesn't, almost exclusively in a self-serving way.

If a doorbell is conscious and we can prove it, we will necessarily have a much greater understanding of consciousness and will have refined the term consciousness, making it more useful.


"atoms" isn't a useless term, but describing real-world observable objects as "atomic" is useless.

In the same way, we can talk about a spectrum of consciousness and study it, but if the definition is to sense and respond to stimuli, then the yes/no of "conscious" is always yes and therefore useless.

I'm not sure what you mean by "prove it". A doorbell very clearly responds to a stimulus. ...if you mean the definition of the word, no word's definition has ever been "proven". People assign definitions to words.


You're conflating the definition with the correlated properties.

> if the definition is to sense and respond to stimuli, then the yes/no of "conscious" is always yes and therefore useless.

This isn't the definition. This is a hypothetical cause. The cause might be anything; we don't know (yet?) what objective indicators there might be of subjective experience.

Consider seeing. In order for a thing (be it robot, animal, or human) to see, it must possess some type of sensory organ and a method of interpreting the signals from that organ. So, to test for the ability to see, we can look for those things. We can be pretty sure that fish can see, because they possess eyes and brains and respond to visual stimuli. But that doesn't mean the properties has_sight_organ and has_brain are themselves "seeing".

In the same way that physical matter is all made of atoms (loosely; let's not be pedantic), we might find that all things possess consciousness. We might not. But in neither case is the term made useless. Sure, if we find that to be the case, then describing a thing as conscious or not becomes similar to describing an object as atomic. But then we're just moving goalposts.


> This conclusion requires an understanding of what endows something with consciousness

Either you mean A) a "full" understanding or B) a partial understanding. If you mean A, then I disagree: we can talk sensibly about markets even though we don't fully understand them. If you mean B, then I think we have it. I'll be the first to admit that it's pretty shoddy, but it's not unusable. I love Dylan16807's point:

> If a doorbell is conscious then we've made "conscious" into a useless term.



