
K sooo to (kinda) answer the question: depends. What's your definition of consciousness?

Now as for the implication that anesthetics sedating plants has ANYTHING to do with this/whether or not they are conscious: no. No sorry, not buying it. Need more science.

This just suggests we may have similar inhibitory biochemical responses to certain organic compounds.

That's not surprising since we're all made out of hydrocarbons & many anesthetics have relatively simple structures.




Yes, all those discussions dissolve once you use a precise definition, but there is no generally accepted one.

The definition must apply to humans. It must not apply to rocks. It should probably not apply to plants, fungi, bacteria, and worms. It will probably apply to dolphins, dogs, and pigs. Will it apply to companies, robots, cities, ant swarms, chickens, or fish?


What scale of "rocks" are we talking about?

I'm reminded of the xkcd comic about making a computer out of tons of rocks in the desert[1]. Sure, one rock is crystalline (enough) and has essentially no stimulus-processing power. But a pile of rocks, arranged just so, can be set up to respond to certain stimuli in a way that, to an outside observer, looks like thought/consciousness. This is the Chinese room problem[2] recast.

By approaching it in this way, we are forced to reconsider: what's so special about humans that makes your definition of consciousness more applicable to them than to a rock or a plant? Why exclude rocks at all?

I'm firmly in the camp that any attempt to quantify consciousness will necessarily be a matter of degree, with no system exhibiting zero consciousness except a perfect crystal (i.e., zero entropy).

[1] https://www.xkcd.com/505/

[2] https://en.wikipedia.org/wiki/Chinese_room


I agree in general, but a degree seems even harder to pin down than a binary condition?


Sorry this is late, but check out Dr. Giulio Tononi's mathematical apparatus called Integrated Information Theory. In a nutshell, it considers a system and all the causal relations within it, measures some quantity related to the number of states that system can take, and then considers the graph cut that minimizes this quantity in the subsystems. The difference between the uncut and cut graphs gives a degree of "integrated information", which, it can be argued, is a useful analogue of consciousness or even its very definition.
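
To give a flavour of the graph-cut part, here is a crude toy sketch (my own illustration, not Tononi's actual phi, which is computed over cause-effect probability repertoires). It just proxies "information" by causal edge weights and looks for the least-damaging bipartition:

    # Crude illustration of the "minimum information partition" idea.
    # NOT Tononi's actual phi: real IIT works over cause-effect probability
    # repertoires; here "information" is proxied by causal edge weights.
    from itertools import combinations

    def integration_proxy(nodes, edges):
        # edges: {(src, dst): weight}, a toy stand-in for how strongly
        # src constrains dst's next state.
        whole = sum(edges.values())                  # the uncut graph
        best = None
        node_list = sorted(nodes)
        # Try every bipartition of the system into A and its complement.
        for size in range(1, len(node_list) // 2 + 1):
            for part in combinations(node_list, size):
                a = set(part)
                # The "cut" graph keeps only links that stay inside one part.
                within = sum(w for (s, d), w in edges.items()
                             if (s in a) == (d in a))
                loss = whole - within                # what the cut destroys
                if best is None or loss < best:
                    best = loss
        # Difference between the uncut graph and the least-damaging cut:
        # zero whenever some part is causally independent of the rest.
        return best if best is not None else 0.0

    # Two mutually coupled nodes can't be cut for free:
    print(integration_proxy({"x", "y"}, {("x", "y"): 1.0, ("y", "x"): 1.0}))   # 2.0
    # Two disconnected pairs can, so the proxy "phi" is zero:
    print(integration_proxy({"a", "b", "c", "d"},
                            {("a", "b"): 1.0, ("b", "a"): 1.0,
                             ("c", "d"): 1.0, ("d", "c"): 1.0}))                # 0.0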


Thanks for the Chinese Room experiment link. I had never heard of it. I don’t suppose anyone has any good book recommendations around this train of thought?


By your logic then plants should have consciousness too.

Alternatively, are you implying that human consciousness is metaphysical in nature?


> What's your definition of consciousness?

How about: the ability to adapt to the environment and select good actions depending on the situation, learning from reward/loss signals.
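
To make that concrete, here's a toy sketch (purely illustrative, nothing to do with the article): an epsilon-greedy bandit learner arguably already ticks those boxes, since it selects actions and adapts from reward signals.

    # Toy epsilon-greedy bandit: it "selects good actions" and "learns from
    # reward/loss signals", which is all the stated definition asks for.
    import random

    class BanditAgent:
        def __init__(self, n_actions, epsilon=0.1):
            self.epsilon = epsilon
            self.values = [0.0] * n_actions   # running reward estimate per action
            self.counts = [0] * n_actions

        def act(self):
            # Explore occasionally, otherwise exploit the best estimate so far.
            if random.random() < self.epsilon:
                return random.randrange(len(self.values))
            return max(range(len(self.values)), key=lambda a: self.values[a])

        def learn(self, action, reward):
            # Incremental average update from the reward/loss signal.
            self.counts[action] += 1
            self.values[action] += (reward - self.values[action]) / self.counts[action]

    agent = BanditAgent(n_actions=3)
    for _ in range(1000):
        a = agent.act()
        # Hypothetical environment: action 2 pays best on average.
        reward = random.gauss([0.1, 0.5, 0.9][a], 0.1)
        agent.learn(a, reward)
    print(agent.values)   # rough estimates of each action's payoff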


That seems more like a definition for intelligence or learning.

My favourite definition for consciousness is the following:

If there is something it is like to be a particular thing, then that thing is conscious.

If there is something it is like to be a bat, then a bat is conscious. If there isn’t anything it is like to be a bat - the “lights are off” so to speak - then it’s not conscious.

Edit: clarity


You probably couldn't use this definition even if you had a magical device which transforms you into a bat and then back again.

After you are yourself again, you'll need to interpret the acquired memories. You remember the echolocation view of the room (bats have memory; it's testable), hunger, disorientation, and some other sensations. Does that mean the bat is conscious? What would you remember differently if the bat wasn't conscious? You'd remember nothing? But the memories are there, and you should have access to them. The lack of self-awareness? But you don't know how you would remember the presence of self-awareness in a bat.


But the definition isn’t whether there is something it is like for a human to be a bat. Human experience doesn’t really come into it.

There is something it is like to be a human. But the conscious experience of a bat (if it exists) may be completely alien to us in every way. That’s not relevant to what it is like to be a bat. The only important thing is whether there is something it is like to be a bat.


What good is a definition you have no conceivable way of applying? How would you go about testing whether a bat is conscious according to this definition?

It is based on our illusion of understanding what it is like to be a human, but upon closer inspection it bears no real content.


I think you could still use the definition. Because you're still conscious; you're just something that is aware of its ability to change shape and maintain its awareness through that change.

Nothing is really static in life. So nothing ever really stays just one thing: it was one thing, and now it's a slightly different thing.


From Thomas Nagel's "What is it like to be a bat?", right?

At least as you've stated it, that's a circular definition.


It was from a conversation between two neuroscientists. They were talking about Nagel earlier in the discussion. I don’t know if Nagel came up with that definition though. Could you explain how the way I stated it is circular?


> If there isn’t anything it is like to be a bat - the “lights are off” so to speak - then it’s not conscious.

I guess I'm focusing more on the "lights are off" part, although that's just an aside. "Conscious" and "lights are on" seem like alternate ways of saying exactly the same thing.

Nagel's "is it like something to be" terminology has always struck me as wilfully contorted and confusing, but amounting to the same thing again -- you're conscious if you're conscious. It doesn't help us explain consciousness and it doesn't even help figure out what things are conscious and which aren't. Is a plant conscious, or a rock? Writers can certainly imagine what it might be like to be those things.


Yeah. How do you slice consciousness?

Consciousness, like Artificial Intelligence, is a slippery concept. They both tend toward being unattainable by anything-other-than-human:

i. As we implement progressively more abstract concepts in computers, those layers of abstraction become mundane things that computers can do.

ii. As we learn more and more about how non-robot-like other organisms are, we can quite easily push consciousness further into the human-only corner by just adjusting the definition of consciousness.

Insofar as consciousness can be defined as the state of being aware of and responsive to one's surroundings, then yes, plants certainly are conscious.

Do they possess self-awareness?


Consciousness described as a physical process is different from explaining how you, visarga, are able to experience reality. Understanding how that idea of internal experience, of being, could relate to a plant seems to be the revelation here.


So a Roomba is conscious?


Why not? And if its power source runs out, the Roomba loses consciousness.

That seems closer to the everyday definition of consciousness most people operate by than to the hopelessly bikeshedded attempts at making it a uniquely human thing.


Humans have a somewhat more elaborate internal representation of themselves than a Roomba (which doesn't have any; early versions even had no representation of the room they operate in).


We don’t have a point of reference outside of WALL-E to presume a robot is conscious in any way. However, a living organism like a plant may experience being.


Ooh, a new one. I need to start collecting these. There's just such incredible variety.


Not particularly new.

The dictionary I have close at hand claims:

consciousness |ˈkɒnʃəsnɪs|

noun

1 [ mass noun ] the state of being aware of and responsive to one's surroundings: she failed to regain consciousness and died two days later.

2 a person's awareness or perception of something: her acute consciousness of Luke's presence.


OP's definition doesn't include "awareness" at all, which I think is an essential part of both of those definitions. Although now I've said too much.


I've been meaning to get back to this; apologies for the delay.

If a bacterium senses a change, say in salinity, and moves away, can we say the bacterium is/was aware of the change?

What does awareness mean? It sounds at least as slippery as "conscious".



