Hacker News

Complexity isn't what makes something intelligent or conscious after all.

Why not?




This shows that you can't just say "complexity makes something intelligent and conscious" by itself; you have to explain how it works, and "complexity" isn't an explanation, it's just a label. I agree.

But that's equally true of saying "complexity doesn't make something intelligent and conscious". All right, then what does? "Not complex" is a label just as much as "complex" is. Either way you haven't explained anything. That's what I was trying to get at.

Also, saying "complexity doesn't make something intelligent/conscious" is not the same as saying "something intelligent/conscious doesn't have to be complex", which is what I think the statement I originally responded to, taken in context, was really intended to mean. Do you really think intelligence and consciousness can be present in simple things like rocks? Might there not be a reason that the human brain is three pounds of complex matter, not three pounds of jello?


Yes, you are right: it is likely that anything worthy of being considered conscious would be fairly complex. (Though I believe it is possible to create an intelligence that is initially fairly simple. AIXI for example is the ultimate intelligence and could be created in a few dozen lines of code. It would just take nearly infinite computing power to do anything interesting, and I wouldn't really consider it "conscious". Even the processes going on in your brain, at least the very basic function that makes us "intelligent", once you strip away all the uninteresting additions to it, could probably be specified in a very small space. A lot of neurons may seem complicated, but you only need to understand the programming of a few, then copy them a billion times. But all of this is beside the point.)
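To make the "few dozen lines" intuition concrete, here is a toy sketch of the induction half of an AIXI-style agent, shrunk to a finite hypothesis class so it actually terminates. The "programs" here are an invented stand-in (repeating bit patterns), not real Turing machines; each is weighted 2^-length, mirroring the universal prior's bias toward short programs.

```python
# Toy Solomonoff-style predictor: mixture over all short "programs"
# consistent with the observed history, weighted by 2^-length.
# Hypothesis language (repeating patterns) is illustrative only.

from itertools import product

def hypotheses(max_len):
    for n in range(1, max_len + 1):
        for bits in product("01", repeat=n):
            yield "".join(bits)

def generates(h, history):
    # A "program" h emits its pattern repeated forever.
    return all(history[i] == h[i % len(h)] for i in range(len(history)))

def predict_next(history, max_len=4):
    # Weighted vote of every consistent program; short ones dominate.
    w0 = w1 = 0.0
    for h in hypotheses(max_len):
        if generates(h, history):
            w = 2.0 ** -len(h)
            if h[len(history) % len(h)] == "0":
                w0 += w
            else:
                w1 += w
    return "1" if w1 > w0 else "0"

print(predict_next("0101"))  # the short program "01" dominates -> "0"
```

The full AIXI agent wraps a predictor like this in an expectimax search over future actions and rewards; that is where the "nearly infinite computing power" comes in.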

Anyway, in the original context where I said that, I was trying to come up with a satisfactory way to distinguish consciousness from non-consciousness. Complexity obviously isn't the way since lots of complex things are not conscious. It may be that all conscious things are also complex. But then complexity isn't the reason it's conscious, it just happens to correlate with it.

Setting a variable in a computer equal to "happy" obviously doesn't make the computer experience happiness. But then what possible sequence of commands and state changes would? If you accept that there is no possible sequence that would, then you have to accept that humans cannot experience consciousness either, because for all we know we are running in a computer too. Even if we are not, the process our brain follows could easily run in one. The fact that it runs on atoms bouncing into each other rather than electrons flowing through logic gates makes no difference.


AIXI for example is the ultimate intelligence and could be created in a few dozen lines of code.

Really? Show your work, please.

A lot of neurons may seem complicated, but you only need to understand the programming of a few, then copy them a billion times.

This assumes that they are all "programmed" the same. Why do you think that must be the case? Also, you're leaving out all the chemical processes that contribute to brain function.

I was trying to come up with a satisfactory way to distinguish consciousness from non-consciousness. Complexity obviously isn't the way since lots of complex things are not conscious.

True.

It may be that all conscious things are also complex. But then complexity isn't the reason it's conscious, it just happens to correlate with it.

The connection might well be stronger than "just happens to correlate". Even if "complexity" by itself isn't the reason it's conscious, it might well be that consciousness requires a complex substrate.

Setting a variable in a computer equal to "happy" obviously doesn't make the computer experience happiness. But then what possible sequence of commands and state changes would?

Um, a much more complex sequence of commands and state changes?

If you accept that there is no possible sequence that would

I don't. I just think there is no simple sequence that would.


I really do believe that consciousness doesn't really have anything to do with intelligence. Even if humans are not simple, it's conceivable you could create something like a human with very simple machinery, just massively scaled up.

If that is too complex, you could create an even simpler algorithm which produces it, maybe a genetic algorithm which "evolves" human-like intelligence over billions of generations.

In both cases the end result may be complex, but the algorithm that creates it is not. That's true whether it's a network of billions of interconnected neurons arising from the simple programming of a few, or a complex intelligence optimized by simple random mutation and selection.

This is the idea of emergence: that complex seeming behavior can "emerge" from simple rules and a simple process.
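The standard minimal demonstration of this (not from the thread, just an illustration) is an elementary cellular automaton. Rule 30's update rule is a single table lookup, yet from one live cell it produces intricate, irregular structure; its center column has even been used as a pseudorandom generator.

```python
# Elementary cellular automaton Rule 30 on a ring of cells.
# The entire "physics" is the one-line lookup in step().

def step(cells, rule=30):
    n = len(cells)
    return [
        # Encode the 3-cell neighborhood as a number 0-7, then read
        # that bit of the rule number to get the next state.
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

width = 31
row = [0] * width
row[width // 2] = 1  # single live cell in the middle
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Nothing in `step` mentions triangles, chaos, or randomness; all of that "emerges" from iterating the rule.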


I really do believe that consciousness doesn't really have anything to do with intelligence.

The problem is that both of these terms are really too vague. What do you mean by "consciousness"? Some people claim insects are conscious; some claim bacteria are conscious; some even claim electrons are conscious.

Also, what do you mean by "intelligence"? By some definitions, the process of evolution by natural selection is "intelligent".

If by these terms, you mean "the properties humans have that we call consciousness and intelligence", then why do you think there is no connection between them? Does your own intelligence really have nothing to do with your consciousness? Or vice versa?

It's conceivable you could create something like a human with very simple machinery, just massively scaled up.

Conceivable, yes. Likely, I don't think so. But that's really a matter of opinion; it depends on how much of the complexity in the parts of the human brain (neurons, chemicals, etc.) is necessary for human consciousness or intelligence, and how much you think is just an artifact of the implementation, so to speak. We don't know enough about how the brain works yet to really judge.

the end result may be complex, but the algorithm that creates it is not.

In other words, the complexity is still there.

That complex seeming behavior can "emerge" from simple rules and a simple process.

Wait a minute--complex seeming? Or complex?

If you're trying to argue that the complexity may not be in the algorithm itself, but only in the actual working out of the algorithm in time and space, I'll buy that.

But if you're trying to argue that it isn't "really" complex because the algorithm is simple, I don't buy that. I never said the algorithm had to be complex; I just said there has to be complexity somewhere for there to be (human-level, to avoid all the definitional issues I raised above) consciousness and intelligence. That's perfectly consistent with the complexity ultimately being built out of simple parts--just a lot of them with a lot of interactions, so the complexity is in the interactions, not in the parts themselves.

(I think it's likely that the parts themselves have significant complexity too, as I said above, but even if that's true, there will still be some lower level, possibly much lower, where there are simple parts, just a huge, huge number of them with a lot of interactions.)


That was actually a typo. I meant to type "consciousness doesn't really have anything to do with complexity", in the sense that you could probably define an intelligent being like a human in very little space.

>In other words, the complexity is still there.

Well yes, in a way. By the definitions used in information theory, complexity is just the number of bits you need to accurately describe something in some language. In common usage, complexity means roughly the number of concepts you need to learn to understand something (about the same thing). But it can also mean the number of moving parts a machine has, so to speak, which could be very large while the machine is still simple enough to understand or write down on paper, like a computer display which has a few thousand pixels, each of which is almost exactly the same.
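That information-theoretic notion can be made concrete with an off-the-shelf compressor as a crude, computable stand-in for description length (Kolmogorov complexity itself is uncomputable, and the million-pixel figure is just for illustration): a huge number of identical "pixels" has a tiny description, while the same number of random ones does not.

```python
# Compressed size as a rough proxy for description length.

import os
import zlib

uniform = b"\x00" * 1_000_000   # a million identical pixels
noisy = os.urandom(1_000_000)   # a million pixels of static

print(len(zlib.compress(uniform)))  # tiny: "a zero, a million times"
print(len(zlib.compress(noisy)))    # ~1,000,000 bytes: no short description
```

By the moving-parts sense of "complexity" the two buffers are the same size; by the description-length sense they could hardly be more different.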

I don't know if you would call a computer display complicated, at least not much more so than the individual pixels that make it up. The same is probably true of humans. Even if humans are fairly complicated, someday a connectionist approach, or maybe a rough approximation of a human brain, might succeed at creating something arguably conscious.

>If you're trying to argue that the complexity may not be in the algorithm itself, but only in the actual working out of the algorithm in time and space, I'll buy that.

This may be a much better way of describing what I just said above. So yeah.



