> "really understanding" is just something a computer program thinks it can do once it gets complex enough to be conscious.
At the same time, consciousness might not be a prerequisite of higher intelligence at all; it could merely have been evolutionarily advantageous early in the development of complex brains because of our natural environment... it's hard to imagine an intelligent animal with no "me" program doing very well.
But maybe a digital intelligence (one that did not evolve having to worry about feeding itself, acquiring scarce resources, mating, communicating socially, etc.) would have no use for a central "me" program that "really experiences" things.
Such a creature is kind of eerie to think about.