I would expect a hybrid meat-computer personality to notice that at some point in the transition. If not -- if consciousness is so profoundly ethereal that a person's own consciousness can be gradually replaced with a simulated one without the person themselves even noticing -- then what's the difference between the "real" consciousness and the simulation?
>then what's the difference between the "real" consciousness and the simulation?
Well, isn't that the whole idea? Objectively, there is no difference. If there were, uploading wouldn't ever be possible.
The only potential difference is the subjective impression of "this consciousness that I am now is the same consciousness that I was yesterday" versus "this new consciousness is an exact clone of mine, but is not really 'me'".
This incremental upload technique simply tricks your subjective self into thinking there is absolutely no change in continuity.
No -- rather, there is no change in consciousness either way. But if you do the upload in one jump, your subjective self sees the discontinuity and mistakenly concludes there was a change in consciousness. You have to do the process gradually to trick it out of that error.
These are tiny, atomic, piecemeal changes that happen slowly over time.
If one of your neurons randomly died right this instant, you would probably not notice it subjectively, even though that neuron is part of the substrate of your subjective consciousness. Your neural network is highly redundant and fault-tolerant.
Yes, neurons in our brains are dying every day, but it doesn't affect our subjective consciousness. Are you simply saying that uploading, if it were done carefully, would not make any more difference to our subjective consciousness than neurons dying every day?
You wouldn't notice because "you" would no longer exist, by hypothesis. But I would expect that there would also be a huge objective difference in your behavior, easily noticeable by others. The "self-awareness part of your brain" is not disconnected from the rest of your brain and body; if it dies, the rest of your brain and body is going to be drastically affected.
Well, you'd have 2 mental clones of yourself, essentially.
It depends on where those neurons were sent off to in the meantime and what they were doing, but with such a big jolt, it's quite possible they would perceive themselves as the "clone" (after understanding the situation at large), while you in your new machine self would be the "original you".
Why would it be a clone, and why would it perceive itself as a clone, considering that it is made up of the original matter in the original configuration? I would assume that a technology powerful enough to re-assemble disintegrated matter into something as complex as a brain would also keep the state identical to its original form. Assuming that thoughts and perceptions are completely physical, and the brain is in the same state as before, there is no reason for it to experience a jolt or anything else that indicates something special occurred.
The difference would be that in the case of a simulated consciousness, I would be dead and wouldn't get to experience any of the pleasure that fake-me experiences, for making a simulated consciousness is the same as making a philosophical zombie.
A philosophical zombie is an incoherent concept. The idea that your consciousness could disappear while leaving all of your external behaviors unchanged makes no sense. Your consciousness affects your external behaviors; if you were not conscious, your external behaviors would be different.