It wouldn't be me that woke up as a copy; it would be a copy that feels just like me. But since he would have my intellect, he would reason that he is not the original, and if the original had died, he would feel sad about that. On the other hand, he might be excited to find himself in a machine body, since that fate might not befall him for a very, very long time.
Here is where you're wrong. You presuppose the "he" and "I" when they are perfectly interchangeable right up until the cloning process. You can't assume that because you're the original now, you will be the original after the cloning process; that assumption has a 50% chance of being wrong. Right now, you and the clone are one and the same, so any reasoning about "me versus him" done before the cloning is shortsighted. You have to reason like this: "If I wake up as the clone, then X. If not, then Y." It's like calling fork() in your code and then writing everything after it on the presupposition that you're the parent process, without ever checking the return value. After the cloning, you don't know that you're the original until you see evidence that you are.
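To make the fork() analogy concrete, here's a minimal sketch (in Python, whose os.fork wraps the same POSIX system call). Both processes resume from the same line with identical state; the only way either one learns which it is, is by inspecting the return value, exactly like waking up and checking for evidence:

```python
import os

pid = os.fork()  # two identical processes continue from this line

if pid == 0:
    # fork() returned 0: this process is the child ("the clone")
    print("child: I woke up as the copy")
    os._exit(0)  # leave before running any parent-only code
else:
    # fork() returned the child's pid: this process is the parent ("the original")
    os.waitpid(pid, 0)  # reap the child
    print("parent: I woke up as the original, child pid", pid)
```

Code written before the fork() call cannot say which branch "it" will end up in; both branches run, each in a process that remembers writing the code.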
Think of it this way: suppose instead of a brain copy it's a full-body copy. You walk into a dark room where you can't see anything. They knock you unconscious, and you wake up lying next to what appears to be you. How do you know whether you're the original? Because you were the original before? Sorry, but no.
Each running instance of me is a separate, living person. Forget about the memories for the moment. If you kill that person, they are dead. It doesn't matter who is the copy; I could be a clone for all I care. If you kill this body--the one I'm IN--then my stream of thoughts ends. Other similar streams of thought offer me no comfort.