I don’t understand where the “paradox” part comes in:
You have a mind and are conscious. You now create a copy of your mind and let it run on some other hardware, e.g. the teleported body. You are still conscious in your old body and a new entity is created in the other body. You can also create ten new copies, with ten consciousnesses, and you will still be conscious in your old body. You now destroy your body - you’re dead.
How can you say for sure you are in the old body? After all, the new body is an indistinguishable copy of you; indeed, the old body has since changed and now differs more from the you at the time of cloning than the new one does. “You are in the old body” might be the common sense answer, but short of invoking quantum woo there is no reason to believe it’s true.
Even the ancients struggled with the question “why does a soul stay connected to a particular body?” And they were playing on easy with their belief in the soul, this problem feels so much harder for the materialist.
>The old body has not changed. The same software is still running on the same hardware, what would change?
But bodies do change over time; cells use molecules from food to create new ones, old cells die, people lose limbs. Now we're dealing with a Ship of Theseus[0] problem. And what if somebody has their arm cut off? Are they not the same person, but a fraction of their previous identity? Is their arm the other fraction?
If you want to argue that the mind defines the person, then what about dementia? Some people will say they aren't the same person as before, but where's the line? Is a forgetful person doomed to never be the same person day to day?
In my opinion, the only consistent belief is that the "self" is a useful mental model and nothing else. A leaky abstraction we shouldn't use for solid reasoning. Just as paper money only has value because we agree to use this simple concept, "self" only has meaning because it helps us survive and achieve goals.
The 'paradox' here is not how the person changes over twenty years, but what happens in the very moment of creating a copy.
We're not dealing with a Ship of Theseus problem, that's something else entirely.
I personally am horrified by possibly developing dementia one day, exactly because of that. But this is also a different problem because it affects the original body.
Also the moment all these copies are created their states would start diverging. It would not take long for them to become different people.
Tangent: this line of thinking, on reflection, has led me to become profoundly forgiving. Different paths could have led me to be homeless, in prison, an insufferable bastard, insane, and so on. Other paths may have made me a better person than I am now. Even worse, it’s hard to tell what those paths might have been except far in retrospect.
When we judge others harshly we really are being profoundly small minded and provincial.
Sure, but that's not related to the question at hand. I don't think it's controversial that people with different experiences will develop differently, and that twins are, in fact, different people.
It doesn't work that way. You have that copy of your "consciousness" only as long as you are awake. In deep sleep - as opposed to REM phases, where brain waves are measurable - or when highly dosed with narcotics such as for heavy surgery, you (in the sense of your consciousness) do not exist anymore - no brainwaves. When you wake up, or to be precise when you enter REM phases, your mind slowly boots up again. Awake, you reach out to your assets, read memories, and cross-checking references allows you to re-establish confidence in who you are.
Think of Firefox. You start it and it "lives" in the memory of the computer. However, turned off, it is gone; the process is eliminated and purged from memory. When restarted, however, it remembers your browser history, sessions and passwords. Still, it is a new process.
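In code terms, a minimal sketch of that distinction (hypothetical Profile/BrowserProcess names, purely to illustrate the analogy, not how Firefox actually works):

    // Hypothetical sketch: persistent state on disk vs. process identity in memory.
    interface Profile { history: string[]; passwords: Record<string, string>; }

    class BrowserProcess {
      constructor(public readonly pid: number, public profile: Profile) {}
    }

    const disk: Profile = { history: ["news.ycombinator.com"], passwords: {} };

    const run1 = new BrowserProcess(1001, structuredClone(disk)); // first launch
    // ...the process exits; its memory is purged; only `disk` survives...
    const run2 = new BrowserProcess(2002, structuredClone(disk)); // restart

    // Same remembered state, but a different process:
    console.log(JSON.stringify(run1.profile) === JSON.stringify(run2.profile)); // true
    console.log(run1 === run2);                                                 // false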
Basically this is what happens each night when we sleep. Every morning a new you wakes up.
So when a cloned asset of cross-linked (aka relational) memories is spun up, the resulting process has no distinct way to determine whether it has been cloned. It just has the same associative pathways in its wetware as the original. Yet it will believe it is the original and act accordingly. When you ask outsiders who know you, they won't be able to make out a difference, and will label the clone as the original. The clone (depending on the age of the information) will also believe it is the original, as it knows your passwords, your mental setup (and lives by that, because of reasons) and it remembers whatever has happened to you, too.
This can only mean one thing: our so-called consciousness is a fake conclusion. Yes, cogito ergo sum, but only for the moment. New day, new me.
You are assuming there is no contiguity of self during sleep, but present no evidence for this idea. There's plenty of evidence, however, that sleep is closer to an altered state than a cessation/reset of state. The brain is intact and active during all stages of sleep. We dream every night and merely forget most of them, and remembered dreams are certainly contiguous with waking consciousness.
The analogy between "sleep" in biological brains and "sleep" functions in modern operating systems is only that - an analogy.
Think of Firefox: same hardware (body), same profile (neurological pathways), yet a new process when restarted, because no brain activity is measured in deep sleep, only during REM phases.
It's a bit more complicated. The human body is not simply a CPU and memory executing dumb instructions (well, it KIND of is at some level); it's impacted HUGELY by environmental variables. Suppose you're from the 1700s and your clone behaves badly in his diet (fast food is helluva drug), develops hypertension, some form of brain cancer and starts hallucinating. Is that most definitely still the same you? After all, there were no fast-food chains in the 1700s.
My main point is it's not as cut and dry. Yes the body is essentially a machine "executing" your consciousness. But that consciousness is self-modifying, and modified by the environment.
Well of course once they start interacting with the environment they will become 'different people' - surely you don't think I think twins are essentially only a single person.
How is it a paradox that people develop differently when they have different experiences?
The important question is: if the transporter works as originally designed - dematerializing you in the process - is this just moving you, or is it killing you?
The option of modifying it to not dematerialize you, but copy you, is how this question stops being tautological. Imagine there's a "copy/move" mode switch. Is the act of switching it to "move" a decision to murder the "input you" (or commit suicide, if you're operating the device yourself)?
EDIT:
Downstream from that come the legal questions like: does the person leaving the transporter at the destination have rights to your identity, accounts, assets? Are they married to your spouse? Does this depend on whether the switch on the transporter was in the "move" or "copy" position? If so, why? And how does this square with ethics?
No, that switch, even if it's only hypothetical, completely resolves the paradox, as I discuss above. You destroy your body, you die. Someone else does it for you - they murder you. There is no paradox. Sure, it may give a warm fuzzy feeling that someone else with your background will continue to live, taking your memories with them, but you press the button and your light goes out.
Well, I'd maintain that your memories 'live' on, but you're no longer there to enjoy them, that's someone else. If that's enough for you, great for you. I personally enjoy having experiences myself.
> What does that even mean?
What does what even mean? To stop existing? Is that a serious question?
I don't understand your point. Are we 'same' (your first sentence) or are we not 'same' (your second sentence)?
My copy and I are different persons because my copy has its own mind in its own brain in its own body. Just like you and I. Why would there be any more reason to think my copy and I are the same self, than to think you and I are the same self?
I read it like this. Alice gets in the machine and creates ten new copies of her. Each of them thinks they are Alice. Are they all right? If so what does the concept of 'me' or 'self' actually mean? Are you a unique individual? Do you have (or are you) a soul or something that exists independently of the arrangement of atoms that makes up your body? Or is your identity and sense of self completely the result of the way the neurons between your ears are connected?
I think this kind of question is particularly important to people brought up with strong religious beliefs.
If we confirm that the copies are ‘perfect’ (which is implied when we think about teleportation), then every entity has every right to consider themself ‘the original’, and at t=t0 they are indeed by definition indistinguishable. They all have a self, but of course it has to be separate from the original. The only alternatives are that they develop no self, then by definition they are not proper copies, or they would all share a single mind which makes no sense whatsoever.
If I install the same piece of software on ten identical computers, they will all think they’re the same, because they have the same serial number, hash, etc. but obviously they’re different instances.
I see that accepting this could be problematic if you think you need some divine spark (how are twins reconciled? They’re one individual split in two, after conception). However, I still don’t see a paradox.
There's a great pair of novels by Peter F Hamilton, Pandora's Star and Judas Unchained. In them, there's an alien, MorningLightMountain, that expands by creating copies of itself. If it can't maintain the neural link to those copies, they diverge. A problem that in its early spaceflight experiments led to disastrous consequences.
More on-topic, how is teleportation different to losing consciousness? You "know" you're the same person that went to sleep, but how do you know? Memory. The copies will have the same memories, and thus think they are the original. (as illustrated by some big blue guys in the great and gory animated show Invincible) You don't consider going to sleep to be "dying".
Likewise, if you were to get teleported, you'd be fine, as long as the original was destroyed. As long as there's only one "me" as defined by the sum of my memories existing in a single body, then we're OK.
Your consciousness example only works backwards: you can create ten copies of yourself, and they all will have a merry good time and think they're you. But how would that affect the conscious you? Not at all.
You can give your car (your memories) to someone else, but that doesn't imply that you're the one to enjoy it/them.
How would your model work if you created ten copies? Would you be a hive mind? Now you teleport one of those, how many people do you have to kill before your postulated magical mind jump occurs? And how does it know where to jump to?
The simplest answer, that resolves all paradoxes is: it doesn't have to because it doesn't jump.
// you can create ten copies of yourself, and they all will have a merry good time and think they're you. But how would that affect the conscious you? Not at all.
I beg to differ. I think I'd have quite a hangover, a sense of impending doom and a sense of transgression against god, and I'd be terrified about where my clones are and what is happening to them.
Maybe atheists aren't bothered at all. Maybe they've only convinced themselves that they aren't.
If I go out and black out and embarrass myself but don't remember what happened, according to you and other posters I would feel no anxiety or remorse. I think that would be true if I had a social circle that made me feel fine, but sans that I would probably worry!
The question then is why are you "actually you" whereas the other people "only THINK they're you (but really aren't)"?
They look the same, have the same fingerprint, have the same genetics, the same behaviour, the same memories; they all remember being the original and stepping into the copy machine. Your distinction that you are more special and more genuinely "you" than "they" are has no basis to it. Yes, it very much does imply that you're the one to enjoy the memories, because you are the memories - not through any telepathy or soul transference, but because the copies are you.
No that's not the question. The question is who do you think you are, that's the only thing that matters in the transporter.
I don't need to be special for this argument to work. I could be the tenth or one millionth nautilus for this. You can make further copies and create new instances, but that doesn't change who you perceive as your 'self'. If I'm the 'original', I will perceive the 'original' as myself; if I am No. 10000, I will perceive No. 10000 as myself.
Also, it absolutely does not imply that only the original can enjoy the memories, they all have them (as you accurately write), so they all will enjoy them.
I think you are going down the route of "will they be a hive mind", whereas I am going down the route of "the definition of what it means to be 'you' will have to change". You say "Also, it absolutely does not imply that only the original can enjoy the memories, they all have them (as you accurately write), so they all will enjoy them." but if everyone has the same memories, aren't they all the original enjoying the memories? The concept of "original" only applies out here from our discussion. Inside the scenario there is no meaningful distinction between original and not.
If "you" are "the original" then stepping into the teletransporter and being disassembled killed "you". "you" died. If "you" are the clone, then stepping into the teletransporter moved you. You didn't die. If the original and the clone both walk out of the machine, the single lifeline branches into two, and English and law and conversation don't have common words to describe what happens next.
To me the problem is not "do the two people telepathically have the same thoughts"; the problem is that, from the viewpoint of the original, I walk into a teleporter and walk out of it. Someone on Mars allegedly exists, but I have no knowledge of their experience, so they can't be me. From the viewpoint of the clone, I walk into the transporter and arrive on Mars. Someone on Earth allegedly still exists, but I have no experience of it, so they can't be me. I set out to travel to Mars, arrived, and so I clearly am the original. Both situations are individually coherent, but they shatter my expectation that "I" will have a single future.
Both people think they are the original. In the case of this hypothetical, the clone is identical in every way so even God pretends not to be able to tell the difference.
It's language, society, law, convention which breaks down. Which one gets to sleep with nautilus' partner? Does the partner have any reason to prefer one or the other? Does the partner have any right to accept one nautilus and spurn the others? If the transported are different people, then a marriage contract applies only to the original. If they are all the original and all the same person, what happens?
To me it is an argument about word definition, and about subjective experience, not about spooky action at a distance and soul transference.
My personal conclusion is that I need to set my thoughts up pre-branching so that I would be equally happy to walk out of the transporter on Earth or on Mars, because I will do both. See also stories by Greg Egan exploring this theme, like the guy who was mind-uploading himself into a computer, but his thought process was always that the person outside was "really him" and so when he woke up inside the simulation he was bitter that he was an inferior copy and the original was "free" outside. Or the one where the characters had a machine (jewel or dual) implanted into their skulls at birth which cloned their brain patterns into itself as they grew up, then the brain was removed and the jewel carried on so they could be immortal. People who always identified as the meat brain felt like brain removal was death and an imposter was taking over. People who identified as the dual, merely "learning" from the meat brain felt like it was no problem. From the outside, nothing measurably changed before/after the removal of the meat brain.
I make no fundamental distinction between an original and a copy; maybe that nomenclature is misleading. I use “original” for the entity to be copied; that entity could be the tenth copy of a copy of a copy. It makes no difference. There is nothing special about the zeroth member of that line, except that that person accrued the memories that are then copied and experienced in other entities as their own. And why wouldn’t they, they’re copies.
So in your thought experiment, why (and how?) would your “self” be transported after copying? Why would you assume that? And what happens to the “original self”? You describe the experience of stepping into the teleporter and then stepping out, but this experience would require some magic to transfer the self. The much more plausible solution is that you step into a teleporter, you die, and a copy of you continues your life happily on Mars, but you have nothing to do with it. What would happen if you created several copies? Your model has no explanation for that. The only solution for your model is then the hive mind; the much more consistent answer is that each copy has their own mind, and does indeed think they're the original.
Your two paragraphs are confusing me. If you make no fundamental distinction between Earth nautilus and Mars nautilus, doesn't that make them "the same person"? How can you say both "nautilus died" and "nautilus goes on living, but nautilus has nothing to do with it"? If one is alive and one is dead, that seems like a fundamental distinction, doesn't it? If the person on Mars is nautilus, then you have everything to do with that life, don't you?
I agree with your first paragraph: I don't think there is a "self" separate from the arrangement of matter and energy, no magic is needed, there is no hive mind, body on Earth cannot see through Martian's eyes. I agree with the end of your second paragraph, a consistent answer is that each copy has their own mind and thinks they are the original, but moreso - each copy is the original. Then, I must disagree with the middle bit - you step out on Mars and you have everything to do with that life. A copy of nautilus and nautilus himself step out on Mars, no fundamental distinction between them. Not in a hive-mind way, just as a necessary conclusion of your position.
No they're not the same person, all entities are individuals, all have their own mind and self. The hardware is the same, though (at least at t=t0). But the hardware is genetically identical between twins as well, without making them the same person.
Your second paragraph makes no sense: if each copy has their own mind, how can they simultaneously have your mind? And then you just wave away the necessary implications of that (hive mind). How else could it be in your model?
They're either separate entities, or they're not. If they are, the copy is a different person, and the Mars nautilus self is different from the Earth nautilus self, running on the identically copied hardware. This position resolves all possible contradictions.
How would you step out on Mars without magic to replace the self of both Earth-you and Mars-you? How do you resolve just making several copies?
> "No they're not the same person, all entities are individuals"
So which one owes the bank the next mortgage payment? If they're not the same legal person, then one of them doesn't. Twins have different names, different birth records, different memories, different legal and social identities. If the twins are honest people, they won't pretend to be each other and will self-separate to clear this up. Both Nautiluses, if they are honest people, will be adamant that they are Nautilus and will try to use their genetic and biometric and memory records to prove it; neither will want to be the discarded one who must change their name, and neither will have any priority. At t=t0 they are, from the outside, two separate bodies who are the same person.
> "if each copy has their own mind, how can they simultaneously have your mind?"
If they are hardware identical, memories identical, who else's mind could they possibly have (at t=t0)? You copy OpenOffice onto two computers; they both simultaneously have OpenOffice. One doesn't have different software just because future changes on one will not magically happen on the other. If brain and hardware and memories are not "what it means to be Nautilus", then what does? What else is there? The copy's internal experience will not be the experience of being a copy; it will be the experience of being nautilus, with the same continuity of life experience that you have right now.
> "They're either separate entities, or they're not. If they are, the copy is a different person, and the Mars nautilius self is different from Earth nautilius self, running on the identically copied hardware. This position resolves all possible contradictions."
If they are identical in all ways at (t=t0), what specific difference are you pointing to when you say "different person"? And before you say "location", if I move location I don't become a different person. If they are identical then it shouldn't matter if you switch that sentence around and say the Martian is the real one and the Earth one is the "different person". Since you earlier said "you die, someone else goes on living on Mars and you have no knowledge of it" I think you wouldn't be as happy with that phrasing. The weirdness is that the viewpoint "I stay on Earth, a stranger appears on Mars" and "I arrive on Mars, who cares what happens on Earth" are both valid, both people experience actually being nautilus as validly as each other and neither has any better claim to the one true continuity of experience.
>The copies will have the same memories, and thus think they are the original
I don't determine whether I am me by cataloging my observable characteristics.
If I was only the sum of my observable characteristics, there wouldn't be any "me" to evaluate them.
Also, I am not aware of all my memories in one timeless moment. Remembering things is a process. e.g. Proust and his madeleines.
People can be, sometimes are, disoriented when they wake up. Part of it can be remnants of dreams, part of it is just that sometimes threads of memory need to be gathered somehow.
"Paradox: A person, thing, or situation that exhibits inexplicable or contradictory aspects". Depending on how you view it, they are the same person or they aren't the same person, which seems like a situation exhibiting contradictory aspects.
Very few people habitually think in the way "you went on a 12 hour flight, you are older, more radiation damaged, have different memories than before you set off, so you are a different person". Many people think "you went on a 6 month expedition, you had mind opening experiences, it made you a different person", but don't go on to conclude "therefore you no longer have rights to the original person's bank account and possessions and citizenship".
I know what a paradox is, and your application falls quite a bit short from the very definition you give. 'Depending on how you view it', a rainbow could be caused by light refraction or by a farting unicorn. This statement doesn't make a rainbow a 'paradox', there is nothing inexplicable about it.
That you observe two different behaviours of light, one like a particle, one like a wave, and they clash and cannot both be true could be paradoxical. That one person says a rainbow is caused by farting unicorns, is not paradoxical.
We have agreed ideas on what it means "to be" someone, matching genetic information at the scene of a crime with a suspect says the suspect was at the crime scene. We also have agreed ideas that people cannot be in the same place at the same time, so if two people are in different places at the same time they must be different people.
The teletransporter makes it possible to have two people in different places (must be different people) who are also "the same" person by any other test we have. They can't be both the same and different, but paradoxically they must be.
But you just declare that my rainbow is not paradoxical; your argument for the teleporter provides no more logic than that. Here, I also declare “there is no teleporter paradox”. Are you convinced? A mere contradiction does not imply a paradox.
Other than that, we already have the exact situation you describe: a twin at the crime scene will leave the exact same genetic information. Does that mean there is a paradox? No. Does that mean it has to be the same person? No. It’s two different conscious entities that happen to have the same DNA. Just like a copied person. No problem, no contradiction, no paradox.
If a unicorn farted to create rainbows, my ability to see rainbows could still be based on their refracting of light. Both could be true simultaneously and not contradict each other.
If the criminal is honest and wants to admit their crime, the twins scenario will result in one denying the crime and one admitting it, correctly. If you imprison them both, it would be unjust for one of them. If you imprison one of them and the other goes free, no injustice is done. If one human did the shooting, then stepped into the teleporter, and now two humans are honestly admitting to it, what do you do? If you imprison only one of them - one shooter, one person in prison - is that justice? If so, which one do you imprison and why? The other one experiences committing a shooting, admitting to it, and walking free unpunished; that doesn't seem right. If you imprison them both, then you must at some level agree they are both the same person who pulled the trigger, yes? If you don't agree they are both the same person who pulled the trigger and both need punishing, which one isn't, and what's the difference about that one?
It's a funky version of relativity. From the perspective of the original consciousness, life was terminated as soon as the machine started up; from the perspective of the duplicate on the other end, you continued without any issues. Make a hundred copies, and each thinks of itself as the original.
Of course if you instead build a transporter that uses some funky form of a wormhole, you simply move from one frame to another. In that case you are still the original.
Maybe it's not so much a paradox as a headache inducer.
And perhaps we "die" every moment, even without entering a teleporter, the consciousness existing a moment at a time but with memories of the previous consciousnesses.
Because there's a principle of quantum mechanics called the "no-cloning theorem"[0], which results from not being able to determine the exact quantum states of particles. If you can't determine them, you can't clone them and you can't teleport them. With the essential caveat that you can use quantum entanglement to transmit the information from one particle to another, thereby destroying the original quantum state of the first particle. So the mechanism of perfect teleportation requires destroying the quantum state of the original.
(As far as I know this doesn't really cover whether or not we can do "good enough" cloning/teleportation, meaning, good enough to preserve all our bodily functions and memories, even if not perfect at a quantum level.)
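For reference, the standard no-cloning argument is short enough to sketch in bra-ket notation (this says nothing about whether quantum-exact state even matters for minds):

    % Suppose a unitary U clones arbitrary states: U|psi>|0> = |psi>|psi> for all |psi>.
    % Unitaries preserve inner products, so for any two states |psi>, |phi>:
    \langle\phi|\psi\rangle
      = (\langle\phi|\langle 0|)\,(|\psi\rangle|0\rangle)
      = (\langle\phi|\langle 0|)\,U^{\dagger}U\,(|\psi\rangle|0\rangle)
      = (\langle\phi|\langle\phi|)\,(|\psi\rangle|\psi\rangle)
      = \langle\phi|\psi\rangle^{2}
    % Hence \langle\phi|\psi\rangle \in \{0,1\}: only identical or orthogonal states
    % can be cloned, never an arbitrary unknown state.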
First, really this is a philosophical rather than a technical paradox.
Second, how do you know the requirements of a teleporting machine that doesn’t exist? Seems like a cheap way out of the discussion to simply declare that the technical implementation will have certain properties that make the discussion simple.
Maybe the resolution you mention is not necessary. I can obtain all relevant information in a printed text by taking a picture, the exact quantum state is not necessary at all. Also: clones are already created routinely.
It might be impossible to obtain enough information about the original without destroying it, e.g. by instant freezing and then removing and identifying the molecules one by one.
However once you have the complete information, it should be possible to make any number of copies, all of them starting with identical memories until the moment when the original was destroyed.
This is a philosophical, not a technical 'paradox'.
In your model, why would the ability to make copies, based on the complete information you have successfully gathered, stop the instant you destroy the original?
I have already said that once the information is obtained destructively, making any number of copies should be possible.
All the copies would start with the same memories the original had before being destroyed, but after that they would all live different experiences.
I have replied to the question why the destroying of the original might be necessary.
With the currently existing techniques we can imagine how complete information may be obtained destructively.
We do not know any method that could allow us to obtain complete information about a body non-destructively.
All the known imaging and spectroscopic methods can determine partial information about the internal composition of a solid body but even when using all of them there is no chance of obtaining complete information.
Your sentence “it should be possible to make any number of copies, all of them starting with identical memories until the moment when the original was destroyed.” is not unique, but now I understand what you want to say.
When you reboot a computer, is it the same program running or is it a new program that simply has access to the previous program's stored memory?
When you wake up in the morning, are you the same person who went to sleep, or merely a person with access to that past person's memories?
Indeed, besides your memory, which could just as easily have been created from a copy of you, is there any evidence of your conscious existence before this infinitesimal moment? Who is to say that a new iteration of you isn't created with every cascade of firing neurons and immediately destroyed with the next?
If sleep is used to retrain your neural network with data accumulated during the day and to clear this short-term memory, then you're clearly a new person every day.
But assuming the new data set is consistent with previous experiences, then you are probably converging to some local maximum or oscillating around some point, and you change less and less every night as you get older.
In that sense, decompensation or PTSD would be symptoms of retraining with inconsistent data and having the network diverge.
Indeed. Some may realize they are more than the sum of their parts. Not a popular position here on HN where most find us comparable to CPU memory architecture.
The assumption is that all memory is stored in the brain and body, for where else could it be stored?
The materialists do not believe in a ghost in the machine. /anecdata
It means if you take us apart and put us back together again, the ghost will have left. It will be a huge disappointment for cryonics.
If people want longer lives the body will have to be kept alive in an induced coma state and heart rate slowed. Or use some novel regenerative therapy.
Otherwise that machine (body) is never coming back to life once its ghost has left.
It means an entirely different type of neuroscience that most scientists are uncomfortable with, having to do with consciousness.
In my current understanding, after studying material from some "consciousness explorers (Waldo Vieira, Joshua David Stone, Ingo Swann, George King, Edgar Cayce, Jane Roberts, Dolores Cannon, Robert Monroe, and others)", consciousness is a multilayer and very very complex system.
Some of those layers reside in the brain, some exist independently from the brain and remain for eternity even after the brain shuts down.
In that sense there is no paradox at all. The machine would only be able to copy a few "layers" of consciousness. The resulting copy of you would be an empty shell because the "master layers" can't be copied, as they're made of "data" that isn't "finite".
Just my view. This is not absolutely verified by Science, and for me, it will take a long time to happen. Even then, there are some scientific experiments that validate, in part, this view: Controlled astral projection experiments, controlled remote viewing experiments etc.
My favorite thought experiment when it comes to consciousness is: if you could replace your brain’s cells, one at a time, with nanomachines that mimic the function of the original cell, at what point do you die?
Or do you stay conscious, and stay you, but with an artificial brain?
> if you could replace your brain’s cells, one at a time, with nanomachines that mimic the function of the original cell,
I enjoy thought experiments, and have thought about this myself. And have probably read some good science fiction based around the premise. My current position is that a nanomachine which replaces a neuron or glial cell and behaves the same way that cell would is essentially magical thinking.
It’s the everlasting gobstopper of consciousness. The gobstopper faces the issue of conservation of energy: how can it continue to have flavor and not get any smaller? The nanotech faces an analogous issue of “continuation of systemic process (as opposed to divergence)” that I don’t have enough clarity to describe yet. This makes my position weak, and I accept that. I think I am saying “if consciousness is a property of the body system, then introducing pre-built components that originated outside of the system will change it in unanticipated ways.”
I’ll have to give it more thought if we start to create artificial replacements for simpler organs (heart, lungs, etc.) that work better than the originals. I’d like to live in that world and be faced with the question.
I think the first person "you" survives in this case but not in the case of the teleporter. Putting aside the definition of "you" and how "you" can be affected by external stimuli, there's this continuity that only belongs to the "first person" you. This continuity is impossible to prove or convey even for the "first person" themselves, as their understanding of continuity relies on memories, which can be perfectly replicated.
In the end, when you step into the teleporter, you'll come out from the other side. The incomprehensible, unconveyable tragedy happened to the only person that could mourn its victim. The rest of the world carries on as usual.
That paradox implies that it's possible (even theoretically) to "records your molecular composition" and then "re-creates you ... each atom in exactly the same relative position". However, my understanding is that such things are not possible because of the quantum uncertainty principle - one cannot simply discard an atom's momentum, as that would kill the person being recorded, IMHO.
Not really. The issue is that you aren't the same from one moment to the next in the natural world either. You constantly change; large parts of your body even get rebuilt regularly. So demanding 'quantum level perfection' of a copy just doesn't mesh with the ever-changing mess that is the real you.
As long as the copy is 'good enough', which can be far away from the uncertainty principle, it really shouldn't make a difference to the thought experiment.
Given: If the technology exists to create “good-enough” copies of a human, the tech to create “good-enough” copies of a habitat should also exist.
Experiment:
1) Create X copies of a controlled habitat
2) Create X copies of N humans, where N > 1
3) Place human copies into habitat copies
4) Observe
What is the expected percentage that the humans will behave identically? For how long? It’s probably best to start with one human per habitat, but I think “more than one human” is the more interesting question.
I can’t find the name of the condition right now, but sometimes people fall into loops, where they will ask the same series of questions and give the same responses each time until they recover. It happened to my grandmother once after a mini-stroke, and there’s a Radiolab episode about it which I also can’t find. So that lends credibility to the hypothesis that if the environments are “good enough,” there will be no divergence. I’m curious to know how long that could possibly last.
> I can’t find the name of the condition right now, but sometimes people fall into loops, where they will ask the same series of questions and give the same responses each time until they recover
Anecdotally, my mother did that sometimes during the mid-to-late stages of her Alzheimer’s (before we had to send her to a care home, but after she was failing the Clock test).
My one experience of edible cannabis (in retrospect I don’t think it was just cannabis), my partner at the time said I was repeating the same sentences. Not that I know for certain: myself, I had the subjective feeling of all working memories going direct into long-term memory without touching short-term memory.
More precisely, the “no-cloning theorem” of quantum mechanics implies that one cannot replicate the exact quantum state of a clump of matter in a second clump of matter (although it is possible to “teleport” the exact quantum state by transmitting its information, provided one is willing to mess up the original).
However, although no one knows for sure, it seems likely that the operations of consciousness are purely “classical” in the sense that the exact quantum state is not needed (Roger Penrose’s ideas notwithstanding).
Turns out that while the no cloning theorem does prohibit cloning, it does not prohibit perfect replication as long as the original is destroyed. This is the premise of quantum teleportation. Using some clever applications of entanglement it is possible to perfectly replicate the quantum state of a particle in a different location, it's just that the original particle is necessarily destroyed in the process.
So, in short the no cloning theorem does not prevent quantum teleportation.
In this sense, the follow-up in the paradox where you create 10 copies all of which are identical would not work. Quantum mechanics effectively ensures that there's ever only at most one of you.
The uncertainty principle is deeper than that. Not only is it impossible to perfectly measure both momentum and position, but they are not both simultaneously well-defined even in perfect conditions — the uncertainty in each is itself a real property.
I would also be somewhat surprised if the most detailed scan possible under QM is even necessary for a functionally-perfect copy of consciousness: The perceptron model used in artificial neural nets is a mere toy in comparison to a real neurone, yet very effective at duplicating interesting behaviours.
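For a sense of how toy-like it is, the whole perceptron fits in a few lines of TypeScript (a minimal sketch, with weights picked by hand rather than learned):

    // A single perceptron "neuron": weighted sum of inputs plus a bias, then a step function.
    function perceptron(inputs: number[], weights: number[], bias: number): number {
      const sum = inputs.reduce((acc, x, i) => acc + x * weights[i], bias);
      return sum > 0 ? 1 : 0;
    }

    // Hand-picked weights that make it compute logical AND:
    console.log(perceptron([1, 1], [0.6, 0.6], -1)); // 1
    console.log(perceptron([0, 1], [0.6, 0.6], -1)); // 0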
> but they are not both simultaneously well-defined even in perfect conditions — the uncertainty in each is itself a real property.
This also implies trying to "dig deeper" doesn't matter - there's nothing to dig up. From this follows that either 1) "the most detailed scan possible under QM" is at least sufficient to capture the entirety of consciousness, or 2) there's something beyond QM we have no clue about, that makes uncertainty not fundamental, or 3) consciousness is magic and the universe makes no sense.
I find 2) unlikely, and 3) is religion (the "separate magisterium from science" kind).
The premise is already wrong. Any kind of copy mechanism will be imperfect. Even if we just consider classical physics (which is obviously a very coarse approximation), there is no mechanism that could scan every atom of your body at precisely the same time. Invariably, there will be an interval ("all atoms have been scanned between t0 and t1"). Now, relatively basic math tells us that the development of a complex dynamic system can change arbitrarily with small changes in the starting parameters. Maybe there could be an argument about the statistical behavior of the system (for instance, it is very unlikely that the copy would immediately lose an arm or a leg), but the state of mind would definitely not be identical.
If a body is frozen to low enough temperatures, then the movements of all atoms are reduced to vibrations around some fixed average positions.
After freezing, in vacuum, you could remove the atoms from known positions by shooting ions to them and then you could identify the removed atoms.
Of course this would be exceedingly difficult to do for a large body, because identifying the type and positions of all atoms would take a huge amount of time and because if the body is large it is difficult to freeze its deeper interior fast enough to prevent any damage.
Nevertheless, there is no theoretical reason for this to be impossible.
But you wouldn't measure the living body. You would measure a frozen (that is, dead) one.
And even then your level of precision wouldn't be exact but subject to errors. Regardless how small your measurement errors might be, there cannot be such a thing as an identical copy.
Einstein's elevator metaphor did not draw any of its important conclusions from an impossible construction. A spaceship under constant acceleration works just as well.
A thought experiment that creates a problem out of an impossible premise is just worthless as such. First of all, everything can follow out of a wrong premise, and secondly you can create an arbitrary amount of such experiments, preventing any insights.
Classically you should be able to measure any "information" present in the system. In general, a useful machine cannot depend on indistinguishable microstates. If you include quantum mechanics then you may run up against the no-cloning theorem, but that's a contentious question.
This also should work backwards, temporally, if you want to look at time linearly.
Parfit's extrapolation and commentary on morality are dubious; however, there are no citations on this, so maybe the author(s) are projecting?
That aside, it's a big responsibility either way.
From a societal perspective, the UFP has probably had a lot of time to structure rigid laws and clauses to protect its citizens. Starfleet has a non-interference directive (a little ironic), which suggests their perspective as a society has shifted to accommodate the tele-tech.
From a moral perspective Robert Angier can swing any direction for his murder or suicide whereas Sam Bell is most definitely a victim of murder.
But from a personal perspective, is Simon Jarrett still at the bottom of the ocean, his battery draining until the inevitable collapse? Or is he among the stars with new friends and a new sense of liberty?
I would argue both, but only if Jarrett is cognizant of his actions. If Jarrett understands that he is condemning himself, then both Jarrett and Jarrett share a personal responsibility to each other.
Perhaps Jarrett is envious that Jarrett gets to live a full and happy life, and perhaps Jarrett feels guilty about this, and the personal duality persists. Maybe Jarrett accepts the decision he made to send himself to space, but Jarrett is still guilty. Is Jarrett still responsible for leaving Jarrett to die?
"what matters" is almost entirely consistent, but establishes a negotiable dissonance where infinite Jarretts have a social bond within the same stream of consciousness.
I don't see any paradox here. The principle could be the same as in computer science: you can move data either by copying or by referencing. In a world that has teleportation (like MMO games, or the real world if we figure it out somehow), the principle should not be any different.
1. Moving by copy: create new data which looks exactly the same (a deep copy), and put the data at the desired new address. So teleporting by copy is just creating the exact same character at the new position, and then making the old one disappear.
2. Moving by reference: the data is the same; it's the pointer to the address we're updating. So teleporting by reference is like there's an external table of characters, and the world just references them with positions. Teleportation is then a process where the position pointing to the character was originally A; the teleportation creates a new position B that references the exact same character, and then removes position A.
In short, expect(character).toEqual(teleportedCharacter) should always be true, but expect(character).toBe(teleportedCharacter) should be true if it's referencing and false if it's copying.
Although it could be either way, I don't see any paradox in either case.
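A minimal TypeScript sketch of the two modes (the Character type and the world-as-a-Map are hypothetical, just to make the distinction concrete):

    interface Character { name: string; memories: string[]; }

    // 1. Move by copy: a new, structurally identical character appears at the
    //    destination, and the original is removed.
    function teleportByCopy(world: Map<string, Character>, from: string, to: string): void {
      const original = world.get(from)!;
      world.set(to, structuredClone(original)); // deep copy
      world.delete(from);                       // "make the old one disappear"
    }

    // 2. Move by reference: the very same character object is simply re-addressed.
    function teleportByReference(world: Map<string, Character>, from: string, to: string): void {
      const original = world.get(from)!;
      world.set(to, original); // same object, new position
      world.delete(from);
    }

    // Deep equality (toEqual) holds in both modes; identity (===, i.e. toBe)
    // holds only in the reference mode.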
Now let's say it's the player character, you implement the copy mechanism, and play your game.
You launch it, start the process, the code faithfully creates a copy of the player character, then you suddenly see a game over screen as if your character just died.
You had a bug: you created a new character for the player, but didn't update the gameplay code to reflect the change to the copied character.
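Something like this, in the same hypothetical terms (a sketch of the bug, not any real engine API):

    interface Character { alive: boolean; }

    // Whatever the game-over check watches:
    let playerCharacter: Character = { alive: true };

    function teleportPlayer(): Character {
      const copy: Character = { ...playerCharacter }; // faithful copy at the destination
      playerCharacter.alive = false;                  // the original is destroyed
      // BUG: we never reassign `playerCharacter = copy`,
      // so the gameplay code keeps tracking the destroyed original.
      return copy;
    }

    teleportPlayer();
    if (!playerCharacter.alive) {
      console.log("GAME OVER"); // ...which is why you see the game-over screen.
    }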
Not only do you not get into the same river twice, it's also always a different person from one moment to the next.
Granular time, by the way, is a concept that is itself just a tool for the brain to make the outside world easier to navigate.
The sense of temporally overlapping unity of consciousness is a useful trick of the body to think of itself as an acting subject in its model of the world.
The agent of the body's interests in it.
In the brain, there is no distinction between software and hardware.
If you build a machine whose software rewires the hardware and whose hardware rewrites its software in real time, under the influence of the outside world and some structures that are more or less hardwired over thousands of generations, you get closer to what the brain and mind do and are.
So, sorry no uploading of a somehow separate consciousness into a machine.
Even if possible, you would have two beings.
One still pure social/wetware. One, the 'copy' of the former, ended up in an artificial hell.
This whole thing of what's you and what's not you - we may be discovering the real McCoy at some point. Whether we are really a new person every morning, just reloaded with the old guy's stuff, or not. And surgeries, and waking up from them, maybe as a new person as well.
I'm not sure it will be of any importance, just like we *have* human rights and yet there are still millions of humans living out there without human rights.
>Parfit's conclusion is similar to David Hume's view and also to the view of the self in Buddhism
This comment is odd to me, because the idea that the self tracks similar physical bodies seems to me like the opposite of my intuitive idea of reincarnation (not that I'm a Buddhist).
If you are reincarnated, you are completely different and yet the same soul, right?
It's like a complete inverse of this idea some people have that interchangeable bodies share a soul.
You may have the same soul - for very approximately defined definitions of soul - but you certainly don't have the same identity.
The idea behind reincarnation seems to be that something persists, and is in some way definitive, but the only available definitions of that something are metaphysical and may not even be comprehensible in human terms.
Many kids appear to show relics of a former identity up to around the age of 4. But even if you assume there are no conventional and mundane explanations for this, there are possible metaphysical explanations that don't require reincarnation.
None of this is very useful scientifically.
All of which is tangential to the original point that we don't really know what identity is. We act as if we do because we only really have one option - which is that people's personalities tend to be associated with a body we can recognise.
In reality personalities can change drastically because of trauma and illnesses like Alzheimer's. So this is just a workable approximation.
In fact we don't even know our own identities. We have a consistent self-symbol but it's not really stable, and elements change dramatically over time. (Example - I like pretty much the same music I did as a teen. I know many people don't, so something that happens to be stable for me isn't stable for countless other people.)
People regularly surprise themselves with unexpected actions. And to a greater or lesser extent everyone has unrealistic beliefs about themselves and others.
So identity usually reduces to a set of observable and consistent behaviours. It should be a set of consistent emotional and mental states. We can't know the latter - not even with test equipment - so we assume the former somehow correlates.
>You may have the same soul - for very approximately defined definitions of soul - but you certainly don't have the same identity.
This is a contradiction in terms for me. I have no idea how soul and identity could be different.
We apparently think differently. Changing tastes in music or finding out one's response to a new situation are not inconsistencies or changes to identity, from my perspective.
I don't understand what you mean about "we can't know" emotional and mental states. I can't know your mental state. But it doesn't mean anything to say I can't know my mental state. Who is "I" then? How can there be a mental state of someone who doesn't exist? If a series of 1000 cars was built, what is the paint code for #1001?
I'm doubtful that actions or behaviors can really be inconsistent. I'd prefer to say that when observing someone else, the totality of their actions can be confusing to where I give up trying to model their mental state. But that doesn't mean there isn't one. And sort of, like, the physical world is defined as being consistent, isn't it?
Identity has to be based on physical processes, and therefore I think I'd even say that can't be inconsistent. In principle, there would be a set of numbers that would fully describe a person over time, and it would have to satisfy physical law, and that's inherently consistent.
What can be inconsistent, I think, is explanations for actions and properties of the self. I think it's good to focus on resolving the reasoning and explanations that are used to rationalize everything, but not to confuse it with one's actions and identity.
The Parfit version is one of the central ideas expressed in Douglas Hofstadter's "I Am A Strange Loop", which is a loose followup to "Godel, Escher, Bach" except centered around Hofstadter's ideas about consciousness and "life". I personally was a little cold on I Am A Strange Loop despite being a great lover of GEB, but I would recommend the book to others who liked GEB, and the Parfit short story stuck with me.
The Hoftstadter/Dennett book The Mind's I from 1981 is relevant in another way: the introduction jumps you into this exact scenario of finding yourself on Mars via a "Teleclone Mark IV Transporter" and getting beamed back before your instance on Mars dies, and raising these questions about identity. https://themindi.blogspot.com/2007/02/introduction.html
You are made of atoms. You drink some vodka and those atoms don’t just affect you, they become you. Then you piss them out. Well, you piss yourself out and flush yourself away.
Matter/energy transportation (Star Trek style) would likely kill you and then create a clone of you at the destination that would think it is you. Dimensional transport (Stargate style) is the only type of teletransportation that I would consider. It might also be easier just to tell people to watch The Prestige to understand this.
I surely don’t think Kirk has a great job.