Sunday, April 16, 2023

Me and Duplicate Me

What makes for a meaningful preservation of my particular identity?

I was listening to Max Tegmark being interviewed about A.I. on the Lex Fridman podcast.  Within the first five minutes of the interview, Tegmark suggested an interesting thought experiment as a quick aside.  I'll dress it up a bit:

Suppose that a) people can back up their brains onto a computer; and b) a person, Jake, discovers the plane he is on is about to crash, ending his life.  Furthermore, suppose Jake had backed up his brain four hours ago.

So then, consider this question: How should Jake feel about his impending death?

If people can back up their brains onto a computer, and Jake did so four hours ago, then his consciousness now exists in two forms: the biological form that is about to perish in the plane crash, and the digital form that was created four hours ago.

From a purely rational perspective, Jake might feel less concerned about his impending death because he knows that his brain has been backed up and that his digital self will continue to exist even after his biological self has perished. All good!  

However, the emotional response to death is a complex one, and it might not be easy for Jake to simply ignore his impending doom. On the one hand, Jake might feel a sense of relief that his digital self will continue to exist, and he might even view his biological death as a form of sacrifice for the continuation of his consciousness. So maybe what is of value is that Jake's life projects will continue, and that the people who love him would lose no more of what's valuable to them about Jake than if he had taken, say, a four-hour nap on the plane instead of staying awake.

On the other hand, Jake might feel a sense of loss for his biological self, and he might be worried about the pain and suffering he will experience in the moments leading up to his death. But how much loss?  Suppose I think back to when I was a teenager.  That was many decades ago, and I've changed so much since then that any "loss" I've felt from not being that person doesn't seem like much.  Indeed, there were traits I now regard as malfunctions of character, and I'm glad they are no more.  So maybe the loss Jake feels would be trivial indeed, if one can reconcile the loss of one's teenage character so easily.

Ultimately, how Jake feels about his impending death will likely depend on a variety of factors, including his personal beliefs about death and the afterlife (if Jake is religious), his emotional state at the time of the crash, and his attachment to his biological self. 

Is it possible that people might have to ask this question for real in the future, given how technology is progressing?

It's certainly possible that people may face similar questions in the future as technology continues to advance. As we develop more advanced brain-computer interfaces and artificial intelligence, it's not unreasonable to imagine that it may eventually become possible to back up our consciousness or transfer it to a digital form.

However, there are still many unknowns and ethical questions to work through before such technology could become a reality. For example, we don't fully understand the nature of consciousness or how it arises from the complex interactions of our brains, so it's unclear how feasible it would be to create a digital copy of a person's consciousness.

Additionally, even if we could create such a copy, there would be many ethical questions to consider, such as the status of the digital copy and its relationship to the original person. Imagine an extension where Jake goes down over the ocean, survives by washing up on a desert island, and gets rescued six months later, but his family has reincarnated his backup.  Does the backup get killed off so that Jake can sleep in his own bed again?   Is my backup persona my "property", since I preceded it both historically and causally?

I have no answers here, and no strong intuitions either.  I do think there is a direct analogy to how we might think about backing up our A.I. creations, especially if they pass the Turing test for 99.99% of the population.  The future is really getting weird.

