this post was submitted on 25 Jul 2023
Asklemmy
The general idea is that a teleporter rips you apart and the atoms go to the destination to be reassembled in their previous state.
Whether or not it kills you is speculation. Arguably you're pretty dead if you're ripped apart atom by atom, and then a clone is assembled using the same parts.
But I don't think the question of whether the recreated "you" is a clone or not can be answered until people figure out what the mind even is.
Death is a state in which your biological functions cease. So no, it doesn't kill you, since you function properly after.
Is it me functioning or is it a clone?
How does it matter, with the exact same memories?
So you'd be fine with a scientist creating a perfect clone of you, and then killing you, letting the clone take your place?
If it had the same memories.
Yes, since I would still be alive and have no memories of being killed. There's no distinction between a perfect clone and me. Sorry if you don't like a "you" being only memories.
Only the killed body is dead. The clone is "you" too.
Then let me tell you that consciousness is based on memory. Memory copied => "you" copied; debate done.
Thank modern neuroscience for that.
Consciousness is not based on memory or else computers would be considered conscious.
And according to what you're saying, a clone with all of your memories would mean you have two points of view. I could take your clone into a different room and you'd be able to tell me what it sees. But it obviously wouldn't work like that, because your own sense of self would still be locked in your head and the clone would get its own sense of self, albeit one with the same memories.
What I meant is that memory plays a key role.
Consciousness is, simplified, a set of self-feeding loops over input and memory, with emotions and attention (the amygdala) as regulatory mechanisms.
And what we consider consciousness only exists because of short-term memory and our vast mental capabilities. Arguably, every higher animal has a sort of consciousness, just a far more limited one, and maybe a more limited set of regulators (memories), because of our societal nature.
No, the input is not shared between two beings, even if there are two of the same.
Exactly. But because he has the same body, the same memories, and the same feelings, he is you. That would change over time if the original you were not deconstructed, because the "you" of today is not the "you" of yesterday, thanks to memories, gene expression, yadda yadda.
There is no reason that what you describe should give rise to consciousness rather than just a biological artificial intelligence. The sense of self, the perspective that feels like me peering out through my eyes, is not explained by anything you said.
A copy of me does not equal me because we'd both have separate senses of self. Having copied memories does nothing to affect that.