Worm’s brain mapped and replicated digitally to control obstacle-avoiding robot.
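
A mapped connectome is, computationally, just a weighted graph, and “running” it means propagating activations from sensor neurons through interneurons to motor neurons. Here is a minimal Python sketch of that idea; the neuron names, weights, and threshold rule are invented for illustration and are not the real C. elegans wiring or the project’s actual code.

```python
# Minimal sketch of a connectome-driven controller: a fixed weighted
# graph propagates activity from a touch sensor toward motor neurons.
# All names, weights, and the threshold rule are illustrative only.

CONNECTOME = {
    "nose_touch": [("inter_1", 1.0)],   # sensory neuron -> interneuron
    "inter_1": [("motor_left", -0.8),   # inhibit the left motor...
                ("motor_right", 0.6)],  # ...excite the right -> turn away
}

THRESHOLD = 0.5  # a neuron fires when its summed input exceeds this


def step(fired):
    """Advance one tick: sum weighted input into each downstream neuron."""
    incoming = {}
    for neuron in fired:
        for target, weight in CONNECTOME.get(neuron, []):
            incoming[target] = incoming.get(target, 0.0) + weight
    return {n for n, total in incoming.items() if total > THRESHOLD}


# An obstacle activates the touch sensor; activity cascades to a motor.
active = {"nose_touch"}
while active:
    print(sorted(active))  # nose_touch -> inter_1 -> motor_right
    active = step(active)
```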

  • @LwL@lemmy.world · 3 months ago

    Think of an alternative scenario: not transportation but duplication. The original stays where it was, but a copy gets created elsewhere. To the copy, it will seem as if it got transported there. To the original, nothing will have happened.

    Now you kill the original.

    The only difference is the timing of ending the original.

    • @Warl0k3@lemmy.world · 3 months ago

      Ah, but it wouldn’t be a copy of the original. In a hypothetical Star Trek transporter accident that results in a duplicate, there would be an instant of creation where the dupe and the original would be truly identical - and then the question would be which of the two is ‘you’? They’d be experiencing identical things, so how could you tell them apart? What would even be the point? They’re identical; there is by definition no difference between them. Differences would only arise once the duplicate is exposed to different (for lack of a better term) ‘external stimuli’ than the original, like seeing the transporter room from a different angle or the sensation of suddenly and rapidly growing a goatee. Your perception wouldn’t continue with the duplicate, because your experience would differ from the duplicate’s (for example, you wouldn’t have mysteriously grown a goatee).

      If you destroyed the original and then made the duplicate, it would start at that moment of total equivalence, but there would be no deviation. There’d just be one version, identical to the original, moving forward through time. ‘You’ would just go on being you. Consciousness isn’t a ‘thing’ - it’s not magic, it’s just a weird state that arises in sufficiently complex systems. You and I and everyone else in this thread aren’t special; we’re just extremely densely packed networks that have the ability to refer to themselves as an abstract concept.

      It’s similar to the classic question, “But how do I know that what I see as the color green is what you see as the color green?” The answer is that the “color green” we see isn’t real: ‘green’ is just a nerve impulse that hits a network. Each photoreceptor just sends a signal. If we were computers, the world would be represented as an array of values, and the question becomes the much clearer “How do I know that what I see as G_101.34 is what you see as G_101.34?”, which just isn’t as punchy.
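
      To make that concrete, here is a tiny Python sketch in which “green” is nothing but a position in an array of numbers (the 101.34 is borrowed from the question above; everything else is illustrative). Two observers handed the same array hold identical values, and the data contains no further fact about what it “looks like”.

      ```python
      # "Green" as a machine sees it: a position in an array of numbers.
      pixel = (12.0, 101.34, 7.0)  # hypothetical (R, G, B) intensities

      def green(rgb):
          """The 'color green' reduces to reading index 1."""
          return rgb[1]

      # Identical inputs yield identical values; nothing else is encoded.
      observer_a = green(pixel)
      observer_b = green(pixel)
      print(f"G_{observer_a:.2f} == G_{observer_b:.2f}:", observer_a == observer_b)
      # -> G_101.34 == G_101.34: True
      ```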