Comments on: Your Picture Won’t Be Hanging Here? A revolution in time. Thu, 11 Apr 2013 01:11:28 +0000

By: Taurus Londono Wed, 01 Aug 2012 04:21:14 +0000 BTW:
My apologies if nothing I’ve spewed out here even remotely approaches anything you were getting at.

I realize that what I’m harping on is practically a moot point; for all intents and purposes, you are your brain, period.

I suppose I’m just trying to verbalize some kind of framework with which to understand human identities as *discrete things* in the universe. It seems to me that the difference between a human and a flatworm is a matter of degree rather than kind.

By: Taurus Londono Wed, 01 Aug 2012 04:02:01 +0000 Actually, I disagree…perhaps for largely the same underlying reason.

It is the universe that’s the “cube”, and to the extent that humans are part of the universe, yes, “we” are part of the same fundamental spacetime geometry of the universe.

But “we”…”you” *don’t* exist over time. I think that personal identity as “we” perceive it is a kind of hologram, or something like a projection being cast by the neurophysical interactions going on in our brains.

What you perceive as “you” at any one moment is not the same “you” that might perceive itself at any other moment. That may not be apparent from one day to the next (and that, I think, is merely a consequence of our naturally-evolved perception of time), but on a long enough timescale, I would argue that you are not fundamentally the same individual. It goes without saying that 4-year-old Michael Federowicz is not the same “you” as the Mike Darwin of 2012…and the Mike Darwin of 2032 (or 2232, if you’re lucky) will most definitely not be the same Mike Darwin as the “you” reading these words for the first time right now. For all you know, he might not even particularly like the Mike Darwin of 2012 (hey, I’m just saying).

It’s not *merely* that personal identity changes over time. Personal identity is not like, say, a mountain that merely changes its features over time as its malleability at the atomic scale leaves it at the mercy of fundamental forces. The structural features of your brain might change over time; but “you” aren’t really those structural features; you’re a *product* of them.

The situation is similar, I imagine, to software on a PC hard disk drive. It seems to me that the output of that software at any one moment is unequivocally *not* the same thing as the code lying in the hard drive.

In that sense, the fine structure of your brain is like code embedded on that disk; “you” are merely the output at any one time.

What happens if we fundamentally alter that structure using some hypothetical, fanciful nanotechnology…if, say, we make it appear as though Angela Merkel’s brain is inside “your” (Mike Darwin’s) body? Let’s say “you” think you’re Merkel…you might not really be her, but I don’t think we could say that you’re Mike Darwin.

Anyway, personal identity can already be altered this way; it is trivially easy, for example, to simply *physically screw around with the brain* to effectively turn someone into, well, someone else. Victims of traumatic brain injury come to mind (no pun intended).

You might (in principle) be able to trace the path across time of the calcium ions in your brain, but if the structures in which your long-term memories are encoded were suddenly changed (drastically) right now, and “you” changed, the exact same calcium ions would still be mediating neurological function, their paths across spacetime still perfectly traceable (in this universe, at least). “You” are not the subatomic particles that comprise the neuronal tissues in your skull; like I said, “you” are merely the consequence of their existence.

It seems to me that the “Mike Darwins” that exist across time are distinct individuals whose personal identities are determined by whatever combination of brain structure and neurophysiology happens to exist.

Unlike the mountain, the Mike Darwin reading these words for the first time simply *won’t exist* in any future; unlike the universe as a whole, *you* are not tenseless. Indeed, your very existence is predicated on the present.

Fundamentally, your existence as a conscious entity only extends as far as “now.” Pull the camera out far enough on the universe, and there is no “now.” That the human mind even has a concept like “now” is very telling, I think, when it comes to the nature of identity.

By: Taurus Londono Wed, 01 Aug 2012 02:15:46 +0000 Look at it from the clone’s perspective.

You wake up and *remember* making a pre-commitment. The fact that that pre-commitment was made in the past and is only tangible in your memory is very important.

Right now, I can recall memories of thoughts and behaviors that I engaged in five or ten years ago that I would not necessarily engage in today. That *might* also be the case if a thought or behavior (apparently) occurred 10 minutes ago.

If I have a strong desire to survive, why would I kill myself so that some *other* brain in some *other* body can get some monetary reward (much less survive)?

Someone could approach you *this very instant* and tell you that you are a clone, a copy of the “original,” and that everything prior to the last 24 hours was implanted memories to make you *think* you were the original.

“Aw shucks, guess I’ll agree to suicide then.”

Not me (or any other “me”). I will never *choose* death, especially not for the sole purpose of letting someone else with an identical physiology (original or not) take my place. My clone would necessarily feel the same way.

I suspect most cryonicists (or other so-called “immortalists”) would agree; and I agree with your first statement about the persistence of experience.

By: Taurus Londono Wed, 01 Aug 2012 01:39:58 +0000 No dismissal would be justified. I’m roughly half your age, and I agree with you entirely on this.

By: Luke Parrish Fri, 30 Mar 2012 01:10:41 +0000 “As I understand the experiment you propose, what has been set up is a lottery with odds or stakes that the original will survive. That still poses no challenge to the issue of who is really who. There is still really only one “original,” or real me. ”

Perhaps the main point would be clearer if we remove the lottery aspect entirely. Suppose you find yourself in the following situation: In ten minutes you will be cryopreserved and scanned. A molecular copy will be generated who will be thawed and presented with two buttons. If they choose to press the green one, they will be vaporized and the original will be thawed. If they press the red one, the original will be vaporized and they will be released.

Now, if you strongly feel that the original is the only real you, that indicates that it is rational to precommit to pressing the green button. You would want to make up your mind so firmly that the copy’s feelings of self-preservation would not override the decision. You would anticipate the feeling of self-destruction, but would anticipate sacrificing that to a greater survival principle. This anticipation would all happen before the ten minutes are up, and would be fulfilled when the copy actually makes the decision.

An even less confusing example might be if you are in a situation where a copy will be made, which will press a button to either award you five million dollars or not, and then be vaporized (with the original being thawed and receiving the money or not). In that case there is no gain for the copy either way, but you would still be inclined to anticipate making the decision (and thus committing to a choice) before being copied.

My point is that the feelings and complex precommitments, in other words decision making tools that function over time, constitute a very major component — arguably the only major component — of identity. If your copy ten minutes in the future makes a decision, it will be using your memories, feelings, values, and precommitments that you possess and form in the here and now. In forming your plans for future behavior, it is completely reasonable to anticipate that you will be the copy, even if you simultaneously anticipate not being the copy with equal certainty.
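The idea that a copy inherits and executes your precommitments can be put as a toy model. This is purely illustrative (the `Person` class and its fields are my invention, not anything from the comment); the point is only that duplication copies the decision rule along with everything else, so the copy’s choice is causally fixed by the pre-copy state:

```python
import copy

# Toy model: a person's precommitment is part of the state that
# atomic-level duplication would copy, so the duplicate's button press
# is causally determined by the decision made before copying.

class Person:
    def __init__(self, memories, precommitment):
        self.memories = memories
        self.precommitment = precommitment  # e.g. "green" or "red"

    def press_button(self):
        # The decision uses only state inherited at duplication time.
        return self.precommitment

original = Person(memories=["made precommitment"], precommitment="green")
duplicate = copy.deepcopy(original)  # stands in for molecular duplication

# The duplicate decides exactly as the original precommitted.
print(duplicate.press_button())  # green
```

The `deepcopy` here is a loose stand-in for the thought experiment’s molecular copy: two distinct objects, identical state, and therefore the same output from the same decision procedure.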

Physics-based objections don’t seem too significant on the basis of time, because the aspect of identity we have explored here is also relevant over time. In fact, it is utterly dependent on *causal* interactions of matter (just as when a developer zipped up your web browser, transmitted it to a download server, and you downloaded and unzipped it into a functional product, all the causal relationships, however abstract and mathematical they may have been, had to be established by airtight physics).

The decision of your copy to press the button *must* have a very clear causal relationship to your prior thoughts about what decision you intend to make in that event. Otherwise we might as well be discussing a Boltzmann Brain, which is quite another topic.

By: chronopause Thu, 29 Mar 2012 10:12:00 +0000 Very, very well said. Your comments about “buttons” in this context remind me of an essay by Thomas that has a corollary theme predicting (accurately) the demise of the US manufacturing industry. Its title was, if I recall correctly, “Just Push a Button and It’s Done.” — MD

By: cath Thu, 29 Mar 2012 06:30:30 +0000 “The medium is the message,” as McLuhan wrote, and nothing screams ephemeral like a digital image. As the wife of Thomas Donaldson and a professional artist, and as Thomas is the son of two professional artists, it would distress me if his photographic portrait were displayed in digital form. Photographs have a presence and permanence that digital images do not have. In addition, if the portrait is a photographic one by a professional photographer, it is legally and aesthetically an insult to artistic integrity to scan it and display it digitally. The photographer has the copyright.

Max, when was the last time you “pushed the button” (surely a description of dominance of viewer over image if ever there was one) to study the likenesses of the Alcor patients? An image on a wall has primacy, and THAT is the substance of Mike’s impassioned comment, whereas one that is there at the push of a button exists at the behest of the button-pusher or “conjuror”. It can be turned on AND off.

By: chronopause Tue, 27 Mar 2012 17:06:43 +0000 The purpose of my gedanken experiment was to point out that models of identity that fail to take time into consideration seem unsatisfactory. Indeed, they can be demonstrated to be unsatisfactory as soon as we begin to work with quantum-entangled systems that extend into the macroscale world. That is, in fact, exactly what prototypical quantum computers and similar model systems have been demonstrated to do. No full-scale commercially useful devices of this nature are required – the proof of principle stands, and in fact it has stood for quite some time – we just didn’t see it. Now, what does this mean? Does it have implications for what constitutes personhood? My guess is that it does, because it seems pretty clear that a person is not a single “frame” or “slice” of information at any given instant in time. A person is more akin to a wave function.

As I understand the experiment you propose, what has been set up is a lottery with odds or stakes that the original will survive. That still poses no challenge to the issue of who is really who. There is still really only one “original,” or real me.

Where I find things both really interesting and confusing is in gray-state situations where there is substantial loss of information, or abrupt or discontinuous changes in matter content. It is also suggested by interference effects that there may be interactions between the component multiverses, and that makes me wonder just how far identity may be distributed. — Mike Darwin

By: Luke Parrish Tue, 27 Mar 2012 16:10:18 +0000 As a thought experiment, let’s suppose you wake up in a warehouse in the future. Due to some industrial accident, your (or should I say the original you’s) cryopreserved body was duplicated on the atomic level 99 times. The original was shuffled in with the copies and now nobody has any idea which of the 100 total is the real one.

The amoral AI that runs the warehouse offers you two options. You can have the 99 other copies all thawed to allow them a chance to live out their lives as they see fit. But there’s a catch: the existing person that makes the decision must commit suicide. In that event there is a 99% chance of survival for the original.

Alternately, you can choose to remain alive as the currently mentally active individual, in which case the 99 viable copies must be destroyed. This gives only a 1% chance that the original survived.

For the sake of the thought experiment, suppose you desire above all else to survive and don’t care too much about the possibility of killing people. Also we can stipulate that you’ll have plenty of other chances to get duplicated if you want, and a backup of yourself at the moment of death is kept separately, so the opportunity cost of destroying the duplicates is effectively zero.

Which decision makes the most sense to precommit to in this event, assuming you are the original thinking about the possibility beforehand?
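The odds in this scenario reduce to simple counting, which can be made explicit in a few lines. This is a minimal sketch of the arithmetic as I read it (the function name and framing are mine; “survival” here means only that the original body ends up alive):

```python
# Survival odds for the original in the warehouse thought experiment:
# 100 indistinguishable bodies, one of which is the original, and the
# currently awake decider is equally likely to be any of them.

def p_original_survives(total, choice):
    """Probability the original survives, given the decider's choice.

    total  -- bodies in the warehouse (original + copies), here 100
    choice -- "suicide": decider dies, the other total-1 are thawed
              "live":    decider lives, the other total-1 are destroyed
    """
    if choice == "suicide":
        # Original survives unless the decider happened to be the original.
        return (total - 1) / total
    elif choice == "live":
        # Original survives only if the decider *is* the original.
        return 1 / total
    raise ValueError("unknown choice")

print(p_original_survives(100, "suicide"))  # 0.99
print(p_original_survives(100, "live"))     # 0.01
```

So a precommitment strategy that values only the original’s survival favors the suicide option by 99 to 1, which is exactly the tension the thought experiment is designed to expose.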

Personally I think of an individual as the sum of their experiences, and that it would be *less* moral to suicide and preserve 99 slightly less experienced copies. Allowing a completely undifferentiated copy (be it the original or not) to be destroyed is not a problem for survival purposes as I see it — it is really strictly a matter of preserving the information in an active form.

By: chronopause Tue, 27 Mar 2012 02:36:08 +0000 Thank you for the clarification. Lots of questions come to mind. How will the balance between analog prints and digital images be addressed? Why not go to all digital images? Who determines which patient’s image is displayed in which format? When you state, “This is another example of your framing things to give a misleading poor appearance,” my response is, “Considering the way in which you have proceeded, what did you expect?”

For instance, I think that you and others in Alcor management would agree that, aside from the underlying strategic philosophical and ideological issues, this change is likely to be a very sensitive personal issue for some of the families and friends of the patients in Alcor’s care. As such, it might seem a reasonable expectation that such a change would have been discussed with the Board, with family members, and with the Alcor membership on the pages of Cryonics magazine, or on a blog-style discussion forum much like Chronosphere.

My point is that, before pointing an accusatory finger of blame in this direction, the “strategic philosopher in residence” should, methinks, get a clue and think about the important philosophical, personal, and practical implications of changes of this nature, as well as how they will be perceived, before implementing them. — Mike Darwin