What if consciousness requires an active synapse loop to be maintained, kind of like a magnetic field with little hysteresis eddies etc., but one that disappears forever if it ever loses all power?
Hi @russfink. That can't be the case since, as mentioned in the debate, there are medical procedures [1] in which a person's neural activity ceases and the patient still recovers their (long-term) memories.
This doesn't resolve the philosophical issue, though. Is it the same person? Or did that thread of individual conscious perception disappear forever, only to be replaced by someone "new" who acts the same?
In the end, this is a philosophical discussion because consciousness is not something we can observe in a way that lets us test and falsify hypotheses.
Hm. The cessation of electrical activity is only what is measured on scalp EEG; i.e., we can only really be certain that the most superficial layers of cortex, which are what we are actually measuring, aren't firing.
Very fair point. There are other examples that go beyond DHCA, though, and they point clearly in the same direction. One is that people sometimes suffer cardiac arrest and lose consciousness for minutes to hours (doi: 10.1016/j.resuscitation.2014.06.015). Coordinated electrical activity stops within a few minutes of cardiac arrest, yet people can (rarely) be revived with their apparent memories and personality intact.
If a "backup" is made of your brain before you died, and it's restored afterwards to a new body, wouldn't that new copy be totally disconnected from the original. The brain in the new body might behave identically to the original, but the entity (or "soul", for lack of a better word) experiencing it won't be the original. This is trivially demonstrated by using the same backup process to make a clone (ie. restoring the backup but not killing the original). Clearly you won't be having the experiences of both bodies, would you?
If you believe the soul is a separate thing from the body, then it naturally follows that duplicating the body is merely a cargo-cult version of immortality.
If you believe that there is no separate thing which is a soul — that all that we are is just a particular arrangement of matter — then duplicating some particular arrangement from a backup is like forking from an old git commit: just because you can label one particular branch “master” doesn’t make it genuinely special, and any special treatment you give to one branch is merely your convention.
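To make the forking analogy concrete, here is a minimal, hypothetical sketch in Python (toy code, not real git internals; all names here are made up for illustration): a snapshot is just content, and a branch name is only a label pointing at it, so nothing about the content marks one label as the "original".

```python
# Toy, hypothetical model of the git-forking analogy (not real git internals):
# content is stored by its hash, and a branch name is just a label pointing
# at a snapshot.
import hashlib

store: dict[str, str] = {}

def commit(content: str) -> str:
    """Store content addressed by its hash; its identity is the content itself."""
    digest = hashlib.sha1(content.encode()).hexdigest()
    store[digest] = content
    return digest

snapshot = commit("a particular arrangement of matter")

# Two branches "forked" from the same snapshot: calling one of them "master"
# is only a naming convention; both labels resolve to byte-identical content.
branches = {"master": snapshot, "restored-from-backup": snapshot}
assert store[branches["master"]] == store[branches["restored-from-backup"]]
```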
> just because you can label one particular branch “master” doesn’t make it genuinely special, and any special treatment you give to one branch is merely your convention.
It does matter, depending on your motivations. If you think the reason for living is that you're a great person whose continued existence will benefit society and/or your loved ones, then sure, it doesn't matter as long as the copy is identical. If you simply want to stay alive and experience/enjoy more of it, then having a bit-for-bit identical copy might not achieve your goal.
For the latter option, consider the following thought experiment: suppose you want to do an enjoyable activity, such as going on a one-week vacation, but you can't afford it. Would you pay 50% of the cost (which you can afford) so that a clone of you could go on the vacation instead?
In many respects, I think you already break your stream of consciousness when you sleep or are knocked out. When a copy of your brain is made, that copy will experience what all of us do after a good night’s rest on a train: waking up to an unfamiliar environment, subtly changed by the random firing of your neurons while you were asleep. The “you before sleep” is effectively not the “you after sleep” because there was a break in the chain.
Thus, there’s no benefit to the “you that didn’t go on vacation”. There was a benefit to the “you that did go on vacation”.
So you acknowledge that in the event of a brain transfer, two versions of your consciousness will exist, and that the "original" one will be lost, but it's fine because neither would be able to tell (i.e. the original will be "going to sleep", and the copy will be "waking up")?
Exactly. I would put forth that you can take it even further and assume that every moment in time represents a recreation of your conscious experience of the past up until the present.
Under that perspective, consciousness is like the illusion of motion on a computer monitor. There's some sort of framerate, related to the minimum period of time in which you can perceive a conscious thought, and the sequence playing back at a high frequency creates the illusion of continuity.
> just because you can label one particular branch “master” doesn’t make it genuinely special, and any special treatment you give to one branch is merely your convention.
If you can have the Mona Lisa, and a copy of the Mona Lisa simultaneously, the copy is not the original. It is clear that if I burn the original, no harm comes to the copy, and vice versa. I don't think the Mona Lisa has a soul, though. I also don't think any correct arrangement of molecules becomes the Mona Lisa.
The comparison is imperfect because we do not have the technology to make copies with sufficiently high fidelity.
If someone made a copy of the Mona Lisa that was physically indistinguishable from the original (to any kind of sensing technology one could come up with), then I wouldn't mind much if one of them got burned; after all, we could mass-produce the artifact now. The aversion to loss is that a "one of a kind" thing gets destroyed, which is irreversible because being one of a kind implies there is no redundant information from which it can be recovered.
What if it was the "copy" you owned? I only have my own set of conscious experiences. My set is not the same as the clone's. When people seriously argue this, I start to wonder if they are bots or NPCs that somehow have no conscious experiences. It is clear to me that if I had a clone, I would be just as attached to the idea of this instance of me staying alive--in fact I could have 1000 clones in space right now. They would be cold comfort to me.
If the copy were perfect you would not be able to tell the difference. In essence there would be no "original" and "copy", just two instances of the same information. This is basic particle physics or information theory. Particles do not come tagged with UUIDs. Once you duplicate all properties, the original and the copy become interchangeable. At the macro level we only care about the difference because analog copying processes tend to be lossy.
> I only have my own set of conscious experiences. My set is not the same as the clone's.
It is at the moment of "cloning" (assuming that includes memory duplication). They may diverge afterwards, but at the moment the copy is made the sense of self is the same for both.
> When people seriously argue this, I start to wonder if they are bots or NPCs that somehow have no conscious experiences.
Perhaps some people overrate this near-magical concept of consciousness? It temporarily ceases when you sleep or are comatose anyway, while your biochemical processes keep shuffling atoms around, and that's not the end of the world either. The body ship-of-theseuses itself all the time.
They do though. Particles have a unique path through time and space that cannot be replicated.
I don't care who "thinks" they are me--that has no effect on their me-ness.
> Perhaps some people overrate this near-magical concept of consciousness?
This is the point at which I should assume I'm actually debating with GPT-3 or an NPC.
Hypothetically, if a gunman has you, and a clone of you in a room and is going to kill one of you, are you saying you have no preference which dies? If these things are the same, how is it he could kill the clone in a different room and you wouldn't know? Doesn't that mean they aren't "the same", and that really you are an instance of an object, not a class?
I think if you consider the Ship of Theseus from the point of view of a passenger, it becomes clear that it is the "same" ship in a more meaningful sense even though it changed while you were on it. Similarly, I still arrive even if ships identical to the original state sink while I am en route.
> Hypothetically, if a gunman has you, and a clone of you in a room and is going to kill one of you, are you saying you have no preference which dies?
“Of course I have a preference!” I said at the same moment my completely perfect identical duplicate said exactly the same thing.
“I want to live!” I shouted, pointing at myself. My completely-absolutely-100%-identical-at-the-quantum-level duplicate shouted the same words and pointed at himself instead of at me.
The entire reason I care, and am upset by the comparison with plastic forks, is that I think most people think less of the copies, even though each copy will be just as much a living, breathing human capable of suffering and joy as the actual original.
You’re not the first person I’ve encountered to use this sort of example to try and prove their point. I’m still not sure I even understand the worldview that you’re espousing well enough to try and actually engage with it properly.
It isn't a matter of thinking less of the clones--they'd probably be my best buds, probably closer than family. They'd also understand that we are not the same person.
My clones would understand that the example is rhetorical, meant to prove the point that we absolutely do not share consciousness: all things being equal, if I die, the movie in my head stops playing while the one in their heads keeps going, and that does nothing for me. And since all that is the case, it proves we cannot be the same, because our locations in space and time are different, there is no sharing of thoughts, and they can exist entirely independently of whether I exist or not, or even of whether the clones have clones.
> They do though. Particles have a unique path through time and space that cannot be replicated.
That path is not encoded in the particle state. If you brought me two samples of isotopically pure carbon, one sourced from the earth (presumably there since the planet formed) and another synthesized in a fusion reactor, and I put them into a black box that shakes, whirrs, spits out two equally sized boxes of carbon and then destroys itself, you would not be able to tell whether it mixed them or simply passed the original samples through.
This kind of information can only be inferred from ensembles of particles and even then only under some assumptions about the underlying processes.
We only recognize fossils by radiometric dating, the rock layers they're in, their mineralogy and so on, never by anything encoded in individual particles. So if you can assemble things atom by atom, then with enough effort you could make a perfect forgery. And assembling things atom by atom, well, that's what this supposed teletransportation "paradox" is about.
> Hypothetically, if a gunman has you, and a clone of you in a room and is going to kill one of you, are you saying you have no preference which dies?
Assuming the copy was made moments ago, that I am 100% confident this is indeed a perfect copy (and that kind of confidence is rather hard to come by), and that it is entirely inevitable that exactly one of me will die, then it does not really matter which one it is. Intuition does not work in these cases because such scenarios do not occur all that often in real life. Alter the situation slightly and my preferences would start to shift.
> If these things are the same, how is it he could kill the clone in a different room and you wouldn't know?
You seem to be ignoring the point about copies diverging over time (I explicitly mentioned that in my previous post). If I were copied and then immediately thrown into a cell, and then one of the copies got killed, I would not be able to tell which one was the original and which one the copy. Thus the distinction is irrelevant at that moment. I would be more concerned with my redundancy getting reduced further; hitting zero means irrecoverable loss.
> Similarly, I still arrive even if ships identical to the original state sink while I am en route.
Let's assume you're actually traveling inside a hotel container on a container ship. Given some gentle crane action while you had the container doors closed, you wouldn't be sure whether they just moved your container around within the same ship or moved you onto another ship that looks identical.
The paradox only exists if you assume that location and path through space-time are not properties. Since these properties cannot be copied, no "identical copy" is possible.
This isn't a divergence--this is an uncopyable property.
Re: gentle crane action, it doesn't change the fundamental truth--it isn't my belief that gives me identity. I don't claim to know whether I'm a clone, or to be better than the clone; only that a clone and the original are not the same.
“The” is your convention — I don’t think the definite article “the” would remain appropriate once a hypothetical perfect copy existed.
(Separately: the Mona Lisa has been restored many times, and apparently some of these restorations removed the upper surface of the paint while others retouched it, so the Ship of Theseus could also apply in principle.)
From an outsider perspective, sure. But don't you have an inner mental world? I don't understand, if you are conscious, why you think a copy of your consciousness that you have no access to would offer you anything, or in any way be "you." There could be 5000 copies of you a block away and you'd have no idea. In a sense, aren't "you" the passenger "on" the Ship of Theseus? Whatever "you" is, you can trace the path of the ship across time from beginning to end, and "you" only have access to the scope of your own instance of consciousness.
> I don't understand, if you are conscious, why you think a copy of your consciousness that you have no access to would offer you anything, or in any way be "you."
I don’t know why you think I’m saying that it does.
I’m saying that if this body were to die, and a backup of my brain state (made almost immediately before this body’s death) was copied into a new body or simulated on a computer, that would be no different for the entity that wakes up than it was for me to wake up from general anaesthetic having only half of the memory of the countdown the nurse walked me through before I went under.
(Or that weird time when I was right next to a CO2 cylinder when it exploded, resulting in differently sized gaps in my visual vs. auditory episodic memory).
Conversely, the me which did die would absolutely experience that death, would be scared, would try to avoid it — the me that wakes up afterwards might be glad to have missed it or might scream immediately, depending on when the death and the backup happened, but either way the death would still be real.
Killing a copy would also mean that an entity experiences death.
None of these are natural things to think about of course, so there’s no reason to expect that any of our own intuitions should even be close to correct, let alone the most comfortable ones.
A better word isn't all we're lacking. We don't have a clear idea of what that thing is, or whether it even exists.
> Clearly you won't be having the experiences of both bodies, would you?
Whether there is a "you" distinct from both bodies is the actual question to be answered here. If the experiment were done, my guess is that both bodies would swear that they were the original -- in much the same way, perhaps, that you would swear that you are the same "you" you were yesterday.
If you take a materialist view of consciousness, it is really no different from falling asleep and waking up. There is also no continuity of consciousness between the states in that case.
I think the best way to think of it is to assume that when you clone a brain, consciousness splits between the two copies.
However, in this case, one of the two conscious paths doesn't really exist, since the procedure is destructive. Therefore it seems subjectively probable to you that you will wake up in a lab in a future century, if that ever comes to pass.
Otherwise, you just die like a regular schmuck.
It seems like it's worth a try, since the alternative is certain death and non-existence.
I agree, it's analogous to Pascal's wager. I'm not sure what it being free or costing money has to do with it. If you're about to die, arguably you have less concern about your savings account.
> A terminal patient choosing brain preservation with the hope of future revival via mind uploading is making the same type of rational judgement – faced with the alternative of oblivion I choose to undergo an uncertain surgical procedure that has some chance of restoring most of the unique memories that I consider to define ‘me’ as an individual. Hopefully this makes clear that I am rejecting a ‘magical’ view of the self. An individual’s mind is computational and, just like with a laptop, an imperfect backup copy is better than complete erasure.
Doesn't this argue against the entire brain preservation enterprise? That is, without a "magical" view of selfhood, why attempt to preserve a partially faithful replica of one's self instead of finding other ways to do the things that you'd want to do in the future once revived?
I don't really back up my laptop in a conventional sense. I do git pushes of my git clones, I copy some files to rsync.net, I do a lot of work on cloud services like Google Docs and Trello, etc. A lot of what's on my laptop is transient. This is nice, because I'm not backing up a Mac; if I decide to run Debian or switch to a Chromebook or whatever, I can still achieve my high-level goal of not losing work without the low-level implementation of restoring a Mac. And certainly I don't back up servers at work in the conventional sense, either; most of those "servers" are now just Kubernetes pods anyway, represented declaratively, and that's a lot better than a backup.
I think in the same sense, I do have a plan for immortality, and that plan is to change the world for the better while I am alive, now, as conventionally defined, in lasting ways. I don't really know what I would do if I were resurrected many centuries in the future. (I would expect at least as much change in the world as between now and many centuries in the past, and I can't really imagine even the greatest thinkers or doers or heroes of ages past productively helping the world today. Should Arthur return from Avalon to save Britain today, he'd have a lot of trouble recovering the throne in a largely pro-democratic society, and he'd have no idea what to make of "Brexit.")
Meanwhile, there's quite a bit I can do today to improve the world, to improve the lives of others, to try to improve by a fraction of a percent the chances of human society even existing a few centuries hence, etc. My self - my life and physical conscious existence - is just a tool for accomplishing whatever goals I have; it's not the goal itself. My laptop is also a tool; if I can keep doing the work that was on my laptop, I don't need a clone of the laptop itself.
It seems to me, then, that the only argument for brain preservation - for attempting to preserve one's "self" into the future and for investing in the ability to make it happen - is seeing one's self in this "magical" way: believing that there's more value in the very fact of one's existence - and even in a partial and inaccurate continuation of that existence - than in what you do with that existence.
(And it does not save you from having to influence the world and engineer its future. At the least, as we can see, you have to spend a fair amount of your life today convincing society that it should develop in a way so that, in the future, they build the means to restore you.)
I'm pretty sure historians from every category would positively salivate over the prospect of being able to interview an actual person from a few hundred years ago. And it isn't like they'd stop being interested at just one.
And in terms of work one can do to improve the world - the tools available today amplify the work someone can do by orders of magnitude compared to centuries past, particularly mental work. I see no reason to think this trend won't continue. Who is to say that, given enough time and development, a society of the future might not have an entire pathway for the freshly revived to go back to school, so to speak, and become able to do things those of us alive now can only dream of?
The first one is a good point, although it doesn't quite sound to me like the folks advocating for preservation are doing so with the intention of being valuable to future generations for the interests of those generations. If they were, then they'd get themselves comfortable with being revived multiple times (either re-preserved or terminated and re-cloned, whatever's easiest for future society) and would prefer to be revived as far in the future as possible, and they'd accomplish what they intend during this life. But most of the motivation I see around this seems to be focused on trampling down death by cryopreservation and continuing to live your life in the future.
You could also imagine that, in a future where we are close to being able to revive human brains, we can just query human brains via simulation without bringing them back to life. The ethics of that are different, but - at least with consent from the person while they were alive - it doesn't seem obviously wrong.
Re work improving the world - why do we imagine that someone from the present would be more effective at using those tools than someone from the future? Again, take the example of Arthur: if he returned, what would he do? What would you have him do? Or if even Isaac Newton were to return, would he be able to keep up with the brightest minds of the present generation of students who all took calculus in high school? I'm not doubting that he'd still be a sharp thinker, but would he be doing anything groundbreaking and world-changing like he did in his natural life, or would he "just" interview well at FAANG?
I'm not disputing that both of them would do things beyond their own wildest dreams during their lifetimes. Honestly, I think Arthur would have a lot of fun being in the House of Lords (which is probably where they'd put him) and Newton would have a blast being an entry-level engineer at FAANG. I'm disputing that they would do anything beyond what the natural-born of today would do; and unless you have a sentimental connection between your revived self and your old self, there's not really a point in one more average or even above-average person existing in the future.
> The first one is a good point, although it doesn't quite sound to me like the folks advocating for preservation are doing so with the intention of being valuable to future generations for the interests of those generations.
They probably aren't. But that isn't incompatible with both them desiring to continue to live and them contributing to whatever society they are reborn into. After all, people today are primarily concerned with their own lives first and foremost, yet we manage to work together to build societies just the same.
> why do we imagine that someone from the present would be more effective at using those tools than someone from the future?
Diversity of thought. That doesn't mean that revived-person-x is going to be better at any particular productive activity than someone who was born into the future in question. But simply by being from a different era, I like to think that there is potential to contribute meaningful value. Or, put another way: while it is true that the world benefits greatly from those who are the best of the best, it is also true that there is a place for a large number of competent but not exceptional people to do the bulk of the work, and that their lives have positive value, too.
Would it not be more feasible, more robust, and more effective to ensure diversity of thought for the future by building mechanisms into society to sustain them on their own (e.g., value and uphold communities that take both strongly positive and strongly negative views towards modernity) instead of relying on developing the technical ability to unfreeze people from the past and then promptly putting them to work in average jobs?
(It seems silly, leaving aside the ethics of it, that we may find ourselves in the position of wishing we had the "diversity of thought" of peoples that we had long since either wiped out or pushed to assimilate into what's rapidly becoming a single global dominant culture.)
I mean, it rather sounds like we have changed the pitch from "If you desire, you can avoid death" (and the specific form of "If you have a terminal disease at a young age, we can freeze you until the disease can be cured, so you can live out the rest of your life") to "It is good for society that we build mechanisms to clone large numbers of people from the past into the present to lead average lives," which at the very least is a whole different ethics ballgame.
For one, there's the question of what happens if it turns out that we can clone people from the past, en masse, even without them having been prepared specially. (Perhaps certain types of embalming cause enough stability in brain structure. Perhaps we can revive people who froze to death, like the hundreds on Mount Everest or similar mountains.) Going back to the idea that we only need a partial restoration and that there's no magical "self," is it ethical to clone them, if it is helpful to present society? Is it ethical to clone parts of them, if that's a technology we develop and it's beneficial?
Also, it seems pretty unlikely that humankind is on a path to having a vastly lower population than we do today, and we have yet to be assured that we will be able to colonize other planets. Lives have value, but once we have reached the capacity of Earth, how do we weigh the potential value of cloning millions of people from the past?
Baby boomers run the world, and they're getting sick and dying. They've generally already achieved all that they're going to achieve, and survive by taxing generations who are still in their productive years (or are abandoned by comically bad safety nets).
Maybe after they're gone, power will be distributed a little more equally throughout the remaining generations - if only because a lot of them had children very late in life. Until then, there's going to be a lot of money going into immortality schemes, no matter what their value is philosophically.
If you ask me, I don't even understand the value of making a permanent mark on the world, even if that mark is positive. My job is to help the world think, to spread good information, and to minimize the material harm I cause. To disappear quietly and completely is to be biodegradable. My main problem with preserving rich people's brains is that it involves burning real resources to keep something of dubious value around. Nobody needs you 100 years from now; there will be plenty of people.
What an extraordinary pain to read. This really demonstrates why Twitter is awful for debates. It could be made readable, but you'd have to rewrite the entire thing, more or less, to pull out the individual points, consolidate broken-up thoughts, and create consistent formatting per author, so you could follow it instead of facing a sea of text with occasional implicit author switching.
(It also demonstrates why you shouldn't have light-gray text on gray background, pale green links, and dashed underline links.)
I apologize and wish I could have presented it better. I first compiled this in 2019 and had been planning to summarize it more. Alas, I recently realized that I wouldn't have the energy for that and that I should simply publish as is.
I'm surprised that people care at all, but perhaps I underestimate how many are as interested in the content as I am.
Regarding the formatting, I will try to update the background to something more readable using WordPress, although I obviously lack your skills in this area. In the meantime, you're free to copy it onto your website or elsewhere.