I'll be speaking at the Singularity Summit AU this coming Sunday afternoon, on the topic "Survival Beyond the Flesh" - which relates to the prospect of "uploading" rather than to anything of a more otherworldly or spiritual kind. I'm rather sceptical about uploading - though I can't rule it out totally, no matter how advanced our technology becomes. I spent yesterday working out what I could say meaningfully in a quite short slot, given that I'd devote at least three lectures to such a topic if teaching it from scratch as part of a philosophy course on issues such as mind and personal identity. I think I managed to work out something useful. This is a sufficiently slippery topic that I'll be relying quite heavily on the shortish paper I ended up writing. Hopefully I can make it interesting.
Please don't ask me to publish the paper on the internet. I am, among other things, a professional writer. Although I don't see an obvious market for this paper I can't simply throw away the intellectual property in something that just might give me a sale somewhere with a bit of tweaking (the paper is nowhere near original enough or rigorous enough or adequately grounded in the existing literature to be sent to an academic journal - it's meant for popular consumption). If you'd like to buy such a paper for a magazine or a book, please contact me.
The big issue, it seems to me, is why I would want to upload myself. Presumably it's to live longer and to gain certain advantages such as being able to think more quickly and powerfully. But that means I must be confident that I can look forward to enjoying those advantages. It's no use if the advantages will simply be enjoyed by a being somewhat like me. Thus, issues about personal identity, survival, and so on are inescapable, even if our conceptions of these things are hopelessly vague. There do seem to be situations where a psychological duplicate of me could be made and it would be pretty clear that I would not enjoy whatever experiences it has, so it's not as if there's no risk of things turning out like that. But what criteria do we use and where do we draw the line?
I don't have clear and convincing answers to these questions, and I don't believe anyone else has either - Derek Parfit's answers are fairly clear but I don't find them all that convincing. Still, I might at least be able to offer my audience some tools for thinking about it.
Recently, I got into a conversation with someone about the prospect of human teleportation, and the concern was raised that the consciousness at the end of the trip might not be the same as at the beginning - or rather, that the old consciousness died in the process and a new one was created with the same memories and the illusion of continuity. This got me thinking about day-to-day consciousness, and I began to speculate: what if some routine brain event, such as deep sleep, were essentially doing the same thing? Occasionally the old consciousness is shut down and a new one booted up with the illusion of continuous identity, such that a new mental being is born each morning. Or what if this regeneration happens constantly? After all, I only ever experience "right now". I have only a little familiarity with philosophy and cognitive science, but this train of thought helped renew my appreciation for how utterly mysterious consciousness still is.
Well, maybe some people would find it consoling that some copy of their memories lives on, but honestly I believe that many uploading enthusiasts may simply not have thought that through. It makes a lot of sense if you believe that some of your soul-mojo passes from your brain into the machine, but once you accept that our consciousness is an emergent property of our bodies, the whole thing falls apart quite quickly.
I have no expertise in this area, so my thoughts might be very naïve, but it has long seemed to me that the 'continuity of personality' problem is not only difficult to solve but difficult to prove solved. In earlier science fiction, devices like the Star Trek transporter and other teleporters seem to destroy the original body and create a replica at the point of arrival. In the case of uploading, once again presumably, the personality is transferred to a machine and then the 'original' destroyed. How is it possible to know afterwards whether the result is the transferred original or instead a copy indistinguishable from that original? To an outside observer there may be no objective way to tell. The duplicate will have all the original's memories and traits and so will behave and recall as if it were that person. But Russell's doubts remain: what if I were the original and looked forward to the benefits of transfer or upload, but then found only oblivion, while a duplicate who is not me, but who behaves and remembers exactly as I would, exists in my place?
Without a solid theory of consciousness it is perhaps too early for us to answer this question, but it seems to me to have faint echoes of Cartesian mind-body dualism. If consciousness is a phenomenon arising from the complexity of the substrate (the brain), then perhaps no sort of transfer is possible. Without the original brain the original consciousness simply ceases to exist.
I find that too much thinking about personal identity (and other deep philosophical questions) leads me in rather morbid directions. I'm then reminded of what David Hume wrote:
Most fortunately it happens, that since Reason is incapable of dispelling these clouds, Nature herself suffices to that purpose, and cures me of this philosophical melancholy and delirium, either by relaxing this bent of mind, or by some avocation, and lively impression of my senses, which obliterate all these chimeras. I dine, I play a game of backgammon, I converse, and am merry with my friends. And when, after three or four hours' amusement, I would return to these speculations, they appear so cold, and strained, and ridiculous, that I cannot find in my heart to enter into them any farther.
— David Hume (A Treatise of Human Nature)
I hope the summit will be enjoyable, and not too serious!
"The big issue, it seems to me, is why I would want to upload myself. Presumably it's to live longer and to gain certain advantages such as being able to think more quickly and powerfully."
Another reason might be that the idea of a copy of oneself living on in a computer or positronic brain or whatnot is "cool" or otherwise fascinating. Or because uploading, like Mount Everest, is "there." (Obviously, uploading isn't "there" already, but I'm presuming one is living in a world where uploading has become viable.) One's reasons for uploading need not be straightforwardly related to personal benefit.
As Marsie said:
"...it seems to me to have faint echoes of Cartesian mind-body dualism. If consciousness is a phenomenon arising from the complexity of the substrate (the brain), then perhaps no sort of transfer is possible. Without the original brain the original consciousness simply ceases to exist."
I'm an engineer. I have no truck with dualism. I don't get invited to philosophy conferences.
Russell, good luck and give 'em hell!
It seems like there should be at least some routes to uploading that don't run afoul of personal identity issues. E.g.: start with the technologically 'simplest' kind of thing, like a medium-bandwidth brain-computer interface, maybe just as a way of boosting sensory capability. We can currently do something like this, letting monkeys control robotic arms with a brain implant. Then, as technology improves, keep adding features, until we have a single connected system consisting of biological me, some fancy computer processing, and a backup copy of my brain. As long as all of the parts are connected to my neurons in a causally appropriate way (sharing functional properties with how my neurons connect to one another), I don't think there is any plausible step where identity gets fuzzy. When you want to upload once and for all, simply shut down the biological brain.
It should be noted that similar, though much simpler, things happen in neuroscience: personal identity seems okay in principle after a stroke (loss of part of a biological brain), for example, and new brain regions can come online (after removal of a tumor, for instance).
Strictly speaking you're right JJ - it might be nice in some sense to be uploaded even if there's no personal benefit. I'll be sure to make that point. But most people who advocate uploading seem to think that it's a route to obtain - that word probably deserves emphasis, obtain - longer life in a more durable form, etc.
I don't think this requires Cartesian dualism, exactly - that move has always seemed a bit quick to me - but I certainly don't think it's straightforward. As some of y'all are saying, there's nothing unchanging, like a Cartesian mind/soul or a Brahminical spiritual self, that can transmigrate from one body to another. Some other account, not involving outright Cartesianism, has to be given as to how I can get the benefit that I might be seeking. This will presumably involve a functionalist theory of mind, but it's not clear that functionalism is up to the task, even if it's true, as I'm prepared to assume.
I think this article on the subject is good:
I agree with Almond's main conclusions, though not with his "What Should We Do?" section.
One reason for wanting to upload your mental patterns that no one ever seems to mention could be the wish to have a 'save point' to revert to should things go awry in your life - e.g. you save your mental state before going in for a brain op, or asking out a woman, or trying crack cocaine. Thus if the rest of your life spiraled into abject misery, there would be a version of you that would not have those memories and pain.
Your uploaded self is as much "like you" as possible in the beginning or it is not an upload at all. Your argument seems to presume that a change of substrate with identity (whatever that really is) intact is not possible. Thus it begs the question.
Samantha, who are you addressing? It can't be me, because I have put no such argument either here or in the actual talk that I gave on Sunday. Nothing that I have said in either forum assumes that identity (whatever that is) cannot be transferred to a non-biological substrate. I don't make that assumption at all; indeed, I think it may be possible.
But I take it you were addressing someone else ...