From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!news.cs.indiana.edu!sdd.hp.com!usc!sol.ctr.columbia.edu!destroyer!news.iastate.edu!rjk Tue May 12 15:48:41 EDT 1992
Article 5371 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!news.cs.indiana.edu!sdd.hp.com!usc!sol.ctr.columbia.edu!destroyer!news.iastate.edu!rjk
From: rjk@iastate.edu (Russell J Kraemer)
Subject: Re: Ego death
Message-ID: <1992May2.175903.11163@news.iastate.edu>
Keywords: recommendation
Sender: news@news.iastate.edu (USENET News System)
Organization: Iowa State University, Ames, IA
References: <X4q5JB2w164w@cybernet.cse.fau.edu>
Date: Sat, 2 May 1992 17:59:03 GMT
Lines: 36

In article <X4q5JB2w164w@cybernet.cse.fau.edu> tomh.bbs@cybernet.cse.fau.edu writes:
>Assuming we can transfer a human's mind into a device, such as
>by one-at-a-time neuron replacement or somesuch, so that there
>is continuity for the conscious human.  Assume that the "state"
>of the device can be completely quantified, stored, and re-established
>in a different but identical device.
>
>If the device is turned off, and then restored from 'backup',
>the new resident 'mind' would presumably not be aware of anything
>that had transpired after the backup was made, but would also
>not be aware of any gap in continuity.  So the person would still
>think he was 'alive'.  And yet, the old copy has, in effect, died.
>The old copy's 'ego' has terminated and no longer exists.
>
>To prevent such instances of 'ego death', the state of the device
>must always be saved before being powered off.  Backups would never
>be used after the initial use, the next time the device is turned on.
>(In the event of power fail, or tornado, or other catastrophic
>failure, well, that's the way it goes.  But is it moral to restore
>an old backup?)
>
>tomh@bambi.ccs.fau.edu

Just an idea here: you might want to read a science fiction
novel called Buying Time by Joe Haldeman. It deals with this question
specifically, in that a character has his mind "backed up" in a
Turing box. The morals and problems associated with it
are dealt with in very interesting fashion. Highly recommended,
IMHO.



-- 
rjk@iastate.edu                 

I think therefore I.......shoot, was I saying something?


