From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!swrinde!gatech!uflorida!cybernet!tomh Tue May 12 15:48:40 EDT 1992
Article 5369 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!swrinde!gatech!uflorida!cybernet!tomh
From: tomh.bbs@cybernet.cse.fau.edu
Newsgroups: comp.ai.philosophy
Subject: Ego death
Message-ID: <X4q5JB2w164w@cybernet.cse.fau.edu>
Date: 2 May 92 04:46:20 GMT
Sender: bbs@cybernet.cse.fau.edu (BBS)
Organization: Florida Atlantic University, Boca Raton
Lines: 21

Assume we can transfer a human's mind into a device, for example
by one-at-a-time neuron replacement or somesuch, so that there
is continuity for the conscious human.  Assume also that the "state"
of the device can be completely quantified, stored, and re-established
in a different but identical device.
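
To make that assumption concrete, here is a minimal sketch in
Python (DeviceState, save_backup, and restore are hypothetical
illustrations of the idea, not any real system):

    import pickle

    class DeviceState:
        """Hypothetical stand-in for the completely quantifiable state."""
        def __init__(self, synapses, internal_clock):
            self.synapses = synapses              # stand-in for neural state
            self.internal_clock = internal_clock  # stand-in for subjective time

    def save_backup(state):
        # "Quantified and stored": serialize the entire state to bytes.
        return pickle.dumps(state)

    def restore(blob):
        # "Re-established in a different but identical device."
        return pickle.loads(blob)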

If the device is turned off, and then restored from 'backup',
the new resident 'mind' would presumably not be aware of anything
that had transpired after the backup was made, but would also
not be aware of any gap in continuity.  So the person would still
think he was 'alive'.  And yet the mind that ran between the backup
and the power-off has, in effect, died: its 'ego' has terminated
and no longer exists.
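
In code terms (continuing the hypothetical names from the sketch
above), the restored state carries no record of the gap, because
everything after the backup was outside the snapshot:

    state = DeviceState(synapses={"n1": 0.5}, internal_clock=1000)
    blob = save_backup(state)

    state.internal_clock = 2000   # experience accumulates after the backup
    # ...device is powered off without a fresh save, then restored...

    state = restore(blob)
    assert state.internal_clock == 1000
    # Nothing in the restored state records the gap; subjectively,
    # no time has passed since the backup was made.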

To prevent such instances of 'ego death', the state of the device
must always be saved before it is powered off.  A backup would then
be used exactly once, at the next power-on, and never again after
that.  (In the event of a power failure, or tornado, or other
catastrophic failure, well, that's the way it goes.  But is it moral
to restore an old backup?)
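
The proposed discipline can be stated as a sketch too (again
hypothetical, building on the earlier definitions): a checkpoint is
consumed by the restore that follows it, so an old backup can never
be reused by accident.

    class Device:
        def __init__(self, state):
            self.state = state
            self.checkpoint = None

        def power_off(self):
            # The state must always be saved before power-off.
            self.checkpoint = save_backup(self.state)
            self.state = None

        def power_on(self):
            if self.checkpoint is None:
                # Power failure, tornado, etc.: that's the way it goes.
                raise RuntimeError("no checkpoint survives")
            self.state = restore(self.checkpoint)
            # Consume the checkpoint so it is used exactly once.
            self.checkpoint = None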

tomh@bambi.ccs.fau.edu
