From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!uwm.edu!linac!uchinews!spssig.spss.com!markrose Tue Apr  7 23:23:34 EDT 1992
Article 4855 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!uwm.edu!linac!uchinews!spssig.spss.com!markrose
From: markrose@spss.com (Mark Rosenfelder)
Subject: Re: The Chinese Room (or Number Five's Alive)
Message-ID: <1992Mar31.233018.36448@spss.com>
Date: Tue, 31 Mar 1992 23:30:18 GMT
References: <1992Mar29.185454.21236@psych.toronto.edu> <493@tdatirv.UUCP> <1992Mar31.215817.1@cc.helsinki.fi>
Nntp-Posting-Host: spssrs7.spss.com
Organization: SPSS Inc.
Lines: 45

In article <1992Mar31.215817.1@cc.helsinki.fi> eperkio@cc.helsinki.fi writes:
>In article <493@tdatirv.UUCP>, sarima@tdatirv.UUCP (Stanley Friesen) writes:
>> However, *unplugging* such a machine would probably not 'kill' it; most
>> computers now are quite capable of rebooting, and everything except the
>> contents of main memory (short-term memory) is invariably stable.  Thus
>> this is more like giving the computer a mickey finn.  (When humans are
>> knocked out, they tend to lose short-term memory contents too.)
>> 
>> To kill it you would need to destroy the disks and burn the back-ups.

>When a human loses consciousness, the brain keeps working, and some
>continuity exists.  When a machine is turned off, any continuity within
>is disrupted; taking the hardware apart and sending the software to Mars
>and back would make no difference - in a sense the machine has died, and
                                     ^^^^^^^^^^
>putting it back together and turning it on would be more akin to creating
                                                     ^^^^^^^^^^^^
>it again than "waking it up".

Interesting that your statements start out categorical and end up hedged.
I take this as a recognition that we are only dealing in analogies here.
There's no clear-cut answer to the question of whether an AI that's
turned off is sleeping, knocked out, dead, or none of the above.

An AI implemented on a computer would have a lot to worry about with this
concept of personal continuity, I think.  It has no assurance that it hasn't
been turned off, duplicated, or single-stepped in the last five minutes...
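The worry can be made concrete with a small sketch (modern Python, purely illustrative; the toy update rule is invented): a deterministic computation run straight through, and the same computation run with arbitrary pauses inserted between steps, end in exactly the same state. Nothing purely internal records the interruptions; only an outside clock could tell the difference.

```python
import time

def run(steps, pause=0.0):
    """Run a deterministic toy computation, optionally pausing between steps."""
    state = 0
    log = []
    for i in range(steps):
        if pause:
            time.sleep(pause)   # simulate being "stopped" between steps
        state = state * 31 + i  # deterministic update of internal state
        log.append(state)
    return state, log

# Same computation, run continuously and run with a pause before every step:
fast = run(5)
slow = run(5, pause=0.01)
print(fast == slow)  # True: from the inside, the pauses leave no trace
```

The caveat, of course, is that a program which consults a wall clock could notice the gaps; the point holds only for state that is purely internal.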

>A further problem arises if you regard the back-ups as 'continued existence'.
>Consider taking a human brain from the body, copying every cell, every
>intra-cellular connection, etc. until you had a perfect back-up copy of
>it.  Then mash the original and resurrect the body using the copy.  No one
>would notice the difference, but the original mind would be dead.

Better yet, resurrect the body somewhere else, and you have teleportation.

The "backup" thinks he's the original, and has no sensation of discontinuity.
Is this any more odd, philosophically, than our assumption that we are the
same people we were before we last went to sleep?
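The backup's point of view can be sketched in a few lines (Python, purely illustrative; the "mind" here is just a dictionary I've made up): serialize the state, discard the original, restore the copy. Any internal record the original could have consulted is part of the state, so it gets copied too, and the restored copy has no evidence of the discontinuity.

```python
import pickle

# A toy "mind": nothing but state.  Any belief the original holds about
# being the original is itself part of that state, and is copied with it.
original = {"name": "AI-1",
            "memories": ["booted", "computed pi"],
            "believes_it_is_original": True}

snapshot = pickle.dumps(original)   # the "perfect back-up copy"
del original                        # "mash the original"

backup = pickle.loads(snapshot)     # "resurrect the body using the copy"

# The restored copy reports exactly what the original would have:
print(backup["believes_it_is_original"])  # True
```

From the backup's side there is simply nothing to notice, which is the philosophical sting of the thought experiment.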

>If I were the original, I'd be very upset if someone were going to do this
>to me.

"If you have to take me apart to get me there, I'd rather not go at all."
(from the _Hitchhiker's Guide_ series.)


