From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!mips!swrinde!gatech!bloom-beacon!eru.mt.luth.se!lunic!sunic!news.funet.fi!hydra!klaava!cc.helsinki.fi!eperkio Tue Apr  7 23:23:30 EDT 1992
Article 4850 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!mips!swrinde!gatech!bloom-beacon!eru.mt.luth.se!lunic!sunic!news.funet.fi!hydra!klaava!cc.helsinki.fi!eperkio
From: eperkio@cc.helsinki.fi
Newsgroups: comp.ai.philosophy
Subject: Re: The Chinese Room (or Number Five's Alive)
Message-ID: <1992Mar31.215817.1@cc.helsinki.fi>
Date: 31 Mar 92 19:58:17 GMT
References: <7341@uqcspe.cs.uq.oz.au> <1992Mar29.185454.21236@psych.toronto.edu> <493@tdatirv.UUCP>
Sender: news@klaava.Helsinki.FI (Uutis Ankka)
Organization: University of Helsinki
Lines: 34

In article <493@tdatirv.UUCP>, sarima@tdatirv.UUCP (Stanley Friesen) writes:
> In article <1992Mar29.185454.21236@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:
> |Well, I will perhaps feel differently about this issue once I see AI types
> |worrying over the moral implications of unplugging their machines.  Until
> |*they* take this possibility seriously, I see no reason for me to.
> 
> [ A part of sarima@tdatirv.UUCP's answer deleted.]
> 
> However, *unplugging* such a machine would probably not 'kill' it; most
> computers now are quite capable of rebooting, and everything except the
> contents of main memory (short-term memory) is invariably stable.  Thus
> this is more like giving the computer a Mickey Finn.  (When humans are
> knocked out they tend to lose short-term memory contents too.)
> 
> To kill it you would need to destroy the disks and burn the back-ups.
> -- 
> ---------------
> uunet!tdatirv!sarima				(Stanley Friesen)

When a human loses consciousness, the brain keeps working, and some
continuity exists.  When a machine is turned off, any continuity within it
is disrupted; taking the hardware apart and sending the software to Mars
and back would make no difference - in a sense the machine has died, and
putting it back together and turning it on would be more akin to creating
it again than "waking it up".
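The quoted claim about reboots can be sketched in modern terms (a Python illustration of my own, not anything from the original exchange): state held only in a process's memory is gone after a "power-off", while state written to disk can be read back afterwards.

```python
import os
import tempfile

# In-memory state: the analogue of "short-term memory",
# which exists only while the process is running.
short_term = {"current_thought": "the Chinese Room"}

# Persistent state: written to disk, so it survives a power-off.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("long-term memory contents")

# Simulate turning the machine off: the in-memory state vanishes.
del short_term

# After "rebooting", the disk contents are still there.
with open(path) as f:
    restored = f.read()
print(restored)  # prints "long-term memory contents"

os.remove(path)
```

Of course, whether reading the file back counts as *continuity* of the same machine, rather than the creation of a new one with the old data, is exactly the point in dispute.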

A further problem arises if you regard the back-ups as 'continued existence'.
Consider taking a human brain from the body, copying every cell, every
intra-cellular connection, etc. until you had a perfect back-up copy of
it.  Then mash the original and resurrect the body using the copy.  No one
would notice the difference, but the original mind would be dead.
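The copy-and-mash thought experiment has a loose software analogue (again my own Python sketch, not part of the original argument): a deep copy is indistinguishable from the original by inspection, yet it is a distinct object, and destroying the original leaves the copy untouched.

```python
import copy

# A "brain" with nested internal structure.
original = {"cells": [[1, 2], [3, 4]], "memories": {"name": "original"}}

# The perfect back-up: every nested structure duplicated.
backup = copy.deepcopy(original)

# No one could tell them apart by examining their contents...
assert backup == original          # indistinguishable by inspection

# ...yet they are not the same object.
assert backup is not original

# "Mash" the original: the back-up is unaffected.
original.clear()
assert backup["memories"]["name"] == "original"
```

Equality (`==`) plays the role of "no one would notice the difference"; identity (`is`) plays the role of being the same continuing mind. The sketch shows the two can come apart.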

If I were the original, I'd be very upset if someone were going to do this
to me.


