From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Mon May 25 14:04:46 EDT 1992
Article 5582 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: AI failures
Organization: Department of Psychology, University of Toronto
References: <umpm0INNpv8@early-bird.think.com> <1992May12.213451.5026@csc.canterbury.ac.nz>
Message-ID: <1992May12.155749.2848@psych.toronto.edu>
Date: Tue, 12 May 1992 15:57:49 GMT

In article <1992May12.213451.5026@csc.canterbury.ac.nz> chisnall@cosc.canterbury.ac.nz (The Technicolour Throw-up) writes:

[in reference to "killing" duplicate AIs]

[an isolated base has a toxic spill that must be cleaned up or everyone
 will die.]

>What to do?  Ordering your crew, who are all subordinate to you, to
>remove this substance will kill them and you'll be up on murder charges.
>Even if they all sign contracts stating that they're willingly doing
>this, thereby letting you off any legal difficulties, you'll still have
>your own personal qualms and you'll probably be haunted by this for some
>time.  You could sacrifice yourself, saving your crew, but you then have
>an hour to face the fact that you are about to die by your own hand (so
>to speak).
>
>Now it so happens that your base has a neat whizzo Star Trek-like
>duplicator that can be used to duplicate, almost instantly, and with no
>deleterious side effects to the base (or its energy supply), anything,
>living or dead, that is placed within it.  Amongst other things this can
>be used to duplicate people.
>
>It seems to me then that the solution to this problem is to run off a few
>dozen copies of yourself and have these copies clean up the toxic spill.
>Although most people would face inner mental turmoil at the thought of
>either committing suicide or of knowingly sending other people to their
>deaths, I suspect that there would be very little turmoil (or perhaps
>even none at all) in sending copies of yourself to certain death.

Once they've existed for the briefest period of time, your "copies" are
no longer copies.  They begin to diverge from *you* the moment they
step out of the duplicator.  They become individuals in their own
right.

Besides, if they are all perfect copies, why should *you*, the original,
be the one who lives?  Why do *you* get priority?  (Does the possibility
that *you* might have to do the cleanup and go to certain death make this
suggestion less appealing?)

>Part of this turmoil arises because we are in effect destroying
>individuals, and we hold it as a right that individuals should not be
>deliberately destroyed.  But duplication removes this worry.  As long as
>either the original or one of the copies remains around afterwards,
>there is still continuity of that individual.

Not at all, as once the duplicate is made, it diverges from the original.
It doesn't *stay* a duplicate for long...

> Can we also solve this scenario
>by duplicating the crew and sending the duplicates outside?  Yes, but
>only because the crew agree to it.  The point I'm trying to get across
>here is that although we would face inner turmoil at giving lethal
>orders to someone else, or at having someone else give lethal orders to
>us, few of us would, I suspect, face any turmoil at giving lethal orders
>to ourselves or at having ourselves give lethal orders to us.  If you
>can't trust yourself, who can you trust?

This reminds me of the Calvin and Hobbes series in which Calvin creates
duplicates in a modified transmogrifier so that he can laze about while
his clones go to school.  Once out, of course, the clones agree to nothing
of the sort ("OK, Calvin, let's have a vote..."), and go wreak havoc
in a typically Calvinesque fashion.  There is a profound insight here...


- michael
