From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!zaphod.mps.ohio-state.edu!think.com!Think.COM!moravec Tue May 12 15:49:50 EDT 1992
Article 5497 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!zaphod.mps.ohio-state.edu!think.com!Think.COM!moravec
From: moravec@Think.COM (Hans Moravec)
Newsgroups: comp.ai.philosophy
Subject: Re: AI failures
Date: 8 May 1992 21:58:47 GMT
Organization: Thinking Machines Corporation, Cambridge MA, USA
Lines: 100
Distribution: world
Message-ID: <uetinINNco5@early-bird.think.com>
References: <1992May7.152447.7930@waikato.ac.nz> <727@ckgp.UUCP> <uc2m8INNn5d@early-bird.think.com> <1992May8.155052.13848@psych.toronto.edu>
NNTP-Posting-Host: turing.think.com

In article <1992May8.155052.13848@psych.toronto.edu>, michael@psych.toronto.edu (Michael Gemar) writes:
|> moravec@Think.COM (Hans Moravec) writes:
 h> Soon after (AIs) are possible at all, they will be super plentiful
 h> ... it will be necessary to throw them away ...
 h> Otherwise the world will be up to its armpits in
 h> useless individuals

|> Presumably similar arguments could have been made when slavery existed.
|> "There are just too damn many of them - it is absolutely necessary to
|> kill them when they're no longer needed.  Otherwise..."  For that
|> matter, I see no reason why the same argument would not apply to 
|> overpopulation in Third World countries.
|> If you are going to adopt such a position for the sake of expediency, 
|> you should realize just how radical the ethical implications are.
|> I personally think that such a position is indefensible on *any* 
|> ground other than sheer expedience, which is of course no *moral*
|> reason at all.

     Expediency and morality both arise from necessity, and are far less
     different than you imagine.
     Morality is a name for a system of social conventions that modulates
     behavioral predispositions of individuals (and groups) in a way that 
     (ideally) improves the group's well being.  I don't kill or steal from
     my neighbor (even when I may be able to get away with it), and my
     neighbor returns the courtesy, and we both benefit from the mutual
     consideration, and so does the neighborhood.  When the physical ground
     rules of existence change, though, so will the behaviors that
     produce the maximum collective benefit.
     You imply that killing is somehow prohibited today.  There are many
     situations (self defense, war, capital punishment, abortion) where
     present society condones elimination of individuals that are judged
     to have a negative net worth.  There are many other situations
     (termination of medical or rescue efforts, limits on resources
     expended in safety precautions, hazardous employment, limiting
     immigration from third-world countries) where an increased chance
     of death is condoned.  It would be absolutely necessary to create
     many more such situations if individuals didn't have the good grace
     to die of old age of their own accord.
     In fact, it will be.

 h> I can see the same thing happening in real life. 
 h> Putting an AI program into inactive "suspended animation" is
 h> surely ok.

|> Even if it doesn't want to go?  How would *you* feel if your
|> employer said, "Well, Hans, we don't need you now, so we're going
|> to put you to sleep for an indefinite period."? 

     If my employer were also my creator, and my sole means of support
     (I resided in my employer's body, as it were), then I am a component
     of my employer, and it is my employer's business how much, if any, of
     its limited resources it should grant me.
     If I were a self-supporting entity, then it would be a matter for
     negotiation.  As it is in today's world.

 h> But then there will come a time when storage space is
 h> low, and someone notices that the file Moriarity.ai is taking
 h> up 10 terabytes, and hasn't been accessed in five years.

|> ...or that Hans Moravec's cryogenic sleeper is taking up space, and
|> he hasn't been needed in five years... 
|> So, after broadcasting "does anyone need Moravec?" and receiving no
|> positive responses, the sleeper manager dumps out the corpse.  Maybe
|> some useful organs are scavenged.  Just good housekeeping...

     If my estate hasn't been paying the rent, then this is exactly the
     right answer.  In fact, it's exactly what happens today, to people
     in cryonic suspension, and also to people who need impossibly expensive
     medical procedures to continue to live. 

 h>  Some day human minds may be copied as easily as AIs, a process
 h>  When we grow new minds as easily as our bodies grow new cells, we must
 h>  also be prepared to destroy old minds as our bodies destroy old cells.
 h>  The alternative is suffocation.

|> And when we can grow bodies as easily as we grow new cells, the same
|> would also apply, I suppose.
|> I find such speculation yet another indication that AI folks don't
|> *really* think that what they're doing is creating *REAL* minds, entities
|> that are equivalent to humans mentally.  If they did, I don't see how
|> they could possibly suggest such things as the above...
|>                    - michael

     And I find your comments childishly naive.  Placing an effectively
     infinite value on a self-aware entity's existence is a convenient
     counterbalancing fiction in a world where tribally-forged instincts
     are to value a stranger's life very little.  It is a fiction that
     can be maintained most of the time because people die anyway, so
     it's usually easiest to just wait, and because maintaining even an
     unproductive person is relatively cheap, and there are not too many
     of them.  This fiction will break down when individuals become
     potentially immortal, or when they can be reproduced (reproduce
     themselves) cheaply in quantity. 
       Biological evolution solved the problem of providing room for new
     (sometimes improved) individuals by giving us a prearranged death
     by old age.  If we change the rules, we will have to provide a
     substitute solution, because the problem will remain. 

			-- Hans
