From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!zaphod.mps.ohio-state.edu!think.com!news!moravec Tue May 12 15:50:07 EDT 1992
Article 5526 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!zaphod.mps.ohio-state.edu!think.com!news!moravec
From: moravec@turing.think.com (Hans Moravec)
Newsgroups: comp.ai.philosophy
Subject: Re: AI failures
Date: 10 May 92 00:45:28
Organization: Thinking Machines Corporation, Cambridge MA, USA
Lines: 78
Message-ID: <MORAVEC.92May10004528@turing.think.com>
References: <uc2m8INNn5d@early-bird.think.com>
	<1992May8.155052.13848@psych.toronto.edu>
	<uetinINNco5@early-bird.think.com>
	<1992May10.003028.19333@psych.toronto.edu>
NNTP-Posting-Host: turing.think.com
In-reply-to: michael@psych.toronto.edu's message of 10 May 92 00:30:28 GMT



michael@psych.toronto.edu (Michael Gemar) writes

> You imply that societal decisions will always be moral ones.  What about
> South Africa?  What about slavery in the U.S.?
...
> Well, your parents were *your* creator, and presumably until you were
> about 18 your sole means of support.  Does this mean that infanticide
> is "moral"?
 ...
> And so presumably those who are using heart-lung machines or dialysis
> machines who go into arrears can *morally* be unplugged...

I didn't imply that social rules are by any measure perfect: they're
more like the cultural equivalent of genes: some work better than others,
and it's often hard to tell which in the short run: it takes constant trial
and error to keep things reasonably functional.

Slavery, infanticide, and euthanasia of aged family members who could
no longer be supported were sanctioned by many societies in the past.
These practices were probably essential for the early development of
civilization.  Other means of population control, increasing wealth, and
mechanization gradually displaced them.  Saying that the old solutions were
"immoral" is just name calling.  Of course, name calling is an effective
tool in political debate, and sways opinions.
What is considered moral behavior changes with time and circumstance,
and is constantly being debated, often vociferously: read the newspapers.

> "Tribally-forged instincts"?!
> And whatever our *instincts* may (or may not) be, this tells me *nothing*
> about what our morality should be - unless you believe that it should
> simply follow our instincts, which, in other words, denies a place for
> morality at all.

My point was that our instincts were forged over a long period, at least
several hundred thousand years, when we lived in small tribes, and most
of our instincts are tuned to that way of life.  For instance, wiping out
the weaker neighboring tribe if they were crowding you too much was
probably a basic survival strategy: otherwise the territory couldn't
provide enough food, and both tribes would slowly die.  But in the last
paltry few thousand years, we've invented agricultural megacivilizations,
working around our no-longer-quite-appropriate tribal instincts.  Part of
the solution was a behavior-altering conditioning process, modulating our
instinctive behavior.  Under it, some things are RIGHT!, and some things
are WRONG!, and if we're taught well enough, especially at an early age, we
become fine upstanding MORAL citizens.  Exactly what's right and what's
wrong keeps changing, and is the subject of continuing debate.
Some religious (and later humanist) windbags come along and claim
that these rules are somehow fundamental to the cosmos.  That's
their debating strategy.

> True perhaps for some naive Utilitarian positions, but hardly the case for
> Kantian ethics.
 ...
>  You've been reading too much sociobiology.  

I'll take E.O. Wilson over your favorite dead humanist windbag any day.
Nobody has all the answers, but at least the scientific approach has an
effective way of checking opinions.

> Well, Hans, the solution we use with *people* now is simply to *not
> produce them*.  This is the suggested method for dealing with the
> problems of the Third World, rather than letting people
> overpopulate and then starve.  I see no reason why the production of
> artificial people should not be governed by the same moral code.

I see many reasons. In a world where the computational resources are
sufficient to create billions of minds with a single command, many
really wonderful things can be done, if the space isn't clogged up with
worthless entities.  Imagine solving a large problem by making many
mental clones of oneself, each modified so it would be obsessed with
working on its own unique part of the search space, and programmed to
simply stop, and relinquish its storage space, when its job was done.
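
In present-day terms, a minimal sketch of that partition-and-terminate
pattern might look like the Python below; the problem, the predicate
is_solution, and the worker count are placeholder illustrations of mine,
nothing more:

# Toy illustration (mine, not a claim about how such minds would work):
# spawn several workers, give each a disjoint slice of a search space,
# and let each return -- releasing its resources -- when its slice is done.
from concurrent.futures import ProcessPoolExecutor, as_completed

def is_solution(n):
    # Placeholder predicate standing in for "the problem": perfect squares
    # whose decimal digits sum to 13.
    return int(n ** 0.5) ** 2 == n and sum(map(int, str(n))) == 13

def search_slice(start, stop):
    # Each "clone" works only on its own unique part of the search space;
    # when the slice is exhausted it simply stops and its storage is freed.
    return [n for n in range(start, stop) if is_solution(n)]

if __name__ == "__main__":
    space, workers = 1_000_000, 8
    step = space // workers
    with ProcessPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(search_slice, i * step, (i + 1) * step)
                   for i in range(workers)]
        hits = sorted(n for f in as_completed(futures) for n in f.result())
    print(hits[:10])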

Your alternative is simply pointless.

				-- Hans


