From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!news.cs.indiana.edu!sdd.hp.com!think.com!news!moravec Tue May 12 15:50:14 EDT 1992
Article 5537 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!news.cs.indiana.edu!sdd.hp.com!think.com!news!moravec
From: moravec@turing.think.com (Hans Moravec)
Newsgroups: comp.ai.philosophy
Subject: Re: AI successes
Date: 10 May 92 14:11:06
Organization: Thinking Machines Corporation, Cambridge MA, USA
Lines: 67
Message-ID: <MORAVEC.92May10141106@turing.think.com>
References: <zlsiida.205@fs1.mcc.ac.uk> <ufa1aINNco5@early-bird.think.com>
	<1992May9.185900.11461@organpipe.uug.arizona.edu>
NNTP-Posting-Host: turing.think.com
In-reply-to: bill@NSMA.AriZonA.EdU's message of 9 May 92 18:59:00 GMT


 bill@NSMA.AriZonA.EdU (Bill Skaggs) writes:

>   First, evolving through foresight is probably a bad idea.  Both
> mother nature and human designers avoid creating machines that can
> change their bottom level structure, because the consequences are
> unpredictable and usually bad.  (Hofstadter discusses this in
> GEB.)  The problem is that you can't know what will happen when
> you make changes in the motivational structure of something as
> complicated as yourself.

Foresight is never perfect, especially if the thing being foreseen is more
complicated than oneself.  But there are many examples of entities that
evolve by foresight, for instance companies that vote to shed parts,
acquire new ones, engage in product developments, etc.  Countries are
another example: the US constitution and the continuing flux of laws
are an attempt to shape the corporate entity in a planned way.  Of course,
they work only imperfectly.  But the percentage success rate is higher than
blind trial and error.  And if the planning screws things up completely -
witness the Marxist states - then there's always the bottom line - the
fallback strategy - of Darwinian extinction.

	 I imagine a diverse future of entities that start out as
present day corporations, but become fully automated (management,
production, research, marketing - everything), and move out into space
where there's more room to grow.  They reproduce and compete for space,
energy, matter and knowledge among themselves.  Some screw up and go out of
business (i.e. die).  Others continue, constantly reshaping themselves
to keep up with the competition.  Many are guided by a kind of constitution,
which states what changes can be made easily, and which others should not
be attempted.  These constitutions act as higher-order genomes
that evolve by success or failure of their adherents - i.e. by Darwinian
means.  But below the broad guidelines of the company constitutions,
management makes day-to-day decisions, including those about the structure
and business of the company, by anticipating the future and planning for it.

>   Second, it sounds like you think the replacement of humans
> by machines would be a good thing.  But doesn't this contradict
> the theory of morality you advanced in a recent post?  After
> all, it can't possibly be collectively beneficial for humanity to
> be replaced by machines.
>       -- Bill 


It's a matter of tribal identification.
It is good for *us*, if, as I do, you view these creations of ours as
our children (rather than as some invading foreign tribe).  We humans have
two kinds of inheritance passed from generation to generation: one in our
DNA, the other cultural, passed from mind to mind, book to book and now
computer to computer.  The cultural part of us has been growing and
evolving very fast recently, and in terms of both quantity and capability is
in the process of eclipsing the old biology.
When the process is a little farther along, the cultural part of us will be
able to run the show on its own -- in fact it will be able to do it far
more effectively, unencumbered by the Rube Goldberg mechanisms of blind
biology.  Intelligent machines are the most obvious manifestation
of our cultural selves in pure form.  Biological evolution is important
historically because it created the substrate for our minds, but our
minds are about to be liberated from that substrate, and be better off
for it - both in terms of our long-term survival, and our potential for a
spectacular future.

  Sayonara, DNA!   Don't worry, we'll remember you in our libraries.

				-- Hans
