Newsgroups: comp.ai
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!godot.cc.duq.edu!newsgate.duke.edu!news.mathworks.com!newsfeed.internetmci.com!uwm.edu!fnnews.fnal.gov!gw1.att.com!nntphub.cb.att.com!bigtop!news
From: Carlos Casize <rune@dr.att.com>
Subject: Re: artificial intellige
Content-Type: text/plain; charset=us-ascii
Message-ID: <31AC92A7.63D1@dr.att.com>
Sender: news@bigtop.dr.att.com (Netnews Administration Login)
Nntp-Posting-Host: walnut
Content-Transfer-Encoding: 7bit
Organization: RMACS
References: <8C0D385.01CA00376E.uuout@almac.co.uk> <4nls29$mhn@newsbf02.news.aol.com> <31A637DA.7983@dr.att.com> <4o7itq$ih7@globe.indirect.com>
Mime-Version: 1.0
Date: Wed, 29 May 1996 18:08:39 GMT
X-Mailer: Mozilla 2.0 (Win16; I)
Lines: 30

Marty Stoneman wrote:

 
> What if I program my intelligent entity to learn roughly in the ways that
> humans learn, so that it will increasingly do "the smart thing to do"?
> Even if there are some "genetics" that are "hard-wired", there will soon
> be much more, no?  And don't all healthy human babies learn new behaviors
> and goals by "copying" those around them?  Can't I program my entity to
> do the same?


I would say, without a doubt, that this is the ideal of AI research, but 
for reasons either technological (we don't have the hardware) or because 
of the approaches that have been used (we don't have the software), 
nothing even close to such a learning program has been built. 

Some systems can 'copy' behavior, or patterns.  However, none can apply 
these 'learned behaviors' to real situations with anything approaching 
intelligence.  That, IMHO, is the crux of the problem. 

So how can a machine be programmed to select a behavior with the acuity 
of an ant, much less that of a human child?

Once that question is answered, it will be time to move on to the ethical 
question: should such an intelligent entity be created?  It will, after 
all, be smarter, stronger, and generally more fit than humankind.  Is such 
an entity sentient?  It could be deemed such, but then some would argue 
that it is not. 

R.
