From newshub.ccs.yorku.ca!torn!utcsri!rpi!usc!sdd.hp.com!spool.mu.edu!news.nd.edu!bach!jocallag Thu Jul  9 16:20:17 EDT 1992
Article 6401 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!utcsri!rpi!usc!sdd.hp.com!spool.mu.edu!news.nd.edu!bach!jocallag
From: jocallag@bach.helios.nd.edu (john ocallaghan)
Newsgroups: comp.ai.philosophy
Subject: Re: Defining other intelligence out of existence
Message-ID: <1992Jul1.034918.4743@news.nd.edu>
Date: 1 Jul 92 03:49:18 GMT
Article-I.D.: news.1992Jul1.034918.4743
References: <1992Jun30.193051.28317@sequent.com>
Sender: news@news.nd.edu (USENET News System)
Organization: University of Notre Dame, Notre Dame
Lines: 36

In article <1992Jun30.193051.28317@sequent.com> bfish@sequent.com (Brett Fishburne) writes:
>I have followed all kinds of discussions lately both here and on other
>news groups which talk about methods of evaluating artificial
>(or just plain non-human) intelligence.  What I have taken away from these
>discussions is a clear impression that the philosophical community seems
>to be at a loss to define/evaluate intelligence independent of being
>human.  This may seem trivial (or obvious), but, IMHO, it is an important
>observation which deserves some review.
Precisely how do you characterize "intelligence"?  What definition would
you suggest, such that those who are interested can distinguish what you
are speaking about as intelligence, as opposed to sight, or the reaction
of a Venus flytrap to a fly, or gravity for that matter?
>
>The Turing Test is an excellent case in point.  The computer is not
>considered to be intelligent until it is virtually indistinguishable from a human.
>It seems to me, if you are interested in producing a human, this is a valid
>test.  If, however, you are interested in producing *intelligence*, this
>might be considered overkill.  
>
>Is it fair to require that for something to be considered intelligent it 
>must mimic the _most_ intelligent thing we can think of?  Suppose we applied 
>that standard to running.  You can only be a runner if you can run as fast
>as a cheetah, oh, and, by the way, you must run on all fours.  I know this
>is a ludicrous example, but is it really much worse than what we are asking
>of artificial intelligence?
>
>Equally interesting, why set this standard?  Could it possibly be that
>humans cannot deal with the possibility that we are not unique in the
>universe?  Sounds like a certain stance attributed to most religions, not
>philosophical paradigms...
>
>-- Brett
>bfish@sequent.com
>
>The opinions expressed are my own, blah, blah, blah...
