From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!decwrl!access.usask.ca!ccu.umanitoba.ca!zirdum Tue May 12 15:50:10 EDT 1992
Article 5531 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!decwrl!access.usask.ca!ccu.umanitoba.ca!zirdum
From: zirdum@ccu.umanitoba.ca (Antun Zirdum)
Subject: Re: Intelligence, awareness... oh no, back to the Turing Test!
Message-ID: <1992May10.025610.4716@ccu.umanitoba.ca>
Organization: University of Manitoba, Winnipeg, Manitoba, Canada
References: <1992Apr28.185141.29465@spss.com> <1992May3.201239.26750@ccu.umanitoba.ca> <1992May04.172754.4328@spss.com>
Date: Sun, 10 May 1992 02:56:10 GMT
Lines: 53

In article <1992May04.172754.4328@spss.com> markrose@spss.com (Mark Rosenfelder) writes:
>In article <1992May3.201239.26750@ccu.umanitoba.ca> zirdum@ccu.umanitoba.ca 
>(Antun Zirdum) writes:
[deleted]
>
>OK, it seems that you define "intelligence" the way Wittgenstein defines
>"game"-- as a conjunction of components none of which is necessary (as you
>put it, the cakes may have nothing in common).  I'm not sure I agree,
>but I don't see anything unreasonable in this position.

I am not sure that I meant that NONE of the components are
necessary; however, I think that many of the components are
substitutable without making a great difference.
>
>>I must argue here that even with the expanded
>>definition of behaviour that I present, it is
>>still not as meaningless as Searle's causal
>>powers. I am simply stating that anything that
>>others can know about us is behaviour, is there
>>something wrong with that? 
>
>Yes, you've just demolished your version of the Turing Test!  You defined
>behavior as including not just actions but also states, such as having
>two or four legs.  Now, to duplicate human behavior, a computer must
>duplicate all the states a human has: it must have two legs, have a body
>made of protoplasm, etc.  Well, now no computer can pass the test!

You will notice that I avoid the term "human behaviour"; it
seems obvious to me that a machine cannot duplicate human
behaviour without *being* human. I do, however, believe that
"human behaviour" is not equivalent to "intelligent behaviour"
(with awareness thrown in). Thus, to pass the test of being
intelligent and aware, the machine does not have to have
the states of a human body; just as quadriplegics can pass
the TT, so can a machine.
>
>Now you're going to want to say that some of these things are not relevant
>for intelligence.  OK, but in that case we don't judge intelligence just by
>"behavior", but by *certain kinds* of behavior.  What are those kinds?

Your assumption that I was talking about "human behaviour"
led you astray. I was talking about intelligent behaviour,
so whatever actions qualify as intelligent are what count.
That is why the machine does not need to duplicate the
states that make us human; it only needs the states that
make us aware/intelligent.


-- 
*****************************************************************
*   AZ    -- zirdum@ccu.umanitoba.ca                            *
*     " The first hundred years are the hardest! " - W. Mizner  *
*****************************************************************