From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!ccu.umanitoba.ca!zirdum Thu Apr 30 15:23:04 EDT 1992
Article 5296 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!ccu.umanitoba.ca!zirdum
From: zirdum@ccu.umanitoba.ca (Antun Zirdum)
Newsgroups: comp.ai.philosophy
Subject: Re: Intelligence, awareness... oh no, back to the Turing Test!
Message-ID: <1992Apr28.062159.1931@ccu.umanitoba.ca>
Date: 28 Apr 92 06:21:59 GMT
References: <1992Apr24.174822.29402@spss.com> <1992Apr27.083621.9441@ccu.umanitoba.ca> <1992Apr27.173029.36491@spss.com>
Organization: University of Manitoba, Winnipeg, Manitoba, Canada
Lines: 82

In article <1992Apr27.173029.36491@spss.com> markrose@spss.com (Mark Rosenfelder) writes:
>In article <1992Apr27.083621.9441@ccu.umanitoba.ca> zirdum@ccu.umanitoba.ca 
>(Antun Zirdum) writes (quoting me):
>
>If you are right, that's bad news for AI.  If our analysis of intelligence
>is limited to determining by a behavioral test whether it exists, there's
>no chance of programming it.
>
This does not follow! I see everything as behaviour; you would
be hard pressed to come up with an example of something accessible
to the world that is not behaviour. (By this I exclude subjective
experience - which, again, I consider to be a special case of
behaviour!)

>Again, if humans have an ability to judge whether intelligence exists, they
>must be using some criteria.  Is it really such a mystery what those
>criteria are?  How about things like memory, abstract thought, language use,
>real-world knowledge, creativity, adaptability, problem-solving, model-
>building, goal-setting, consciousness, judgment?
>
Most of these, I think you would agree, are behaviour; the rest,
such as consciousness, you will presumably argue are not even
related to behaviour. But a property such as having four legs can
easily be rephrased as 'behaving as having four legs' - anything
else we can say about any physical object can be rephrased in the
same way. The physical world behaves! Behaviour does not have to
involve an action - the expenditure of energy, etc. - behaviour
can be simply existing.
	The mental/subjective 'world' does pose a real problem
for strict behaviorism, but I think this is one of those
'looking for a non-existent black cat in a dark room' problems.
By this I do not mean that subjective experience does not exist;
rather, I think we are going about the search wrongly! It is
clear to me that no matter how many brains we take apart, no
matter how many brain simulations are run, no matter how much
introspection we do - WE WILL NEVER SEE ANY SUBJECTIVE EXPERIENCE
IN THE OUTSIDE PHYSICAL WORLD! (This is what the Searlites were
trying to do.) It also seems clear that subjective experience
cannot exist without physical objects to experience. If we
put one and one together, we can see that subjective
experience depends on behaviour. It also seems to be the case
that subjective experience is the result of behaviour (here
I am using the expanded definition of behaviour, not the
everyday usage). By the same token, there is no test you
can perform that is not a test of behaviour!

>True, we could use the Turing Test to gain indirect evidence for some of
>these things.  But we have other means of investigation, too-- for brains,
>introspection and neurology; for computers, inspection of the algorithm.
>Why should we not use them?
>
>>Your dog comparison is a dog, to say the least! By deferring to
>>the turing test, we define a dog as something that belongs to
>>a certain genus, and has four legs, and barks, and....
>
>What do you mean by "deferring to the Turing Test" here?  If you mean
>that you use behavior to decide what's a dog or not, your own list
>betrays you: having four legs is not behavior!  
>
>In fact we regularly use non-behavioral criteria in assigning things
>to categories.  This offers no support for the Turing Test!
>
>But you're missing my point.  As your own remarks show, "dog" is not a
>semantic primitive; it is composed of underlying defining characteristics.
>My point was that treating "dog" as a primitive would be absurd; and
>that by analogy treating "intelligence" as a primitive is just as foolish.

You are of course correct that 'dog' may not be a semantic primitive,
but in many cases it is treated as if it were. (The same goes for
intelligence.) So in many cases when we see a 'dog' we see a
semantic primitive; it may be that different parts of our brains
take the dog 'apart', but to the semantic primitive we call
'ourselves' it appears as a single thing. Thus point of view is
very important when looking for something (one need only look at
the physics of EM radiation to confirm this), and this is
especially true when we are looking for intelligence/awareness -
the way we look can determine whether we are successful or not.

-- 
*****************************************************************
*   AZ    -- zirdum@ccu.umanitoba.ca                            *
*     " The first hundred years are the hardest! " - W. Mizner  *
*****************************************************************
