From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!moe.ksu.ksu.edu!kuhub.cc.ukans.edu!spssig.spss.com!markrose Tue May 12 15:48:56 EDT 1992
Article 5399 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!moe.ksu.ksu.edu!kuhub.cc.ukans.edu!spssig.spss.com!markrose
Newsgroups: comp.ai.philosophy
Subject: Re: Intelligence, awareness... oh no, back to the Turing Test!
Message-ID: <1992May04.172754.4328@spss.com>
From: markrose@spss.com (Mark Rosenfelder)
Date: Mon, 04 May 1992 17:27:54 GMT
References: <1992Apr28.062159.1931@ccu.umanitoba.ca> <1992Apr28.185141.29465@spss.com> <1992May3.201239.26750@ccu.umanitoba.ca>
Organization: SPSS Inc.
Nntp-Posting-Host: spssrs7.spss.com
Lines: 34

In article <1992May3.201239.26750@ccu.umanitoba.ca> zirdum@ccu.umanitoba.ca 
(Antun Zirdum) writes:
>I was trying to make the point that if you take it
>apart into components, what you have left is not
>intelligence. Sure, you may have memory, recognition,
>etc., but when can you say that you have intelligence?
>So in one sense it is a primitive; I do not think that
>we will ever see a recipe for intelligence. As I see it,
>it is a wide, overlapping set of ingredients. Something
>like a cake: even though there may be many types of cakes
>that have nothing in common ingredient-wise, they are all
>cakes.

OK, it seems that you define "intelligence" the way Wittgenstein defines
"game" -- as a family-resemblance cluster of components, none of which is
necessary (as you put it, the cakes may have nothing in common).  I'm not
sure I agree, but I don't see anything unreasonable in this position.

>I must argue here that even with the expanded
>definition of behaviour that I present, it is
>still not as meaningless as Searle's causal
>powers. I am simply stating that anything that
>others can know about us is behaviour; is there
>something wrong with that?

Yes, and with that you've just demolished your version of the Turing Test!
You defined behavior as including not just actions but also states, such as
having two or four legs.  Now, to duplicate human behavior, a computer must
duplicate all the states a human has: it must have two legs, have a body
made of protoplasm, etc.  Well, now no computer can pass the test!

Now you're going to want to say that some of these things are not relevant
for intelligence.  OK, but in that case we don't judge intelligence just by
"behavior", but by *certain kinds* of behavior.  What are those kinds?
