From newshub.ccs.yorku.ca!ists!torn!utgpu!cs.utexas.edu!uunet!tdat!swf Tue Jun 23 13:21:17 EDT 1992
Article 6318 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!torn!utgpu!cs.utexas.edu!uunet!tdat!swf
From: swf@teradata.com (Stanley Friesen)
Newsgroups: comp.ai.philosophy
Subject: Re: The Turing Test is not a Trick
Message-ID: <502@tdat.teradata.COM>
Date: 19 Jun 92 00:20:22 GMT
References: <60807@aurs01.UUCP> <1992Jun12.190924.36762@spss.com> <491@tdat.teradata.COM> <1992Jun18.164543.42825@spss.com>
Sender: news@tdat.teradata.COM
Reply-To: swf@tdat.teradata.com (Stanley Friesen)
Organization: NCR Teradata Database Business Unit
Lines: 86

Well, actually, it looks like we are not that far apart; what remains
is to thrash out the details.

In article <1992Jun18.164543.42825@spss.com> markrose@spss.com (Mark Rosenfelder) writes:
|In article <491@tdat.teradata.COM> swf@tdat.teradata.com (Stanley Friesen) 
|writes (quoting me):
|>Still too vague: does E.T. have to have eyes, or will sonar do, or neither?
|
|The senses can be as off-the-wall as you like.  However, I doubt that 
|intelligence could evolve in a creature without complex sense perception, 
|and I think it's likely (or at least can't be ruled out a priori) that
|complex sensory processing is a part of intelligence.

O.K., but just what *is* the actual criterion to be used?
How do we judge the presence of 'complex sensory processing'?  What measure
do we use to determine 'complexity'?

Also, this touches on the *one* significant difference between an evolved
and a manufactured intelligence.  An evolved one must have arrived at its
current configuration via a series of accessible intermediates; this
restriction does *not* necessarily apply to a manufactured one.  Thus
it seems quite possible that some of the restrictions on 'natural'
intelligences may not apply to 'artificial' ones.

So, is there any reason other than evolutionary accessibility that requires
that complex sensory processing be *integral* to intelligence?

|>Why should ET physiology be capable of anything resembling human intonation?
|
|Looking at it another way, how can you look at human verbal communication,
|which is a complex of words, rules both strict and flexible, intonation ...
|
|I'd be quite surprised if alien communication were significantly less
|sophisticated and multi-channelled.  

I can agree with this principle, but I feel that specifying intonation
per se is too anthropocentric.  How about changing it to something
more general, like 'uses multi-mode communication'?

|>Again, we need to be more specific, and try to make sure the required non-
|>verbal behavior is actually relevant to 'intelligence'.
|
|Just to give one example, watching an alien successfully repair its broken 
|space scooter would give you good prima facie evidence for its intelligence,
|even if it never uttered a word (besides a photic obscenity or two).

Hmm, maybe, maybe not.  I am not sure that such behavior is beyond
a much less 'intelligent' being than a human.  In fact, it might even be
possible to train a chimpanzee to do simple mechanical repairs.

I think it would depend on just what type of *adaptive* behavior were
required for the particular repair needed.  (That is, was it a 'simple'
or a 'difficult' repair?)

This is why the test needs to be specific and explicit about what criteria
are to be used.

This really looks like a job for net-brain-storming.  Any ideas anyone?
How should we specify this to make the test actually usable?

|>These are good (especially since I believe intelligence evolved in humans
|>largely due to the requirements of social interactions).
|
|Yes, I think this point gets lost sometimes in discussions of the Turing Test.
|To me it seems only reasonable, if intelligence evolved at least in part
|to facilitate social interaction, to search for intelligence by observing
|social interactions.  The Turing Test, which some folks have described as
|simulating a hermit in a cave, doesn't strike me as a good way to do this.

Well, actually a conversation *is* a social interaction; it is just a rather
specific, and narrow, one.

Would humor be a universal among intelligent beings?  Or is it a particular
higher-primate adaptation for dealing with conceptual dissonance?

How does one incorporate *planning* in an 'intelligence' test?
(This being one of the most important social functions of language).

How about 'friendship' and 'bonding' processes?  How would these apply
to a computer?  What form might they take in such an intelligence?
[Note, I am *not* talking about 'implementation', I am talking about
"appearance"].
-- 
sarima@teradata.com			(formerly tdatirv!sarima)
  or
Stanley.Friesen@ElSegundoCA.ncr.com
