From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!yale.edu!qt.cs.utexas.edu!cs.utexas.edu!convex!constellation!a.cs.okstate.edu!onstott Mon Mar  9 18:34:55 EST 1992
Article 4234 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!yale.edu!qt.cs.utexas.edu!cs.utexas.edu!convex!constellation!a.cs.okstate.edu!onstott
From: onstott@a.cs.okstate.edu (ONSTOTT CHARLES OR)
Subject: Re: Intelligence and Understanding
References: <1992Feb29.080019.9272@ccu.umanitoba.ca> <1992Mar1.072408.25643@a.cs.okstate.edu> <1992Mar2.031253.3229@ccu.umanitoba.ca>
Message-ID: <1992Mar4.022416.11169@a.cs.okstate.edu>
Organization: Oklahoma State University, Computer Science, Stillwater
Date: Wed, 4 Mar 92 02:24:16 GMT
Lines: 103

In article <1992Mar2.031253.3229@ccu.umanitoba.ca> zirdum@ccu.umanitoba.ca (Antun Zirdum) writes:
>In article <1992Mar1.072408.25643@a.cs.okstate.edu> onstott@a.cs.okstate.edu (ONSTOTT CHARLES OR) writes:
>>
>>
>>
>>First Question:
>>  Can a computer, without another agent present, utilize the language
>>  of Chinese in an original way?  Example, in the Chinese room example
>>  heretofore made to correspond to the language?  In short, can the system
>>
>You place an unusual burden on the computer, when you refuse to
>give it what every human has had! Every human has had the benefit
>of listening to other speakers listen to the correct use of the
>language in relation to external inputs! For example, I say
  No, it's not an unusual burden, because I am assuming that the computer
has information about the language, as you would maintain, and that both
humans and computers can be left alone without interlocutors or other
agents present.  The critical difference is that a human can think about
these things, indeed even invent problems to solve on their own, without
other agents present.  I doubt this is possible on a computer.
>
>> Second Question:
>>  of truth (or correspondence)?  In other words, will the computer be
>>  capable of saying anything about the environment other than what we
>>  might expect from a "sense-data"(from Russell) response?
>>
>What can you say about the environment that is not based on sense data?
>Anything that you say (that will make sense to me) will be based
>on input by way of your senses. (but please, if there is something
>that you can say that is not from sense data - say it.)
  "My mind is feeling rather gloomy today."  Where is the sense data for
that statement?  Sense data are things like "blue patch here now."  How
do you find 'feeling' in the material world?  What is 'gloom', and what is
a 'mind'?  But, of course, you still understand the sentence.

>>
>>  First Proposition:
>>   Truth can be obtained without understanding.  
>>    Ex: 2+2=4=2+2 is TRUE however the operators + and = do not themselves
>>    understand.
>
>The operators =/+ do not obtain anything, they are merely used by
>humans to state a tautology. (A defined truth!) The human must
>however, understand the use of these symbols!
>>
>>  Second Proposition:
>>   Understanding is a system relationship; but a particular kind of system.
>>   For example, as can be derived from the first proposition, the truth
>>   and the understanding to go with it require that which deems
>>   2+2=4=2+2 to be TRUE and Meaningful.  The understanding of that proposition
>>   as True comes from the fact that True is itself meaningful.
>
>The understanding of that proposition comes from the fact that
>it is a tautology. There is nothing to understand, that is the
>way it is defined!
  Then you have baffled me as to how you maintain that a computer, which
rests on these tautologies, can have meaning at all (much less understanding).


>
>>
>>  Third Proposition:
>>   Meaningfulness comes from volition.
>>   The system must have volition--in turn which means that it is
>>   dynamic and creative.
>>
>>  Fourth Proposition:
>>   A computer does not have volition.  A computer does not have volition
>>   because, even as a system, its behavior is prescribed and thus
>>   predetermined.  
>>
>I beg to differ! Its behavior is prescribed in the same way that
>a human's behavior is prescribed. The human has inputs thru senses,
  I am not sure how you think that human behavior is prescribed in the
same way as a computer's except by way of the very metaphor we are
discussing.  This statement, of course, does not mean that human behavior is
not prescribed at all, but that there are some very essential differences.
One of those differences is the ability of a human to invent situations
and reactions to those situations quite arbitrarily; something which a
computer cannot do, and will never be able to do.

>
>>  Fifth Proposition:
>>
>Thus, without arguing that humans have no volition, I have argued
>that *whatever* humans have, so do computers! (quite successfully
>I might add. :-)
  I beg to differ.  See above.  :->

BCnya,
  Charles O. Onstott, III

------------------------------------------------------------------------
Charles O. Onstott, III                  P.O. Box 2386
Undergraduate in Philosophy              Stillwater, Ok  74076
Oklahoma State University                onstott@a.cs.okstate.edu


"The most abstract system of philosophy is, in its method and purpose, 
nothing more than an extremely ingenious combination of natural sounds."
                                              -- Carl G. Jung
-----------------------------------------------------------------------
