From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!utgpu!cs.utexas.edu!sun-barr!ames!network.ucsd.edu!swrinde!mips!atha!aunro!alberta!kakwa.ucs.ualberta.ca!access.usask.ca!ccu.umanitoba.ca!zirdum Mon Mar  9 18:35:35 EST 1992
Article 4298 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!utgpu!cs.utexas.edu!sun-barr!ames!network.ucsd.edu!swrinde!mips!atha!aunro!alberta!kakwa.ucs.ualberta.ca!access.usask.ca!ccu.umanitoba.ca!zirdum
From: zirdum@ccu.umanitoba.ca (Antun Zirdum)
Newsgroups: comp.ai.philosophy
Subject: Re: Intelligence and Understanding
Message-ID: <1992Mar6.011131.4146@ccu.umanitoba.ca>
Date: 6 Mar 92 01:11:31 GMT
References: <1992Mar1.072408.25643@a.cs.okstate.edu> <1992Mar2.031253.3229@ccu.umanitoba.ca> <1992Mar4.022416.11169@a.cs.okstate.edu>
Organization: University of Manitoba, Winnipeg, Manitoba, Canada
Lines: 76

In article <1992Mar4.022416.11169@a.cs.okstate.edu> onstott@a.cs.okstate.edu (ONSTOTT CHARLES OR) writes:
>In article <1992Mar2.031253.3229@ccu.umanitoba.ca> zirdum@ccu.umanitoba.ca (Antun Zirdum) writes:
>>In article <1992Mar1.072408.25643@a.cs.okstate.edu> onstott@a.cs.okstate.edu (ONSTOTT CHARLES OR) writes:
>>You place an unusual burden on the computer, when you refuse to
>>give it what every human has had! Every human has had the benefit
>>of listening to other speakers listen to the correct use of the
>>language in relation to external inputs! For example, I say
>  No, it's not an unusual burden, because I am assuming that the computer
>has information about the language, as you would maintain, and that both
>humans and computers can be left alone without interlocutors or other
>agents present.  The critical difference is that a human can think about
>these things, indeed even invent problems to solve on their own, without
>other agents present.  I doubt this to be possible on a computer.

Quite presumptuous of you to assume that a computer has no way
of running by itself, with no input from people!
>>
>>that you can say that is not from sense data - say it.)
>  "My mind is feeling rather gloomy today."  Where is the sense data for
>that statement?  Sense data are things like, "Blue patch here now."  How
>do you find 'feeling' in the material world, what is 'gloom' and what is
>a 'mind'? But, of course, you still understand the sentence.
>
Again, quite presumptuous to assume that I understand the question!
I think that you are saying (a) you're feeling sad, (b) you are not
able to think clearly, or (c) you are tired. In any case you are
stating a fact about your body.
	Let us say that our Turing-tested computer recognizes
that its hardware is not functioning properly (random errors),
and it says "Sorry, but I cannot answer questions; my mind is
feeling rather gloomy today"! What is this statement about?
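The scenario above can be sketched in a few lines. This is a hypothetical toy (the class name, threshold, and error-recording method are all my own invention, not anything that exists): a machine that monitors its own hardware faults and, past a threshold, reports on its internal state the same way the "gloomy" remark reports on a body.

```python
class TuringTestedMachine:
    """Toy sketch: a machine whose 'gloomy' remark is about its own hardware."""

    def __init__(self, error_threshold=3):
        self.error_threshold = error_threshold
        self.error_count = 0

    def record_hardware_error(self):
        # Called by a hypothetical self-test (e.g. a failed parity check).
        self.error_count += 1

    def answer(self, question):
        if self.error_count >= self.error_threshold:
            # The statement refers to the machine's own malfunctioning
            # hardware, just as the human's remark refers to their body.
            return ("Sorry, but I cannot answer questions; "
                    "my mind is feeling rather gloomy today.")
        return "Working on it: " + question

machine = TuringTestedMachine()
for _ in range(3):
    machine.record_hardware_error()
print(machine.answer("What is 2 + 2?"))
```

So on this sketch the "gloomy" sentence has a perfectly ordinary referent: the machine's own error count, its "body".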

>>>
>>The understanding of that proposition comes from the fact that
>>it is a tautology. There is nothing to understand, that is the
>>way it is defined!
>  Then you have baffled me as to how you maintain that a computer, which
>rests on these tautologies, can have meaning at all (much less understanding).
>
How does a computer rest on tautologies? That I do not understand.
I think that we have established that computers and
humans get their information in the same way! We
have established that a computer is able to receive
information and to output information in the same
way as a human; the only thing we are arguing
about is what goes on inside a person's head
that is not possible in a computer!
(Remember, I said not possible, ever!)
>
>>>   A computer does not have volition.  A computer does not have volition
>>>   because, even as a system, its behavior is prescribed and thus
>>>   predetermined.
>>>
>>I beg to differ! Its behavior is prescribed in the same way that
>>a human's behavior is prescribed. The human has inputs through senses,
>  I am not sure how you think that human behavior is prescribed in the
>same way as a computer's except by way of the very metaphor we are
>discussing.  This statement, of course, does not mean that human behavior is
>not prescribed at all, but that there are some very essential differences.
>One of those differences is a human's ability to invent situations
>and reactions to those situations quite arbitrarily; something which a
>computer cannot, and will never be able, to do.
>
Sorry, but in my experience there is nothing to prevent computers from
inventing situations as arbitrarily as humans do. I am not saying
that this has been done, just that it is possible!
So again, I come to the conclusion that whatever humans can do,
computers can do just as well!
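As one concrete (if modest) illustration of a machine "inventing problems to solve on its own", here is a hypothetical sketch of my own devising: with no external input, the program generates arbitrary arithmetic problems and then solves them. The function names and the choice of arithmetic are assumptions for illustration only; the philosophical question of whether this counts as genuine invention is exactly what is under dispute.

```python
import random

def invent_problem(rng):
    # Invent an arbitrary problem using only the machine's own randomness,
    # with no interlocutor or external agent present.
    a, b = rng.randint(1, 100), rng.randint(1, 100)
    op = rng.choice(['+', '-', '*'])
    return f"{a} {op} {b}"

def solve(problem):
    # Solve the problem the machine just posed to itself.
    a, op, b = problem.split()
    a, b = int(a), int(b)
    return {'+': a + b, '-': a - b, '*': a * b}[op]

rng = random.Random()  # seeded by the machine alone
for _ in range(3):
    p = invent_problem(rng)
    print(f"{p} = {solve(p)}")
```

Whether such self-posed problems are "arbitrary" in the same sense as human invention is, of course, the very point being argued.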
>
>BCnya,
>  Charles O. Onstott, III
-- 
*****************************************************************
*   AZ    -- zirdum@ccu.umanitoba.ca                            *
*     " The first hundred years are the hardest! " - W. Mizner  *
*****************************************************************


