From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!mips!atha!aunro!alberta!kakwa.ucs.ualberta.ca!access.usask.ca!ccu.umanitoba.ca!zirdum Mon Mar  9 18:34:26 EST 1992
Article 4186 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!think.com!mips!atha!aunro!alberta!kakwa.ucs.ualberta.ca!access.usask.ca!ccu.umanitoba.ca!zirdum
From: zirdum@ccu.umanitoba.ca (Antun Zirdum)
Newsgroups: comp.ai.philosophy
Subject: Re: Intelligence and Understanding
Message-ID: <1992Mar2.031253.3229@ccu.umanitoba.ca>
Date: 2 Mar 92 03:12:53 GMT
References: <1992Feb29.080019.9272@ccu.umanitoba.ca> <1992Mar1.072408.25643@a.cs.okstate.edu>
Organization: University of Manitoba, Winnipeg, Manitoba, Canada
Lines: 124

In article <1992Mar1.072408.25643@a.cs.okstate.edu> onstott@a.cs.okstate.edu (ONSTOTT CHARLES OR) writes:
>
>
>
>First Question:
>  Can a computer, without another agent present, utilize the language
>  of Chinese in an original way?  For example, in the Chinese room
>  the guy in the room only gets written "squiggles" as inputs and 
>  sends out corresponding "squaggles" as outputs.  According to your
>  stance, you claim that this is sufficient for understanding.  Therefore,
>  the question becomes: can the computer take in other inputs, such as
>  visual, auditory (not language sounds but, say, background noises),
>  smells, etc. (without debating whether this can be done at all; for
>  the purposes of this question we will assume it can), convert these
>  inputs into a form that is matchable to the Chinese language, and produce
>  outputs in such a way that *if* a Chinese person immediately entered the
>  room and listened to commentary made by the computer he could understand,
>  *provided* that no mapping of these sorts of sensory inputs had been
>  heretofore made to correspond to the language?  In short, can the system
>
You place an unusual burden on the computer when you refuse to
give it what every human has had! Every human has had the benefit
of listening to other speakers, of hearing the correct use of the
language in relation to external inputs. For example, if I say
"it's cold," what am I talking about? Is the air cold? Is something
I am holding cold? Is your attitude cold? We humans only make the
proper connections between these inputs and our language by hearing
and experiencing these conditions. Therefore, your expectation
that the computer be able to take sensory inputs and convert them
to Chinese is unrealistic (for human speakers and computers alike!)
If it had the benefit of the opportunity to 'learn' these symbols, by,
say, placing the computer (& program) into a baby's skull, and
people reacted to it as they do to a baby (and taught it the same),
then I have no doubt that IT would be able to apply its inputs
creatively in the language(s).

> Second Question:
>  of truth (or correspondence)?  In other words, will the computer be
>  capable of saying anything about the environment other than what we
>  might expect from a "sense-data" (from Russell) response?
>
What can you say about the environment that is not based on sense data?
Anything you say (that will make sense to me) will be based
on input by way of your senses. (But please, if there is something
you can say that is not from sense data, do say it.)

> Third Question: 
>  It could be argued that while the above answers may be "no" they do
>  not constitute understanding because, in fact, a system of "truth" may
>  be false.   The question becomes, "what is truth?"

Does an understanding of a 'false' system in any way diminish understanding?
I believe it does not, but since I do not know what you mean by
understanding, I will not pretend to know the definitive answer.
Truth, now, is a different matter; humans use the word "truth"
in at least three different ways:
1) A system is true: it can be defined by itself (tautology).
2) A fact about the universe: "I live in Winnipeg." Objective
truth. (This use of the word is not provable!)
3) Subjective truth: "This painting is true art." This use of
the word does not state anything useful.

As always, look to how the word is used; do not look for *A* single
definitive meaning. (If there are any other uses, someone please
fill them in for me.)

>
>  First Proposition:
>   Truth can be obtained without understanding.  
>    Ex: 2+2=4=2+2 is TRUE however the operators + and = do not themselves
>    understand.

The operators =/+ do not obtain anything; they are merely used by
humans to state a tautology (a defined truth!). The human must,
however, understand the use of these symbols!
>
>  Second Proposition:
>   Understanding is a system relationship; but a particular kind of system.
>   For example, as can be derived from the first proposition, the truth
>   and the understanding to go with it requires that of which deems 
>   2+2=4=2+2 to be TRUE and Meaningful.  The understanding of that proposition
>   as True comes from the fact that True is itself meaningful.

The understanding of that proposition comes from the fact that
it is a tautology. There is nothing to understand; that is the
way it is defined!
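
To make the "defined truth" point concrete, here is a toy sketch (entirely
my own illustration; the names `succ` and `add` are invented for it) in
which 2+2=4 falls out of the definitions of the symbols alone, Peano-style:

```python
# Toy Peano-style arithmetic: "2 + 2 = 4" is true purely by how the
# symbols are defined, not by any fact discovered about the world.

def succ(n):
    """Successor function: stands in for Peano's S()."""
    return n + 1

ZERO = 0
TWO = succ(succ(ZERO))                 # 2 is defined as S(S(0))
FOUR = succ(succ(succ(succ(ZERO))))    # 4 is defined as S(S(S(S(0))))

def add(a, b):
    """Addition defined recursively: a + 0 = a; a + S(b) = S(a + b)."""
    return a if b == ZERO else succ(add(a, b - 1))

print(add(TWO, TWO) == FOUR)  # True, by unfolding the definitions
```

Unfolding by hand: 2+2 = S(2+1) = S(S(2+0)) = S(S(2)) = 4. Nothing to
"understand" beyond the definitions themselves.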

>
>  Third Proposition:
>   Meaningfulness comes from volition.
>   The system must have volition--in turn which means that it is
>   dynamic and creative.
>
>  Fourth Proposition:
>   A computer does not have volition.  A computer does not have volition
>   because, even as a system, its behavior is presecribed and thus
>   predetermined.  
>
I beg to differ! Its behavior is prescribed in the same way that
a human's behavior is prescribed. The human has inputs thru senses;
so does the computer. The human bases his decisions on a
combination of memory and sense-data 'facts'; so does the
computer.
	In other words, the computer is not a self-contained
system (and thus not predetermined in that way). It gets outside
influence thru its senses, so you (or anything else) could not
determine what its behaviour will be based on any state that
it is currently in! (Read that again; that is a powerful statement!)

This means that a computer is not predetermined, in exactly the
same way that a human is not predetermined!
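
The point that current internal state alone does not fix future behaviour
can be sketched in a few lines (my own toy example; `agent`, the memory
list, and the input strings are all invented for it):

```python
# Two "runs" of the same program, starting from identical internal
# state, diverge as soon as the outside input (sense data) differs.
# So the current state alone does not determine future behaviour.

def agent(memory, sense_input):
    """Decision = f(memory, sense data), as argued above."""
    memory = memory + [sense_input]          # experience accumulates
    action = "seek warmth" if sense_input == "cold" else "carry on"
    return memory, action

start_state = ["prior experience"]           # same state for both runs
_, a1 = agent(list(start_state), "cold")
_, a2 = agent(list(start_state), "warm")
print(a1, a2)  # identical starting states, different actions
```

Any prediction made from the starting state alone would have to cover
both runs, which is exactly what it cannot do.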

>  Fifth Proposition:
>
Thus, without arguing that humans have no volition, I have argued
that *whatever* humans have, so do computers! (Quite successfully,
I might add. :-)

>  BCnya,
>    Charles O. Onstott, III
-- 
*****************************************************************
*   AZ    -- zirdum@ccu.umanitoba.ca                            *
*     " The first hundred years are the hardest! " - W. Mizner  *
*****************************************************************