From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!convex!constellation!a.cs.okstate.edu!onstott Mon Mar  9 18:33:16 EST 1992
Article 4080 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!convex!constellation!a.cs.okstate.edu!onstott
From: onstott@a.cs.okstate.edu (ONSTOTT CHARLES OR)
Subject: Re: Definition of understanding
References: <1992Feb26.190407.5123@organpipe.uug.arizona.edu> <1992Feb27.025740.8034@a.cs.okstate.edu> <1992Feb27.041137.29433@mp.cs.niu.edu>
Message-ID: <1992Feb27.192839.5346@a.cs.okstate.edu>
Organization: Oklahoma State University, Computer Science, Stillwater
Date: Thu, 27 Feb 92 19:28:39 GMT

In article <1992Feb27.041137.29433@mp.cs.niu.edu> rickert@mp.cs.niu.edu (Neil Rickert) writes:
>In article <1992Feb27.025740.8034@a.cs.okstate.edu> onstott@a.cs.okstate.edu (ONSTOTT CHARLES OR) writes:
>>I would suggest that unless the system can translate the language from
>>its original tongue to that of Chinese or be capable of generating its
>>own original statements free of context from the ones being presented
>>to it, the system has altogether failed to understand anything at all.
>
>  Remember that you are talking about what is supposed to be a programming
>system designed to handle Chinese.  As such, its original tongue IS
>CHINESE.  It matters not one iota whether the steps are implemented by
>someone whose native language is English or Martian, or MC68030 or 80486
>or Cray.  The system itself has only one language.  If you want the system
>to be able to translate English into Chinese you had better change the
>Chinese room so that the program is also required to handle English.  But
>without changing the definition, you cannot hope to get English out of
>the system.  The system is different from the individual implementing it.
>
  Yes, but even if I grant you the above: can the system generate a new
statement without some verbal (or language-oriented) stimulus?  In other
words, can the system intend to say something original, something that is
not simply a matching of inputs to outputs according to a set of rules?

If it cannot generate statements free of context, I doubt it can be
thought of as understanding.  Further, can the system internally generate
a problem and use that language to analyze the problem and produce outputs,
whether or not another agent is nearby?  In that case, we would say that the
machine is probably thinking.  But it seems to me that the system can do
neither: it cannot generate outputs without expected inputs, and it cannot
identify a problem to solve using the language in a unique and creative way
sans another agent present.
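The "matching of inputs to outputs, according to a set of rules" can be
sketched as a trivial lookup table (my own illustrative example; the rule
entries and function name are invented, not from any actual Chinese-room
program):

```python
# A pure stimulus-response system in the spirit of the Chinese room:
# a fixed rule table mapping input strings to output strings.  By
# construction it produces nothing in the absence of a stimulus --
# there is no mechanism for spontaneous, self-initiated output.

RULES = {
    "ni hao":  "ni hao ma?",   # greeting -> canned reply
    "xie xie": "bu ke qi",     # thanks   -> canned reply
}

def respond(stimulus):
    """Return the rule-table output for a stimulus, or a default."""
    return RULES.get(stimulus, "wo bu dong")

# The system only ever maps given inputs to pre-arranged outputs;
# if respond() is never called, nothing is ever generated.
print(respond("ni hao"))
```

This is exactly the kind of system the paragraph above doubts: it can
answer, but it cannot originate.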

BCnya,
  Charles O. Onstott, III

------------------------------------------------------------------------
Charles O. Onstott, III                  P.O. Box 2386
Undergraduate in Philosophy              Stillwater, Ok  74076
Oklahoma State University                onstott@a.cs.okstate.edu


"The most abstract system of philosophy is, in its method and purpose, 
nothing more than an extremely ingenious combination of natural sounds."
                                              -- Carl G. Jung
-----------------------------------------------------------------------
