From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!neat.cs.toronto.edu!maione Tue May 12 15:48:59 EDT 1992
Article 5404 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!neat.cs.toronto.edu!maione
Newsgroups: comp.ai.philosophy
From: maione@cs.toronto.edu (Ian Christopher Maione)
Subject: Comments on Searle
Message-ID: <92May4.231849edt.47880@neat.cs.toronto.edu>
Organization: Department of Computer Science, University of Toronto
Distribution: na
Date: 5 May 92 03:19:10 GMT
Lines: 73


 In article 6348, Bill Skaggs writes:

>  Searle's "Chinese Room" argument contains some superficial
> flaws that allow it to be easily refuted on a superficial
> level, but, as Jeff Dalton sees, the standard refutation
> (the Systems Reply) does not get at the deepest essence of
> the argument.

>   The real power of the Chinese Room lies in the "argument
> from intentionality".  To Searle, and to Jeff Dalton, and
> to common sense, it is just obvious that humans believe
> things and perceive things, and that we know what it is
> that we believe and perceive.  The Chinese Room is an
> attempt to show that computers, even if they behave
> correctly and implement the right sorts of programs,
> cannot have this kind of intentionality.  I don't think
> the Chinese Room in itself is all that compelling, but I
> do think that Searle's conclusion is correct:  computers
> cannot possess strong intentionality (which requires
> infallibly knowing what it is that you believe and
> perceive).

    A couple of comments on Searle:

   First of all, a common and oft-repeated misconception about Searle
is that he holds that a digital computer cannot think.  What Searle
actually says is that a computer cannot think solely in virtue of the
fact that it implements a computer program.  Searle wants some sort of
extra 'causal' factor to be present, which he does not specify,
because he has no idea what it is.  Searle sums his argument up as
follows (this is from "Minds, Brains, and Science" - the quoted parts
are Searle's; everything else is mine):
     1. "Brains cause minds"  -->  for Searle, since we know we have
         minds, and we are biological entities, there must be some
         sort of causal property that results in our mental states
         having intrinsic meaning.  Whether other sorts of matter
         (e.g., silicon) could do so is an open question.

     2. "Syntax is not sufficient for semantics"

     3. "Computer programs are entirely defined by their formal,
         syntactic structure"  -->  in my opinion, this is where to
         look for a flaw in Searle's argument.  Even if you accept
         this statement (and I think it could be argued either way),
         if you read the Chinese Room argument carefully, there is an
         equivocation between computer programs and implementations
         of computer programs.  There is no physical entity which is
         a computer program, just as you can't point to something and
         say "this is the number 2".  A computer program is a
         mathematical object; an implementation of one is a physical
         object.  I don't see why one should suppose a priori that
         the implementation lacks the causal properties required in
         (1), particularly since Searle doesn't tell us anything
         about them.

     4. "Minds have mental contents; specifically, they have semantic
         contents"  --> Searle takes this as self-evident, and
         Daryl McCullogh's comments essentially attack this assumption.
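   As an aside, the program/implementation distinction in (3) can be
made concrete.  Here is a minimal sketch (the function names and the
choice of example are my own, purely illustrative): two structurally
different pieces of code realize the same abstract program - the
successor function on the integers - yet as physical processes they
are distinct, and neither one *is* the program itself.

```python
# Two structurally different implementations of one abstract program:
# the successor function n -> n + 1 on non-negative integers.

def successor_arith(n):
    # Implementation 1: ordinary machine arithmetic.
    return n + 1

def successor_unary(n):
    # Implementation 2: a unary (tally) representation.  The number n
    # is encoded as a string of n marks; appending one mark is the
    # entire computation.
    tally = "|" * n
    tally += "|"
    return len(tally)

# Extensionally the two define the same function, but as concrete
# processes (different instructions, different data representations)
# they are distinct physical objects.
for n in range(10):
    assert successor_arith(n) == successor_unary(n)
```

The point is only that "the program" names the shared mathematical
object, while any causal story has to be told about one implementation
or another.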

   On the other hand, although Searle's conclusions may be incorrect,
there does seem to be an issue floating around here.  I have puzzled
over the Chinese Room for a while, but I find it difficult to really
pin down what's going on.  It does seem as though there is a clear difference
between someone who is a native Chinese speaker and someone who is
manipulating symbols to produce Chinese sentences.  Does anyone have a
good idea on how to precisely characterize this difference?


Regards,
Ian
