From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Mon Dec  9 10:48:58 EST 1991
Article 1957 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Searle and the Chinese Room 
Message-ID: <1991Dec8.193847.7238@psych.toronto.edu>
Organization: Department of Psychology, University of Toronto
References: <YAMAUCHI.91Dec5040116@heron.cs.rochester.edu> <1991Dec5.191043.10565@psych.toronto.edu> <1991Dec5.220612.27855@bronze.ucs.indiana.edu>
Date: Sun, 8 Dec 1991 19:38:47 GMT

In article <1991Dec5.220612.27855@bronze.ucs.indiana.edu> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
>In article <1991Dec5.191043.10565@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:
>
>>It seems to me that, unless strong AI proponents can provide a coherent
>>explanation of why Searle's logical argument fails, the field as a whole
>>rests on a profound misunderstanding.
>
>(1) Recipes are completely syntactic.
>
>(2) Cakes are crumbly.
>
>(3) Syntax is not sufficient for crumbliness.
>
>(4) Therefore implementing the appropriate recipe cannot be sufficient
>    to produce a cake.
>
>Reflection on why this argument is fallacious should lead one to
>uncover the fallacy in Searle's analogous argument.  [Hint: a computer
>program is purely syntactic, but an *implementation* of a computer
>program is not.]

The analogy does not hold, as it greatly depends on what *material* I use
to implement a recipe, whereas functionalism asserts that the actual
substance used to implement a mind doesn't matter.  Indeed, the above
analogy seems to fall more in line with Searle's assertion that minds
are biological properties of brains (whatever *that* might mean).  
Recipes are indeed completely syntactic, and despite the fact that you
might have the same formal arrangement with motor oil and iron filings
as you do with water and flour, the former won't make a cake.
Likewise, having the same *formal* properties as the biological stuff
of the brain, Searle might argue, is insufficient *in itself*
for understanding.  The "stuff" one uses to implement the syntax
matters for Searle.

I do not claim that Searle's assertion WRT the "causal powers of the brain"
makes sense to me -- indeed, I believe that the position is epistemically
opaque (we can never know what the relevant causal powers are), and gets
us no closer to a solution of the problem.  However, I don't see at all
why the analogy you present is problematic for Searle; it seems to me
to be, quite to the contrary, simply an example of another domain in which
functionalism fails.

- michael
