Newsgroups: sci.skeptic,alt.consciousness,comp.ai.philosophy,sci.philosophy.meta,rec.arts.books
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!Germany.EU.net!EU.net!sun4nl!cwi.nl!olaf
From: olaf@cwi.nl (Olaf Weber)
Subject: Re: Penrose and Searle (was Re: Roger Penrose's fixed ideas)
Message-ID: <D00uxJ.8o2@cwi.nl>
Sender: news@cwi.nl (The Daily Dross)
Nntp-Posting-Host: havik.cwi.nl
Organization: CWI, Amsterdam
References: <JMC.94Nov22011226@white.wisdom.weizmann.ac.il> <3b5f56$d2o@news-rocq.inria.fr>
	<CzuCBz.80z@cwi.nl> <Czzp8B.C2t@cogsci.ed.ac.uk>
Date: Tue, 29 Nov 1994 09:13:02 GMT
Lines: 47
Xref: glinda.oz.cs.cmu.edu sci.skeptic:96578 comp.ai.philosophy:22804 sci.philosophy.meta:15101

In article <Czzp8B.C2t@cogsci.ed.ac.uk>, jeff@aiai.ed.ac.uk (Jeff Dalton) writes:
> In article <CzuCBz.80z@cwi.nl> olaf@cwi.nl (Olaf Weber) writes:

>> Searle seems to think that the material from which the brain is
>> constructed is important, but gets (IMHO) rather vague when he has
>> to demonstrate exactly _how_ that could matter.

> That's because he doesn't know.  Nor does anyone else.

At which point I wonder how he (or anyone else) can know that the
kind of material used matters at all.

>> To me, the restriction seems rather parochial: he refers to the
>> "neuronal chauvinism" that only entities with neurons like our own
>> can have mental states(*), but cheerfully makes the same mistake
>> (IMHO) elsewhere in his arguments.

> Could you say where?  I assume the ref below is to where he talks
> of "neural chauvinism" rather than to where he makes the same mistake.

>> (*) The Rediscovery of the Mind, Chapter 2 section III, page 38 in
>> the MIT paperback edition.

The mistake made by "neural chauvinists" is that they say "if it is
too different from us, it cannot have mental states," with "too
different" defined as "not using neurons like ours".  While Searle is
more sophisticated than that, he certainly appeals to those sentiments
(ibid, Chapter 9 section IV, page 207):

	But now if we are trying to take seriously the idea that the
	brain is a digital computer, we get the uncomfortable result
	that we could make a system that does just what the brain does
	out of pretty much anything.  [Including] cats and mice and
	cheese or levers or water pipes or pigeons ...

Here he appeals to the intuition that because computers can be made
out of "silly stuff", while brains "obviously" cannot, the brain
cannot be a computer.  The problem is that even if the brain isn't a
computer, he still hasn't shown that brains cannot consist of "silly
stuff".  Thus I classify the argument above as a (subtle) form of
"neural chauvinism".

Ultimately, I am less uncomfortable with the notion that a brain might
be made of "silly stuff" and work in a "silly manner" (like a lookup
table) than with a blanket condemnation of either.
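To make the "lookup table" idea concrete, here is a toy sketch (my own illustration, not anything from Searle; the names and table contents are made up): behaviour is just a finite table from input histories to canned responses, with no computation beyond the lookup itself.

```python
# A toy "lookup-table mind": its entire behaviour is a finite table
# mapping each exact input history to a fixed response.  Nothing is
# computed beyond the table lookup -- the "silly manner" in question.
table = {
    ("hello",): "hi there",
    ("hello", "how are you?"): "fine, thanks",
}

def respond(history):
    # Return the canned response for this exact history, or a
    # default when the history does not appear in the table.
    return table.get(tuple(history), "I don't know")

print(respond(["hello"]))                  # hi there
print(respond(["hello", "how are you?"]))  # fine, thanks
print(respond(["goodbye"]))                # I don't know
```

Of course, a table covering every conversation a human could have would be astronomically large, which is exactly why the example strikes people as "silly" -- but size is a practical objection, not an argument that such a system could have no mental states.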

-- Olaf Weber
