From newshub.ccs.yorku.ca!torn!cs.utexas.edu!zaphod.mps.ohio-state.edu!rpi!think.com!ames!agate!doc.ic.ac.uk!uknet!edcastle!cam Wed Oct 14 14:58:41 EDT 1992
Article 7214 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!zaphod.mps.ohio-state.edu!rpi!think.com!ames!agate!doc.ic.ac.uk!uknet!edcastle!cam
From: cam@castle.ed.ac.uk (Chris Malcolm)
Newsgroups: comp.ai.philosophy
Subject: Re: Simulated Brain
Message-ID: <26773@castle.ed.ac.uk>
Date: 11 Oct 92 20:04:45 GMT
References: <BARRY.92Oct6151915@chezmoto.ai.mit.edu> <26609@castle.ed.ac.uk> <1992Oct7.205933.5138@meteor.wisc.edu>
Organization: Edinburgh University
Lines: 49

In article <1992Oct7.205933.5138@meteor.wisc.edu> tobis@meteor.wisc.edu (Michael Tobis) writes:
>In article <26609@castle.ed.ac.uk> cam@castle.ed.ac.uk (Chris Malcolm) writes:
>>In article <BARRY.92Oct6151915@chezmoto.ai.mit.edu> barry@chezmoto.ai.mit.edu (Barry Kort) writes:

>>>Daniel Dennett ... saw no reason
>>>why intelligence and consciousness could not reside in a sufficiently
>>>powerful computer processor.

>>I'm sure he intended the processor to be running suitable software :-)
>>Given that rider, this is hardly controversial.

>> Contrary to popular
>>opinion, even Searle of Chinese Room fame agrees with that, as he made
>>plain in the Jan 1990 edition of Scientific American.

>Sorry, while I distinctly remember him going to great lengths to avoid being
>called a dualist, he did maintain that a Turing machine was insufficient for
>consciousness, and maintained a need for some sort of chemical process.

He gave chemical examples. He does not insist that it be chemical. In
the first para of the Sci AM article he says "it might be possible to
produce a thinking machine .... out of silicon chips or vacuum tubes."

>"a simulation of cognition will ... not produce the effects of the 
>neurobiology of cognition. All mental phenomena, then, are caused by 
>neurophysiological processes in the brain. ... Any artifact that produced
>mental phenomena ... could not do that just by running a formal program."
>[op.cit. p. 29] [Searle]               ^^^^
                                         |

>So unless you are using a broader than usual definition of "computer",
>your representation of Searle's opinion is inaccurate.

No, note the marked "just" in your quotation of Searle. Searle is
quite happy to suppose that a computer might possibly be conscious, he
merely insists that if it is, then the consciousness cannot be _just_
the _result_ of (i.e. caused by) running a program.

In short, Searle argues that a computer _program_ cannot cause
understanding. Many people incorrectly collapse this to the assertion
that understanding could not reside in a computer. Searle has always
been careful to make it clear that he does not go that far. That is a
very important distinction to make if one is -- as Searle claims to
be -- a materialist. But it is a distinction that dualists, in their
haste to enlist Searle's support, often (always?) overlook.
-- 
Chris Malcolm    cam@uk.ac.ed.aifh          +44 (0)31 650 3085
Department of Artificial Intelligence,    Edinburgh University
5 Forrest Hill, Edinburgh, EH1 2QL, UK                DoD #205
