From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Thu Feb 20 15:21:15 EST 1992
Article 3782 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Reference (was re: Multiple Personality Disorder and Strong AI)
Message-ID: <1992Feb16.185120.9182@psych.toronto.edu>
Keywords: consciousness,functionalism,meaning
Organization: Department of Psychology, University of Toronto
References: <1992Feb13.045721.29805@cs.yale.edu> <1992Feb13.201109.25439@psych.toronto.edu> <418@tdatirv.UUCP>
Date: Sun, 16 Feb 1992 18:51:20 GMT

In article <418@tdatirv.UUCP> sarima@tdatirv.UUCP (Stanley Friesen) writes:
>In article <1992Feb13.201109.25439@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:
>|
>|  I am a 
>|*realist* with regards to intentionality.  I *know* that I possess it,
>|and I believe that at least other humans possess it too.  I believe that,
>|contrary to what is implied above, there is an actual *fact of the matter*
>|whether information processing systems have it as well.  I *don't* believe
>|that it is only a matter of "taking a stance."  Atoms are real, and so
>|are minds, "regardless of whether anyone is taking any stance toward them."
>
>In general I would say that I agree with this.
>
>|I am *happy* to assert that whether or not computers have minds like
>|us is a *fact*, and not merely a stance.
>
>Yes, and now the question is 'can computers have minds?'
>
>I say that right now the answer is wholly unknown, and in fact at
>present it is entirely *undecidable* (Searle and Penrose to the contrary).
>
>It is only by *trying* to make such a computer that the answer can be found.

Stanley, you have obviously missed Searle's point.  His claim is that
even if we make a computer which *behaviourally* acts as though it is 
conscious, it still won't be.  Note that his argument does not rely on
any difference in the "observables" between human behaviour and computer
behaviour, and therefore cannot be decided empirically.  It instead
relies on an analysis of the way computers operate.  In order to attack 
Searle's claim, you have to do philosophy (however nasty you may
find that...)

- michael
