From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!christo Thu Feb 20 15:22:08 EST 1992
Article 3869 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!christo
From: christo@psych.toronto.edu (Christopher Green)
Subject: Re: Reference (was re: Multiple Personality Disorder and Strong AI)
Message-ID: <1992Feb19.173620.10529@psych.toronto.edu>
Keywords: consciousness,functionalism,meaning
Organization: Department of Psychology, University of Toronto
References: <418@tdatirv.UUCP> <1992Feb16.185120.9182@psych.toronto.edu> <426@tdatirv.UUCP>
Date: Wed, 19 Feb 1992 17:36:20 GMT

In article <426@tdatirv.UUCP> sarima@tdatirv.UUCP (Stanley Friesen) writes:
>In article <1992Feb16.185120.9182@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:
>|Stanley, you have obviously missed Searle's point.  His claim is that
>
>As *I* understand Searle, he is talking about being indistinguishable by
>
TO THE SOURCE!
_Minds, Brains, and Science_ pp. 39-40:
1. Brains cause minds. Now, of course, that's really too crude....
2. Syntax is not sufficient for semantics....a conceptual truth....
3. Computer programs are entirely defined by their formal, or syntactical
     structure....true by definition [of a computer program]
4. Minds have mental contents; specifically, they have semantic contents....
     just an obvious fact about the way minds work....

Conclusion 1. No computer program by itself is sufficient to give a system 
              a mind. Programs, in short, are not minds, and they are not by
              themselves sufficient for having minds....
Conclusion 2. The way that brain functions cause minds cannot be solely
              in virtue of running a computer program....
Conclusion 3. Anything else that caused minds would have to have causal
              powers at least equivalent to those of the brain....
Conclusion 4. For any artefact that we might build which had mental states
              equivalent to human mental states, the implementation
              of a computer program would not by itself be sufficient.
              Rather, the artefact would have to have powers equivalent to 
              the powers of the human brain.
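For what it's worth, the deductive skeleton of the passage can be made explicit. The following is my own schematic reconstruction, not Searle's formalization: I read premise 2 as "it is not the case that everything syntactic thereby has semantics," premise 3 as "being a program just is being a syntactic structure," and premise 4 as "whatever has a mind has semantic contents." The predicate names (RunsSyntax, RunsProgram, HasSemantics, HasMind) are mine, chosen for illustration.

```lean
-- A hedged sketch of Conclusion 1 from premises 2-4 (my reconstruction,
-- not Searle's own formalization; predicate names are hypothetical).
variable {E : Type} (RunsSyntax RunsProgram HasSemantics HasMind : E → Prop)

theorem conclusion1
    -- Premise 2: syntax is not sufficient for semantics
    (p2 : ¬ ∀ x, RunsSyntax x → HasSemantics x)
    -- Premise 3: programs are entirely defined by their syntax
    (p3 : ∀ x, RunsProgram x ↔ RunsSyntax x)
    -- Premise 4: minds have semantic contents
    (p4 : ∀ x, HasMind x → HasSemantics x) :
    -- Conclusion 1: no program is by itself sufficient for a mind
    ¬ ∀ x, RunsProgram x → HasMind x := by
  intro h          -- suppose running a program sufficed for having a mind
  apply p2         -- then syntax would suffice for semantics:
  intro x hs
  exact p4 x (h x ((p3 x).mpr hs))
```

On this rendering the argument is valid; the controversy, of course, is over whether premises 2 and 3 are true as stated.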


Sounds like philosophy to me, Stanley. Now could we please consider the
claims that are actually made?



-- 
Christopher D. Green                christo@psych.toronto.edu
Psychology Department               cgreen@lake.scar.utoronto.ca
University of Toronto