From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!news-server.csri.toronto.edu!rpi!think.com!mips!news.cs.indiana.edu!bronze!chalmers Thu Feb 20 15:21:19 EST 1992
Article 3788 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!news-server.csri.toronto.edu!rpi!think.com!mips!news.cs.indiana.edu!bronze!chalmers
From: chalmers@bronze.ucs.indiana.edu (David Chalmers)
Subject: Re: Strong AI and panpsychism
Message-ID: <1992Feb16.222943.19110@bronze.ucs.indiana.edu>
Organization: Indiana University
References: <1992Feb12.150503.23822@oracorp.com>
Date: Sun, 16 Feb 92 22:29:43 GMT
Lines: 46

In article <1992Feb12.150503.23822@oracorp.com> daryl@oracorp.com writes:

>What I thought you were saying was that in Putnam's argument that any
>physical object implements any functional system, the correspondence
>between physical states and mental states was very unnatural and ad
>hoc. My question was why does it matter; if there exists *any* mapping
>(however strange) between physical states and mental states, then why
>can't the physical object be said to *possess* those mental states?

Well, there's a mapping from the 600 mental states I've had in the
last 10 minutes to the 600 books in my bookshelf, so obviously a
mapping alone isn't enough.  What we need is a *theory* according
to which mental states arise from certain kinds of physical states.
And the physical states that Putnam appeals to don't fit the theory
(of course, one can always argue that the theory is wrong).

>Definition and interpretation are relevant to what it *means* for a
>mental state to exist. I thought that you were taking the
>functionalist stance, that a physical object has certain mental states
>if and only if it implements a certain state machine. It implements
>such a state machine if and only if there is a mapping from physical
>states to machine states that preserves transition relations. Putnam
>claims that there *always* exists such a mapping, and you seem to be
>saying that only certain mappings count. Are you in agreement with all
>of this?
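[The transition-preserving-mapping criterion quoted above can be sketched
concretely. This is a minimal illustration, not anything from Putnam's or
Chalmers's text; the state names and dynamics are made up for the example.]

```python
# Sketch of the implementation criterion under discussion: a physical
# system implements a finite-state automaton (FSA) iff there exists a
# mapping from physical states to machine states that preserves the
# transition relations.

def preserves_transitions(phys_next, fsa_next, mapping):
    """True iff the mapping f satisfies: whenever p -> p' in the
    physical dynamics, f(p) -> f(p') in the FSA, for every p."""
    return all(mapping[p_next] == fsa_next[mapping[p]]
               for p, p_next in phys_next.items())

# Hypothetical toy system: two physical states that cycle, mapped onto
# a two-state FSA that also cycles.
phys_next = {"hot": "cold", "cold": "hot"}   # physical dynamics
fsa_next  = {0: 1, 1: 0}                     # FSA transition table
mapping   = {"hot": 0, "cold": 1}            # candidate correspondence

print(preserves_transitions(phys_next, fsa_next, mapping))  # True
```

[Putnam's claim is that for any physical object some such mapping always
exists; the dispute below is over whether every mapping should count.]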

I'm not an interpretationist about mental states -- i.e. I don't think
we just interpret systems *as* mental, I think that certain systems
really *do* have mental properties (at least as far as qualia are
concerned, there's a fact of the matter).  I don't think that
functionalism (about qualia) is a conceptual truth, but I think it's
a contingent fact that qualia arise from the right kind of functional
organization.  And I don't think that simple rocks have the right kind
of functional organization (at least, not the right kind for complex
qualia of the kind that humans have), because I don't agree that rocks
really do implement any FSA (finite-state automaton).  (It looks like I
don't need to rule out complex mappings to make the last point, as the
proof fails in any case.  But I don't think it would be ad hoc to rule
out time-varying states, at the very least, as realizing states.
Otherwise one might even allow in states like "co-exists with a brain in
state S", and so on, in which case my stapler would automatically have
all the mental states that I do.)

-- 
Dave Chalmers                            (dave@cogsci.indiana.edu)      
Center for Research on Concepts and Cognition, Indiana University.
"It is not the least charm of a theory that it is refutable."
