From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!thunder.mcrcim.mcgill.edu!snorkelwacker.mit.edu!apple!ames!ncar!noao!stsci!stsci.edu!bsimon Mon Jan  6 10:30:02 EST 1992
Article 2439 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:2439 sci.philosophy.tech:1664
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!thunder.mcrcim.mcgill.edu!snorkelwacker.mit.edu!apple!ames!ncar!noao!stsci!stsci.edu!bsimon
From: bsimon@elvis.stsci.edu (Bernie Simon)
Newsgroups: comp.ai.philosophy,sci.philosophy.tech
Subject: Re: Causes and Reasons
Message-ID: <BSIMON.91Dec30071853@elvis.stsci.edu>
Date: 30 Dec 91 12:18:53 GMT
References: <1991Dec24.014716.6901@husc3.harvard.edu>
	<1991Dec25.042628.18737@bronze.ucs.indiana.edu>
	<1991Dec25.015221.6911@husc3.harvard.edu>
	<1991Dec28.221923.17443@bronze.ucs.indiana.edu>
Sender: news@stsci.edu
Organization: me, myself, and i
Lines: 25
In-Reply-To: chalmers@bronze.ucs.indiana.edu's message of 28 Dec 91 22:19:23 GMT

> Premise: Mental states are supervenient on computational states.
> 
> Now, talk of computational states is somewhat vague, but from Putnam's
> other writing we can take it that he is referring either to states of
> probabilistic automata or of Turing machines.  We'll take the latter,
> though it doesn't matter much for these purposes (anyone who finds
> probabilistic FSAs more realistic can recast the discussion
> straightforwardly).
> 
> So we can paraphrase the above claim as something like: when a human
> is in a mental state M, then that human is a realization of a Turing
> Machine T in state S, such that any physical system that realizes T in
> state S will be in mental state M.

This premise begs the question of the truth of functionalism. Of
course, if being in an (arbitrary) mental state just is realizing a
Turing machine in state S, then functionalism is true. But this is
just saying "if functionalism is true, then functionalism is true",
an entirely tautologous result. Moreover, this premise is not equivalent
to supervenience, since supervenience merely states that if two
mental states differ, there is necessarily a difference in the
associated physical states. It says nothing about these physical
states being realizations of Turing machines.
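For readers who want "a Turing machine T in state S" made concrete, here
is a minimal sketch (an editorial illustration only: the machine, its
transition table, and its state names are invented and appear nowhere in
the thread):

```python
# Sketch of a Turing machine as a transition table, to make the phrase
# "machine T in state S" concrete.  The machine below is invented for
# illustration; it is not Putnam's or Chalmers's example.

def run(transitions, tape, state="S0", head=0, steps=100):
    """Run a Turing machine.  `transitions` maps (state, symbol)
    to (new_state, written_symbol, move), where move is "L" or "R"."""
    cells = dict(enumerate(tape))          # sparse tape, blank = "_"
    for _ in range(steps):
        key = (state, cells.get(head, "_"))
        if key not in transitions:         # halt when no rule applies
            break
        state, cells[head], move = transitions[key]
        head += 1 if move == "R" else -1
    return state, "".join(cells[i] for i in sorted(cells))

# A machine that flips bits left to right until it reads a blank.
T = {
    ("S0", "0"): ("S0", "1", "R"),
    ("S0", "1"): ("S0", "0", "R"),
}

print(run(T, "0110"))   # -> ('S0', '1001')
```

On the functionalist reading at issue, any physical system whose causal
organization mirrors this table and which is currently in S0 counts as
"realizing T in state S" - whatever the system is made of.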
--
Bernie Simon	(bsimon@stsci.edu)
