From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rutgers!psuvax1!hsdndev!husc-news.harvard.edu!zariski!zeleny Tue Apr  7 23:22:50 EDT 1992
Article 4777 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:4777 sci.philosophy.tech:2450
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rutgers!psuvax1!hsdndev!husc-news.harvard.edu!zariski!zeleny
From: zeleny@zariski.harvard.edu (Mikhail Zeleny)
Newsgroups: comp.ai.philosophy,sci.philosophy.tech
Subject: Re: A rock implements every FSA
Message-ID: <1992Mar28.103932.10369@husc3.harvard.edu>
Date: 28 Mar 92 15:39:31 GMT
References: <1992Mar25.161024.2081@oracorp.com>
Organization: Dept. of Math, Harvard Univ.
Lines: 57
Nntp-Posting-Host: zariski.harvard.edu

In article <1992Mar25.161024.2081@oracorp.com> 
daryl@oracorp.com (Daryl McCullough) writes:

DMC:
>I hate to admit this, but I think I agree with Mikhail Zeleny on this
>point; Mikhail's arguments have convinced me that Putnam's proof does
>make a certain amount of sense.

Daryl, this is the second most sensible thing I've ever heard you say.
Keep it up, and I just might drag you back into the camp of humanity.

DMC:
>As several people have pointed out, the standard notion of
>implementing a finite state machine involves getting the inputs and
>outputs right, (which Putnam's rocks manifestly don't). However, that
>notion of implementation is too strong if you want to say (a) that
>"being conscious" means to implement a certain kind of state machine,
>and (b) that sensation-deprived human brains are still conscious. The
>problem with the latter case (imagine a deaf, blind, paralytic) is
>that there are no inputs and outputs from the world possible, so
>whatever it means to implement a conscious being cannot require
>getting the *actual* inputs and outputs right.
>
>If, on the other hand, you drop the input/output requirement, then
>there is nothing to prevent a rock from implementing every FSM. I
>don't see how Chalmers's complaints about counterfactuals are even a
>problem. All that it takes to support counterfactuals is to be able to
>show, for each possible input sequence, that there is a corresponding
>trace of the states of the rock showing how the rock would have
>responded if that input sequence had occurred.

Methinks you overestimate the philosophical relevance of inputs and
outputs.  Lacking a distinction between active and passive powers, just how
will your self-professed materialism accommodate the distinction between a
lump of matter "receiving" the inputs and "producing" the outputs, and
another lump "just sitting there"?  Keep in mind that the communication
metaphors utilized by the automata theorists derive all of their rhetorical
force from a tacit presupposition of agency implicit in all communication
schemata.  Given your dismissal of intentionality, such presuppositions are
incompatible with the rest of your views.

>Daryl McCullough
>ORA Corp.
>Ithaca, NY


`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'
: What is good?  What is ugly?                             Harvard   :
: What is big, strong, weak...                             doesn't   :
: Don't know!  Don't know!                                  think    :
:                                                             so     :
: Mikhail Zeleny                                                     :
: 872 Massachusetts Ave., Apt. 707                                   :
: Cambridge, Massachusetts 02139           (617) 661-8151            :
: email zeleny@zariski.harvard.edu or zeleny@HUMA1.BITNET            :
:                                                                    :
'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`


