From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!uunet!ogicse!das-news.harvard.edu!husc-news.harvard.edu!zariski!zeleny Tue Apr  7 23:23:22 EDT 1992
Article 4834 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:4834 sci.philosophy.tech:2478
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!usc!cs.utexas.edu!uunet!ogicse!das-news.harvard.edu!husc-news.harvard.edu!zariski!zeleny
From: zeleny@zariski.harvard.edu (Mikhail Zeleny)
Newsgroups: comp.ai.philosophy,sci.philosophy.tech
Subject: Re: A rock implements every FSA
Message-ID: <1992Mar30.233730.10489@husc3.harvard.edu>
Date: 31 Mar 92 04:37:27 GMT
Article-I.D.: husc3.1992Mar30.233730.10489
References: <1992Mar30.164508.13978@oracorp.com>
Organization: Dept. of Math, Harvard Univ.
Lines: 92
Nntp-Posting-Host: zariski.harvard.edu

In article <1992Mar30.164508.13978@oracorp.com> 
daryl@oracorp.com (Daryl McCullough) writes:

>chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
>(in response to Mikhail Zeleny)

DC:
>>I note that this doesn't address the substance of my objection.  It
>>probably doesn't matter much, as that objection was made under the
>>assumption that you, following Putnam, were mapping inputs to inputs
>>and states to states ("the relevant remarks on p. 124" can only be
>>read that way).  Your exchange with Joseph O'Rourke has made it clear
>>that in fact you're mapping both inputs and states of the FSA
>>simultaneously onto internal states of the rock, which may as well be
>>entirely causally disconnected from the outside world.
>
>>This is an audacious strategy; if one allowed such mappings, then
>>anything might follow.

DMC:
>It is certainly true that in the mathematical theory of finite
>automata, a big distinction is made between "inputs" and "states".
>However, I don't think that this distinction is fundamental. For a
>real system, such as a computer or a human being, an input is
>essentially a non-destructive state transition with an external cause.
>If you do as Mikhail suggests, and map inputs and states to rock
>states, there is nothing particularly wrong, except for the difficulty
>of communicating with the rock. There is no easy mechanism for causing
>the rock to go into the state corresponding to having received input
>i. This difficulty can be viewed as a defect in the sensory mechanisms
>of the rock, rather than a defect in the "thought processes" of the
>rock. Just as a person being deaf doesn't imply that the person is
>unintelligent, perhaps we shouldn't take the difficulty of making
>inputs for the rock as evidence of lack of intelligence in the rock.
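
The mapping Daryl describes can be made concrete.  Here is a minimal
sketch (the FSA, the transition table, and every name in it are my own
hypothetical illustration, not anything from Daryl's post): each pair
(FSA state, input) is identified with an arbitrary "rock state", and
running the automaton just walks through those physical states.

```python
# Hypothetical sketch of a Putnam-style mapping: identify each
# (state, input) pair of a finite-state automaton with an arbitrary
# "rock state".  The choice of rock states is causally arbitrary --
# which is exactly Chalmers's complaint.
from itertools import product

# A trivial two-state, two-input FSA with transition table delta.
states = ["S0", "S1"]
inputs = ["a", "b"]
delta = {
    ("S0", "a"): "S1",
    ("S0", "b"): "S0",
    ("S1", "a"): "S0",
    ("S1", "b"): "S1",
}

# The "rock": a supply of distinct physical states, one per
# (state, input) pair, assigned in no physically meaningful order.
rock_states = {pair: f"rock#{i}"
               for i, pair in enumerate(product(states, inputs))}

def run(start, word):
    """Run the FSA on `word`, recording which rock state is taken
    to 'implement' each (state, input) step of the computation."""
    s = start
    trace = []
    for ch in word:
        trace.append(rock_states[(s, ch)])
        s = delta[(s, ch)]
    return s, trace

final, trace = run("S0", "aab")
print(final, trace)  # S0 ['rock#0', 'rock#2', 'rock#1']
```

Nothing in the sketch constrains the rock states to have any causal
structure mirroring `delta`; the mapping does all the work, which is
the point at issue.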

Bravo, Daryl!  I applaud your integrity: instead of giving up on your
assumptions, you push on to ostensibly counterintuitive consequences.  Let
us see where this gets us.  The essential inconsistency that remains in
your approach has to do with your retaining the notion of an external
cause.  Consider that a true materialist has no recourse to drawing any
boundaries separating the knowing subject from the objects of his
knowledge.  What appears to be an external cause to one observer will, with
a bit of judicious gerrymandering, appear as a bona fide internal state
to another.  On the other hand, even the rock is capable of accepting
inputs, e.g. from natural clocks, like the Sun.  So what's the big deal
about being human?

DMC:
>Note that it is not *impossible* to communicate with the rock, only
>very difficult. Since the states of the rock are mapped to pairs of
>states and inputs, if you want to make input I, simply *put* the rock
>in the state corresponding to <S,I>. This would correspond, in the
>case of a deaf person, to simply putting the person's brain in the
>state corresponding to having heard a certain sound, even though the
>sound wasn't actually received by the ears.

True enough.  Does this still seem unproblematic?
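
Daryl's mechanism is easy to sketch (again, every name here is my own
hypothetical illustration): "making input I" for the rock amounts to
forcing it directly into the physical state identified with the pair
<S, I>, bypassing any sensory channel whatsoever.

```python
# Hypothetical sketch: communicating with the rock by fiat.  The
# assumed mapping sends each <state, input> pair to a rock state.
rock_state_of = {("S0", "a"): "rock#0",
                 ("S0", "b"): "rock#1"}

class Rock:
    """A rock has no sensors; its state can only be set by hand."""
    def __init__(self):
        self.physical_state = None

    def force(self, state):
        # No ears, no eyes: the experimenter sets the state directly,
        # like putting a deaf person's brain in the 'heard it' state.
        self.physical_state = state

def give_input(rock, fsa_state, symbol):
    """'Make input symbol' by forcing the rock into the state that
    the mapping says corresponds to <fsa_state, symbol>."""
    rock.force(rock_state_of[(fsa_state, symbol)])

r = Rock()
give_input(r, "S0", "b")
print(r.physical_state)  # rock#1
```

On this picture the only thing distinguishing the rock from a perceiver
is who does the work of getting it into the right state.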

DMC:
>I don't see how functionalism can be liberal enough to grant
>intelligence to humans with no working sense organs, and strict enough
>to deny intelligence to rocks and clocks and tape players. If getting
>the inputs right is important, then I don't see how the
>sensation-deprived humans pass the test, and if it isn't important,
>then I don't see how rocks fail.

To play an advocatus diaboli, the biggest problem facing the functionalist
is the absence of a stable notion of implementation, which depends on the
absolute identification of functional states in its subjects.  Observe that
this requires achieving the same Platonic task of "carving nature at its
joints"; furthermore, note that the situation is exactly parallel to
that obtaining in the case of putative reference without intentionality.
To my uneducated mind, if you buy Platonism, you might as well buy it lock,
stock, and barrel.  As Karl Marx observed, there's no such thing as being
"a little bit pregnant".

>Daryl McCullough
>ORA Corp.
>Ithaca, NY


`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'
: What is good?  What is ugly?                             Harvard   :
: What is great, strong, weak...                           doesn't   :
: Don't know! Don't know!                                   think    :
:                                                             so     :
: Mikhail Zeleny                                                     :
: 872 Massachusetts Ave., Apt. 707                                   :
: Cambridge, Massachusetts 02139           (617) 661-8151            :
: email zeleny@zariski.harvard.edu or zeleny@HUMA1.BITNET            :
:                                                                    :
'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`'`
