Article 4825 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rutgers!ub!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!usc!wupost!uunet!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Newsgroups: comp.ai.philosophy
Subject: Re: A rock implements every FSA
Message-ID: <1992Mar30.150319.7149@oracorp.com>
Date: 30 Mar 92 15:03:19 GMT
Organization: ORA Corporation
Lines: 45

chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:

>>A deterministic finite state machine can be defined by the following
>>functions:
>>
>>   T : State -> State, the internal transition function
>>   I : State x Input -> State, the input transition function
>>   O : State -> Output, the output function

>This is more or less OK, but there's no need to split the transition
>function into T and I.  T is just a special case of I, for the
>case where the input is the "null input", or whatever you want
>to call it.  What you have above is harmless, but more complex
>than necessary (and we'll see below that you exploit this split to
>do more work than it can).

I made the split to satisfy *you*, Dave. In our discussion about the
table lookup program, your main argument against the table lookup
being conscious was the "lack of richness" of its thinking process.
And this lack of richness was revealed by the fact that it took zero
time to "think" about its inputs before it made its outputs. So I have
patched up this discrepancy by allowing "silent" transitions where
there is thinking, but no inputs. However, as I thought my example
showed, this silent, internal thinking can be perfectly trivial; as
simple as counting. It is therefore not clear to me in what sense
there can be more "richness" in some FSAs than there is in a table
lookup.
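To make the point concrete, here is a short sketch (Python, and the
counting example is my own toy illustration, not anything from the
thread): an FSA in the T/I/O form above whose "silent" internal
transition T does nothing richer than counting.

```python
# A toy FSA in the T/I/O form above.  State is an integer; the
# "silent" internal transition is as simple as counting.

def T(state):
    """Internal ("null input") transition: trivial silent thinking."""
    return state + 1

def I(state, inp):
    """Input transition: fold the input into the state."""
    return state + inp

def O(state):
    """Output function: read off the state."""
    return state

# Run the machine, alternating silent steps and input steps.
state = 0
for inp in [1, 2, 3]:
    state = T(state)       # a "silent" transition: thinking, no input
    state = I(state, inp)  # consume an input
print(O(state))  # 3 silent increments + inputs 1+2+3 -> 9
```

Nothing about the extra T steps adds any richness beyond counting,
which is the point of the example.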

If you allow a "null input" to be a possible input, then the humongous
table lookup program becomes functionally equivalent to a human brain.
To see this, note that the states of the table lookup program are
essentially sequences of inputs [i_1, i_2, ..., i_n]. We use the
mapping

   M([]) = the initial state,
   M([i_1, i_2, ..., i_n, i_{n+1}]) = I(M([i_1, i_2, ..., i_n]), i_{n+1}).

The output for state [i_1, i_2, ..., i_n] is whatever the lookup table
has for that sequence of inputs, which is correct by the assumption that
the table lookup program gets the behavior right.
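The mapping M can be sketched directly (a minimal illustration; the
concrete table below is a made-up toy of mine, not from the thread):
take the states of the lookup program to be the input histories
themselves, so that I just appends the new input and O consults the
table for that sequence.

```python
# Sketch of the M construction: states of the table-lookup "FSA" are
# the input histories themselves.  The table contents are invented
# for illustration only.

# Toy lookup table: input history -> output.
TABLE = {
    (): "hello",
    ("hi",): "hi there",
    ("hi", "bye"): "goodbye",
}

INITIAL_STATE = ()  # M([]) = the initial state: the empty history

def I(state, inp):
    """Input transition.  With history-states,
    M([i_1, ..., i_n, i_{n+1}]) = I(M([i_1, ..., i_n]), i_{n+1})
    is just appending the input to the history."""
    return state + (inp,)

def O(state):
    """Output: whatever the table has for that input sequence."""
    return TABLE[state]

state = INITIAL_STATE
print(O(state))           # "hello"
state = I(state, "hi")
print(O(state))           # "hi there"
state = I(state, "bye")
print(O(state))           # "goodbye"
```

The machine's outputs are correct by construction, since O just reads
off whatever the table says for that input sequence.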

The conclusion, whether you have silent transitions or not, is that
functional equivalence doesn't impose any significant constraints on a
system above and beyond those imposed by behavioral equivalence.

Daryl McCullough
ORA Corp.
Ithaca, NY
