From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!olivea!uunet!mcsun!uknet!edcastle!aiai!jeff Thu Apr 16 11:33:27 EDT 1992
Article 4988 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!sun-barr!olivea!uunet!mcsun!uknet!edcastle!aiai!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Newsgroups: comp.ai.philosophy
Subject: Re: A rock implements every FSA
Message-ID: <6585@skye.ed.ac.uk>
Date: 8 Apr 92 18:16:02 GMT
References: <1992Mar30.150319.7149@oracorp.com> <1992Mar30.202444.24243@bronze.ucs.indiana.edu>
Sender: news@aiai.ed.ac.uk
Organization: AIAI, University of Edinburgh, Scotland
Lines: 61

In article <1992Mar30.202444.24243@bronze.ucs.indiana.edu> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
>In article <1992Mar30.150319.7149@oracorp.com> daryl@oracorp.com (Daryl McCullough) writes:

>>I made the split to satisfy *you*, Dave. In our discussion about the
>>table lookup program, your main argument against the table lookup
>>being conscious was the "lack of richness" of its thinking process.
>>And this lack of richness was revealed by the fact that it took zero
>>time to "think" about its inputs before it made its outputs. So I have
>>patched up this discrepancy [...]

(I would have thought that DC would have said the lookup machine
was conscious.  Why is it worse off than a thermostat?  But let's
suppose we mean consciousness that's more or less like human
consciousness.)

>I made it abundantly clear that the problem with the lookup
>table is not the mere lack of silent transitions -- see my response
>to your message about the brain that beeps upon every step.  Rather,
>the objection is that (a) a lot of conscious experience goes on
>between any two statements I make in a conversation; and (b) it's
>very implausible that a single state-transition could be responsible
>for all that conscious experience.

As may be clear from my previous articles, I agree with Chalmers
here.  

Daryl McCullough's point that a table lookup with the right I/O
behavior is functionally equivalent to a brain is right, given what he
usually seems to mean by "functionally equivalent".  But pretty much
all he means is that it looks the same from outside, ie has the right
I/O behavior.
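The table-lookup machine in this minimal, looks-the-same-from-outside
sense can be sketched as follows.  This is a toy illustration only --
the names, the tiny table, and the conversational framing are my own
assumptions, not anything proposed in the discussion:

```python
# A toy "table lookup" conversationalist: every output is a precomputed
# entry keyed on the entire input history.  For a machine with the I/O
# behavior of a brain, the table would be astronomically large; this
# fragment only shows the mechanism (hypothetical names throughout).

# Key: tuple of all inputs seen so far.  Value: the canned reply.
LOOKUP_TABLE = {
    ("Hello",): "Hi there.",
    ("Hello", "How are you?"): "Fine, thanks.",
}

def reply(history):
    """Produce the output for this input history in a single
    table access -- one state transition, no intervening 'thinking'."""
    return LOOKUP_TABLE.get(tuple(history), "I have nothing to say.")

history = []
for line in ["Hello", "How are you?"]:
    history.append(line)
    print(reply(history))
```

The point at issue is visible in the structure: all the apparent
richness lives in the stored data, and each response is produced by
one lookup rather than by any process going on between statements.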

Now, if that's all you care about, fine.  But it just doesn't
answer the question that many of us care about, namely whether
the right sort of things are going on inside.

It's always possible to say "well, let's just add some stuff, and
if it's sufficiently complex we can argue that there's an
interpretation that shows it's the same as what's going on in
a brain."

From the point of view of someone who thinks what goes on inside
matters, this is at least an improvement over Daryl's older argument
that the complexity of the data alone (ie, of the sentences in the
table) allowed that sort of interpretation.

However, the emphasis on whether it takes "zero time" for thinking
shows that the verificationist/behaviorist 3rd-person perspective
is still being used to muddy the waters.  When Chalmers says "a lot
of experience goes on", the verificationist can say "prove it's not
going on in table lookup without any reference to how it would seem
subjectively to the table lookup machine".  But no one can prove
that, any more than they can prove coffee cups aren't secretly
thinking "what a stupid place to set me down" when they're moved
near the edge of a table.  The muddying comes in afterwards, when the
v/behaviorist says "but coffee cups don't have any of the right
behavior", as if we could, w/o begging the question of TT behaviorism,
say that the behavior of the table lookup machine showed it wasn't
conscious.

-- jd


