Article 2885 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!bronze!chalmers
From: chalmers@bronze.ucs.indiana.edu (David Chalmers)
Subject: Re: Intelligence Testing
Message-ID: <1992Jan18.225713.26060@bronze.ucs.indiana.edu>
Organization: Indiana University
References: <1992Jan18.144220.11862@oracorp.com>
Date: Sat, 18 Jan 92 22:57:13 GMT
Lines: 82

In article <1992Jan18.144220.11862@oracorp.com> daryl@oracorp.com writes:
>David Chalmers writes:
>
>> (1) The right behaviour logically implies mentality.
>> (2) The right behaviour empirically "implies" mentality.
>> (3) Implementing the right program logically implies mentality.
>> (4) Implementing the right program empirically implies mentality.
>
>I don't understand these four alternatives. First of all, my
>understanding of "A logically implies B" is that such statements are
>only true if (a) there is some generally-accepted theory from which
>you can logically deduce B from A, or (b) it is true by definition,
>such as "Being a bachelor logically implies being unmarried". There
>are no such definitions or theories, so it seems to me that 1 & 3 are
>impossible.

Your conception of logical implication is narrower than mine, I think.
What I'm talking about is truth by conceptual necessity -- i.e.
propositions such that denial thereof would imply misuse of the
concepts involved.  "All bachelors are men" is such a proposition, for
instance, and so, probably, is "All bachelors are unmarried" (modulo
dubious Lakoff-style considerations about the possibility of wild,
swinging married people whom one might call "bachelors").

Maybe you're invoking the Quinean denial of the analytic/synthetic
distinction, but only a very weak distinction is required here (in
particular, I'm not claiming that (1) there are no shades of grey,
(2) analytic statements are unrevisable, or (3) there are explicit
definitions for all or most terms).  All that's required is that
there be semantic constraints on the way one applies concepts.
(Roughly, that concepts have application-conditions, even if these may
be blurry, inexplicit, holistic, revisable and so on.)

If you really think that 1 and 3 are out of bounds immediately, you'll
have a lot of arguing to do.  Any number of philosophers, from Ryle
through Lewis and Armstrong to Dennett, have thought that the concept
of mentality is such that something like 3 (in Ryle's and maybe
Dennett's case, 1 also) holds.  Maybe they were wrong, but they weren't
stupid.  I recommend Lewis's "Psychophysical and theoretical
identifications", Australasian Journal of Philosophy 50:249-58, 1972,
for such a conceptual analysis of mentality.  I also recommend Horgan's
"Supervenience and cosmic hermeneutics", Southern Journal of Philosophy
Supplement 22:19-38, 1984, for a nicely argued case that *all* facts
follow from physical facts via conceptual necessity.

>Also, I don't see the difference between 2 & 4. A program, to me, is a
>specification of behavior, so I don't see how "behavior implies
>mentality" is any different from "program implies mentality".

Well, this is simply false, I think.  Consider these two programs:

1. print "1"                    2. for i:=1 to 6 do
   print "2"                          if 6 mod i = 0 then print (i);
   print "3"
   print "6"

These produce the same behaviour, but they are different programs.  A
program doesn't just specify behaviour; it puts strong constraints on
how that behaviour comes about.
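
To make the contrast concrete, the same pair can be written out in
runnable form (a minimal Python sketch; the function names are mine).
The first program's output is stipulated outright; the second computes
the divisors of 6.  A purely behavioural test cannot tell them apart:

    def divisors_by_table():
        # Behaviour fixed by an explicit list; no computation at all.
        for n in (1, 2, 3, 6):
            print(n)

    def divisors_by_computation():
        # The same output, produced by actually testing divisibility.
        for i in range(1, 7):
            if 6 % i == 0:
                print(i)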

>I agree that the giant lookup table is ridiculous as a way to
>implement AI, but I don't understand why it is so obvious that such an
>implementation would lack mentality. Your answer might be that it
>would lack the internal states that real minds have, but I don't even
>grant that: in the case of the lookup table, the internal state would
>be coded as a location in the lookup table.

It's not that it has no internal states; the problem is that its
internal states are trivial, with utterly uninteresting causation
going on between one statement and the next.  Furthermore, the content
of each statement that the system utters could be arbitrarily changed
without affecting the causal structure of the system at all.  Of
course, I can't *prove* that such a system lacks consciousness, any
more than I could prove that a rock lacks consciousness.  But it
certainly seems deeply implausible.  For an in-depth argument about
why a lookup table would lack mentality, see Block, "Psychologism and
behaviorism", Philosophical Review 90:5-43, 1981.
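
To make the triviality vivid, here is a toy lookup-table conversant
(a Python sketch of my own; the table entries and names are purely
illustrative, and no attempt is made at Block's actual construction).
The system's entire internal state is a table location, and any reply
string could be swapped for any other without disturbing the
state-to-state causal structure:

    # The whole internal state is one index; each step is one lookup.
    TABLE = {
        (0, "hello"):        ("Hi there.", 1),
        (1, "how are you?"): ("Fine, thanks.", 2),
        # ... one entry per (state, input) pair; a real table would
        # be astronomically large.
    }

    def respond(state, utterance):
        # The reply string plays no causal role beyond being copied to
        # the output: replace "Fine, thanks." by any string whatever
        # and the state transitions are unchanged.
        reply, next_state = TABLE.get((state, utterance), ("?", state))
        return reply, next_state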

-- 
Dave Chalmers                            (dave@cogsci.indiana.edu)      
Center for Research on Concepts and Cognition, Indiana University.
"It is not the least charm of a theory that it is refutable."
