From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!swrinde!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!bronze!chalmers Tue Feb 11 15:24:55 EST 1992
Article 3523 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!swrinde!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!bronze!chalmers
From: chalmers@bronze.ucs.indiana.edu (David Chalmers)
Newsgroups: comp.ai.philosophy
Subject: Re: Strong AI and Panpsychism
Message-ID: <1992Feb6.051835.21146@bronze.ucs.indiana.edu>
Date: 6 Feb 92 05:18:35 GMT
References: <1992Feb2.192512.24293@psych.toronto.edu> <1992Feb4.044728.12324@bronze.ucs.indiana.edu> <1992Feb5.183955.13789@psych.toronto.edu>
Organization: Indiana University
Lines: 123

In article <1992Feb5.183955.13789@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:

>But my problem is in seeing how we would distinguish beliefs from other
>things which cause us to behave in a certain way.  If beliefs are *not*
>phenomenal, at least *in principle* if not always (which I will concede
>for now), then how do we know that the reason my leg flies up when you
>hit me below the knee is *not* because of a reflex, but because I
>*unconsciously* believe that, when you strike me there, I *should*
>kick?

Because it's the wrong kind of causation to count as a belief -- it's
not complex enough (just a simple circuit in the leg), and it doesn't
interact with other beliefs and desires in the way that a belief would,
and so on.  It doesn't even look like a belief from the intentional
stance; if it did, we would expect it to be open to revision by the
acquisition of new evidence, which it isn't.  As I said before,
there are plenty of ways to draw functional distinctions without
invoking phenomenology.

>But, if we are going to be purely instrumentalist, and use intentional-stance
>talk (as you suggest is ok below), then how can we even talk about whether
>it *actually* has beliefs or not?  I thought that all that was required
>for the intentional stance was that it *acted* like it had beliefs.  Why
>must we think that there is some functional complexity that is *really*
>belief, and some other kind which *acts* like belief but *isn't*, if
>all there is to beliefs is a propensity to act in a certain way?  It
>seems to me you can't be an instrumentalist, and *then* claim that
>certain functions are *real* beliefs.

First, note that I said that the intentional-stance analysis of belief
is only approximately correct.  This is because there can conceivably
exist systems that produce the right patterns of behaviour, without
the underlying beliefs -- e.g. the fabled humungous lookup table.  Still,
I think that most practical systems that get the (actual and
counterfactual) patterns of behaviour right will also have the
functional organization required for belief, so the intentional stance
works pretty well in practice.

>Indeed, it seems to me that
>functionalism *needs* some sort of "phenomenal tinge" to the appropriate
>functional complexity if it is going to assert that only some actions
>are the result of *real* beliefs. 

This certainly doesn't follow.

>You obviously have a different notion of the term "introspection" than
>I do, as I take as a necessary part of introspection that there be
>*subjective awareness* that the states you are reporting *belong
>to you*.   But I may be misconstruing your point here.

You can construe "introspection" in various ways.  My point is simply
that there are psychological construals of the notion, e.g. as
accessibility to verbal reports, that don't require phenomenology
but that may do much of the work that one wants the notion to do.

>Well, an intentional-stance analysis, it seems to me, can't get at
>the intension-with-an-s of a belief, and that's a big problem.
>When Oedipus married Jocasta, he also married his mother.  It
>is clear that his desire was to marry Jocasta, but *not* his mother.
>But a purely instrumentalist account of desire (and beliefs, and other
>mental terms) cannot, in this particular event, distinguish between these
>two beliefs.  (My thanks to Chris Green for this suggestion.) 

That's simply false.  An intentional-stance attribution is dependent
on counterfactual behaviour as well as actual behaviour (even a hardline
logical behaviourist like Ryle invoked behavioural dispositions, not
just actual behaviour).  Presumably if someone had let Oedipus know
that Jocasta was his mother, then he wouldn't have married her, but he
would have married her if his desire was to marry his mother.

>It is also the case that we can have beliefs which have *no* behavioural
>implications.  Take the (admittedly contrived) following case:
>I have a belief that cats are actually extremely cleverly disguised
>visitors from Mars.  I *also* believe that the Wicked Witch of the
>West will turn me into a toad if I *act* like I believe cats are
>aliens.  Surely there is some difference between this case and the 
>case where I believe that cats are just cats, yet there would be no
>behavioural difference.  I can also have beliefs about events that
>happen in dreams, yet these have no (overt) behavioural consequences.

Again, counterfactuals should be able to bring the differences out.
For example, if I draw you aside and give you convincing evidence that
the Wicked Witch is dead, and that the Martian cats are about to overrun
the planet, then you'll do something about it.

Counterfactuals won't *always* be able to bring belief-differences
out, which is one reason why I'm not a logical behaviourist, but it
takes very contrived situations for this to happen -- e.g. the
lookup table, or Putnam's "super-Spartans" who are conditioned to
never let anyone know that they're in pain (and the possibility of
the latter is arguable).

>It is not clear to me if you feel that instrumentalism is a *necessary*
>consequence for taking a purely psychological approach to the mental.

It's not.  I think that functionalism about psychology is more
useful for a number of reasons, but the point is that
intentional-stance analyses of psychological concepts aren't far
off the mark.

>As far as the impossibility of psychology that requires qualia, would
>you say that phenomenal psychology is either impossible, or denies the
>importance of the phenomenal state?  There is a long, although very
>quiet, history of the study of the phenomenal in psychology.  Or would
>you argue that, in this case, it is not *actually* phenomenal states
>that are being studied? 

I tend to think that this kind of work hasn't really told us much
about the really mysterious parts of phenomenal states.  It's told
us about aspects of those states that can be characterized in
psychological terms -- the very fact that this kind of work usually
rests on verbal reports is evidence of that.  One could perform the
same kind of study on a zombie.  Nevertheless, I believe in a
strong coherence between phenomenal and psychological properties --
e.g. differences between colours, say, can be characterized either
phenomenologically or psychologically -- so psychological studies
can certainly shed some light on phenomenal properties, though
they can't take us all the way.

-- 
Dave Chalmers                            (dave@cogsci.indiana.edu)      
Center for Research on Concepts and Cognition, Indiana University.
"It is not the least charm of a theory that it is refutable."