From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Wed Feb  5 11:56:19 EST 1992
Article 3409 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Strong AI and Panpsychism
Message-ID: <1992Feb2.192512.24293@psych.toronto.edu>
Keywords: panpsychism
Organization: Department of Psychology, University of Toronto
References: <1992Feb1.224845.10781@bronze.ucs.indiana.edu> <1992Feb2.000933.29482@psych.toronto.edu> <1992Feb2.053646.625@bronze.ucs.indiana.edu>
Date: Sun, 2 Feb 1992 19:25:12 GMT

In article <1992Feb2.053646.625@bronze.ucs.indiana.edu> chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
>In article <1992Feb2.000933.29482@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:

>>What makes a state "mental"?  If my blood sugar goes up, I get sleepy.  The
>>change in blood sugar is a change in state which causes a behavioural change.
>>Why isn't it a "mental" state?   If the mind is the entity that's responsible
>>for producing behaviour, then why isn't *any* change in state that
>>produces behaviour a change in "mental" state?  My naive answer to this
>>is that it is because mental states have a special quality, namely their
>>phenomenal (or, in the case of unconscious states, at least *potentially*
>>phenomenal) component.  Otherwise, it is hard for me to see how
>>we can distinguish between mental states and other kinds of causes of
>>behaviour.
>
>Well, there are different ways in which states can cause behaviour, and
>only some of them are mental.  The question of how to characterize the
>difference is a very tricky one, and the whole field of action theory
>in philosophy is essentially devoted to this question, but the
>phenomenal/nonphenomenal distinction isn't the only or even the best
>way of doing this.  A recent book on the subject is Dretske's _Explaining
>Behavior: Reasons in a World of Causes_; like most other books on the
>topic, it goes on and on about characterizing mental states without
>even raising the problem of phenomenal states.
>
>One way of characterizing causal states as mental is that there must
>be some rational link between the state and the action, though that's
>far short of a complete account, and only works for some states.

Thanks for the reference.  As this is an area I know rather little
about, I should probably check some of the classic sources before
shooting my mouth (fingers?) off. 

>>But certainly it is only a belief if it *in principle* could be conscious.
>>The notion of "believing" things that we can't *in principle* know we
>>believe seems ludicrous.  This is not to say that there can't be unconscious
>>beliefs, but only that *any* such belief must, under other circumstances,
>>have a conscious component.  This requirement makes belief dependent upon
>>phenomenal experience, even if any particular belief is not conscious.
>
>Searle argued just this in his 1990 BBS paper "Consciousness, explanatory
>inversion, and cognitive science".  Most of the commentators seemed to
>disagree.  In practice, it seems to me that just about any belief will
>be consciously accessible in principle, but I see no reason why that
>has to be so.

I also obviously should read the above paper!  

>e.g. imagine that world where God created a universe physically identical
>to ours, but with no phenomenal states.  We'd probably say that our twins
>in that world had no sensations, and maybe no pains, but I think they
>would nevertheless have beliefs.

Well, I must say that our intuitions differ.  I would certainly
argue that if *I* had no awareness of *any* of my "beliefs", the
term would be misapplied to the states so labeled.  In that case,
"beliefs" would be merely *ascribed*.  In my view, what
distinguishes "true" beliefs from merely ascribed ones is that
the former carry phenomenal baggage, whereas the latter do not.  A
Venus flytrap closes when an object brushes its trigger hairs.  Does it
close because it *truly* believes that there is a fly?  Or is that
merely a way of using intentional talk to describe a non-intentional
act?  I think the latter.

Indeed, an alternative way of looking at the question of "unconscious"
beliefs is that using the term "belief" in such cases is merely
taking an "intentional stance" toward states which aren't *really*
beliefs.

- michael