From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Wed Feb  5 11:56:33 EST 1992
Article 3434 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Strong AI and panpsychism (was Re: Virtual Person?)
Message-ID: <1992Feb3.192337.12056@psych.toronto.edu>
Organization: Department of Psychology, University of Toronto
References: <1992Jan31.193524.28969@psych.toronto.edu> <1992Feb3.145017.27440@watdragon.waterloo.edu>
Date: Mon, 3 Feb 1992 19:23:37 GMT


In article <1992Feb3.145017.27440@watdragon.waterloo.edu> cpshelle@logos.waterloo.edu (cameron shelley) writes:
>michael@psych.toronto.edu (Michael Gemar) writes:

[discussion of the possibility of "virtual" situatedness, with which
we both agree]

>
>I think we understand one another.  My quibble is still with your
>previous assertion: there is a non-zero chance of a collection of
>material attaining even a transitory conscious structure.  I find this
>an extremely weak basis for general statements about consciousness,
>and therefore for any ethical considerations based on such
>generalities.  Your remarks about intelligent rocks and tornados have
>not assuaged this concern any. 
>
>
>> Would you follow after Chalmers in asserting that atoms therefore have
>> "limited" consciousness?  And if all you ask for is a change in state
>> provoked by the "outside", then practically *any* physical feature
>> can generate "consciousness".  This to me makes consciousness a meaningless
>> term.  If atoms can have subjective experiences, if there is something that
>> it is like to be an atom, then I think that the notion of mentality becomes
>> vacuous.
>
>I recall answering this exact question in the article previous to this one!
>If we take environment seriously as a constraint on consciousness, then
>atoms and galaxies are even less likely than bodies of air to be ascribed
>consciousness.  I invite you to review my previous posting containing this
>observation.  You also did not answer my question about whether you take
>consciousness to be binary.  Your postings in the past indicate you do
>have a binary view of the issue, and a deterministic requirement for
>knowledge.  I cannot agree with either, so part of our disagreement may
>simply be varying assumptions regarding these points.

You're right, you *did* answer this question!  My apologies.  However,
I'm not sure (now that I've recalled your reply) that I find the answer
satisfactory.  Let me start by answering *your* question as to whether I
take consciousness to be "binary".  I think that my answer must be
qualified with "It depends on what you mean by consciousness".  If you
are referring to the kinds of experiences that we as humans have, no, I
don't think it's binary.  I'm happy to ascribe a mental life to
apes, bats (although I don't know what they'd be like :-), and even
fish and (some?) invertebrates.   Their experiences differ from ours, and
might in some way be said to be "less rich" than our experiences (although
I worry about possible anthropocentrism).  So, I would be happy saying that
some organisms have a more boring mental life.

On the other hand, this does not necessarily imply that there is no principled
division between hunks of matter with mental lives and hunks without.  I
think that it makes no sense to talk of "half-qualia", or individuated
experiences which are somehow "less" experiential than other experiences
(less subjective??).   So, either some matter has the feature of having
experiences and some doesn't, or all matter has some experiences (a la
Chalmers).  I prefer the former view, although as Dave has demonstrated, a
good case can be made for the latter approach (I don't think it's right,
but it's still a good case).

If you want to avoid the pan-qualiaism of Chalmers, which I take it you
do (given your problems with sentient rocks), then it seems that you
(and other like-minded Functionalists) have to give a principled account
as to how qualia come about from functional complexity.  That is, you
have to explain why an extremely (functionally) complex entity like a 
hurricane does *not* have qualia, whereas a relatively simple entity 
like a slug does (this assumes that you would assert the truth of these
two propositions - if not, then we've got further to go! (Yes, I know
that some people may argue that the slug is more functionally complex than
the hurricane.  I take it that, at least for Cam, there would be *some*
physical phenomena which would be functionally more complex which he
would take not to be sentient.  Remember, I am not talking about the
*type* of complexity, merely, if you like, the number of possible states
the system can be in.)). 

It seems rather ad hoc to me to say that it is only the "right" kind of
complexity which produces qualia, the "right" functional arrangements.
Not only is it ad hoc, but there would be no way of establishing if
such a position were wrong, as long as we captured the functional
arrangements that produce *our* experiences!   That is, there would be
no way to determine if *other* functional arrangements, which we as
humans did *not* possess, *did* in fact produce qualia.  

So, one problem that I see with a non-pan-experientialist version of
Functionalism is that it must give a principled distinction between
functions that produce qualia and those that don't.  (I think 
that pan-experientialism has a similar problem in distinguishing
among functions that produce *different* qualia, but that point I'll
take up with Chalmers).

Another difficulty that, on reflection, I see with your position, Cam, is
a principled way of distinguishing between "environment" and "entity".
If we are only concerned with functional relations, then how do we
separate those functions which are "outside" of the "entity" and
those which are internal?  Inputs and outputs, if looked at purely
functionally, are merely more functional relations, which are connected
to other functional relations in the world.  Come to think of it, this 
is, I believe, a problem for functionalism in general.  (If this issue
has been dealt with before by someone, I would appreciate any references...).

- michael                                   


