Article 3305 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!aunro!ukma!wupost!zaphod.mps.ohio-state.edu!sol.ctr.columbia.edu!bronze!chalmers
From: chalmers@bronze.ucs.indiana.edu (David Chalmers)
Newsgroups: comp.ai.philosophy
Subject: Re: Strong AI and Panpsychism
Message-ID: <1992Jan30.193547.7032@bronze.ucs.indiana.edu>
Date: 30 Jan 92 19:35:47 GMT
References: <1992Jan29.210141.26133@cs.yale.edu> <1992Jan29.214150.1709@bronze.ucs.indiana.edu> <1992Jan30.021105.1157@memstvx1.memst.edu>
Organization: Indiana University
Lines: 64

In article <1992Jan30.021105.1157@memstvx1.memst.edu> langston@memstvx1.memst.edu writes:

>  If, indeed, consciousness is over and above the physical properties, why is
>it the case that consciousness can be influenced and controlled by these same
>properties?

I don't know, but there's little doubt that consciousness is dependent on
the physical in a very direct way.

>Isn't it just as plausible to consider consciousness an
>emergent property brought about by the various physical components? 

This depends on what you mean by emergent.  In one sense, that's compatible
with what I'm saying.  But in the more common sense among AI people,
i.e. the sense of an "innocently emergent" property, or a systems-level
property, I can't see it.  Systems-level properties should follow from
lower-level properties, along with knowledge of the way in which they
work and so on, if one is careful enough about analyzing the system; there
shouldn't be any deep surprise there.  E.g. the ability of a neural
network to learn may seem in some ways to be an emergent property of
certain low-level rules, but with a close enough analysis it's
nevertheless quite predictable from those rules.  Whereas there's no sense
in which the existence of qualia seems to follow directly from the
physical structure and function of the brain.
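(To make the neural-network point concrete, here's a minimal sketch, not
from the original post: a perceptron whose ability to learn logical OR
"emerges" from a simple local weight-update rule, yet is entirely
predictable from that rule -- the convergence theorem guarantees it for
any linearly separable problem.  All names here are my own illustration.)

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Train weights and bias with the classic perceptron update rule."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Low-level rule: nudge each weight in proportion to the error.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Logical OR is linearly separable, so the "emergent" capacity to
# classify correctly follows provably from the update rule above.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
predictions = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
               for (x1, x2), _ in data]
print(predictions)  # matches the OR targets: [0, 1, 1, 1]
```

The system-level capacity (learning OR) is nowhere written into the
update rule, but a careful analysis of the rule predicts it exactly --
which is just the sense in which such emergence is "innocent".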

>  Why is it appropriate to discuss the 'consciousness' of a thermostat or a
>soda machine in materialistic terms, and not the human mind?

I'm no more materialist about thermostat qualia than I am about human
qualia.  But I take the existence of human qualia as a given, note that
qualia seem to depend on physical properties in a direct way, try on the
basis of various kinds of reasoning to constrain the kind of physical
properties on which qualia depend, come to the conclusion that the
relevant properties are information-processing properties, and cutting
a long story very short, it becomes quite natural to think that
thermostats have qualia too.

>  It just seems as though since it is very difficult to look at the 
>independent functions of mind, or even of brain, and point out consciousness,
>people are willing to say that consciousness must be something different by
>orders of magnitude.

This would be fine if by consciousness we just meant some sophisticated
cognitive capacity, e.g. the capacity to reflect on the contents of
one's thoughts, or to report one's mental states, or to voluntarily
control action.  In each of these cases consciousness could be analyzed
as a complex function, and I'd agree that the lack of localizability
shouldn't be taken to indicate a vast mystery.  But in each of these
cases, if one performed a sophisticated enough analysis of the functions
involved, then one would end up with an *explanation* of the phenomena
in question -- i.e. people are able to report their representational
contents because of mechanism X.  This would just be a standard sort
of cognitive-science explanation of cognitive capacities.  But the point
is that consciousness, in the sense it is most puzzling, isn't a
*capacity* to be explained at all.  It's more of a raw datum -- the
existence of subjective experience.  And it's very difficult to see
how any functional explanation, no matter how sophisticated, could
begin to explain why this experience exists.  It's simply the wrong
kind of explanation.

-- 
Dave Chalmers                            (dave@cogsci.indiana.edu)      
Center for Research on Concepts and Cognition, Indiana University.
"It is not the least charm of a theory that it is refutable."
