Article 3341 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rutgers!ukma!wupost!zaphod.mps.ohio-state.edu!cis.ohio-state.edu!cannelloni.cis.ohio-state.edu!chandra
From: chandra@cis.ohio-state.edu (B Chandrasekaran)
Newsgroups: comp.ai.philosophy
Subject: Thermostats, consciousness and derived intentionality
Message-ID: <1992Jan31.184254.9068@cis.ohio-state.edu>
Date: 31 Jan 92 18:42:54 GMT
Sender: news@cis.ohio-state.edu (NETnews)
Organization: The Ohio State University, Department of Computer and Information Science
Lines: 107
Originator: chandra@cannelloni.cis.ohio-state.edu

The example of thermostats has come up often in various related
threads: for example, David Chalmers and Cameron Shelly think
thermostats have *some* qualia.  I would like to argue that
information-processing accounts are often *stances* that we take
descriptively and that do not inhere in the situation.  This has
consequences for the argument that thermostats have qualia while,
say, mere rocks do not.

Consider the following scenario. In some forest there is an almost
enclosed cave with a small opening close to the floor, and there is
also a rock pretty much covering the hole.  All of this simply
occurred as a result of natural and random forces moving things
around.  I am a stone-age tribesman walking by; I squeeze myself into
the cave and notice that over a period of time the cave stays pretty
constant in temperature.  I make further investigations and
find that as the sun shines on the opening of the cave, the rock near
the opening expands and actually closes the opening, shutting out the
sun and the heat.  As the cave cools down, the rock shrinks and some
sun comes in.  This is scenario 1.
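The cave of scenario 1 behaves like a negative-feedback loop even
though nothing in it "represents" anything.  A toy simulation (mine,
not part of the original argument -- every name and constant in it is
invented for illustration) shows how the expanding rock alone holds
the temperature in a narrow band:

```python
# Illustrative sketch: the rock expands and closes the opening above
# expand_temp, blocking the sun's heat; otherwise the sun shines in.
# The cave always leaks some heat.  No controller, no representation.

def simulate_cave(steps=50, sun_heat=2.0, cooling=1.0,
                  expand_temp=20.0):
    """Return the cave's temperature history over `steps` time steps."""
    temp = 15.0
    history = []
    for _ in range(steps):
        opening_closed = temp >= expand_temp   # rock expands when warm
        if not opening_closed:
            temp += sun_heat                   # sun heats the cave
        temp -= cooling                        # cave always loses heat
        history.append(temp)
    return history

temps = simulate_cave()
# After a brief warm-up, temps settles into a band just below expand_temp.
```

The point of the sketch is only that the stabilizing behavior falls
out of the physics; the "thermostat" description is something we
bring to it.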

I go to another cave whose insides I happen to like better, but whose
temperature varies too much for comfort.  I now reorganize the rocks
in the second cave to mimic the way that the first cave's temperature
was held constant.  My buddies ask me what this arrangement is called,
and in a moment of inspiration, I say "Let us call it a thermostat."
This arrangement of rocks is scenario 2.

In late-night rap sessions, we stone-agers talk about consciousness and qualia.
Someone proposes that the thermostat has to have some qualia, since it
has some representation, is doing some information processing, has
goals and means, etc.  Someone else examines the arrangements of rocks
in the two caves, and finds no essential distinction.  "Either the
arrangements of rocks in both caves have similar qualia or neither of
them do" is the conclusion we cavemen finally come up with.  We just
choose to ignore the proposal by one of us that when a human
conceptualizes arrangements of rocks for specific goals, that
particular arrangement of rocks inherits some of the qualia of humans,
much as some latterday humans thought that human photographs inherited
some of the souls of their subjects :-).

It looks to me like we can distinguish three classes of situations:

i. Random arrangements of matter

ii. Arrangements of matter by an intentional entity for the purpose of
control or prediction (thermostats, calculators).  The representation
is in the "heads" of the intentional entity, and the entity merely
arranged the matter so as to "match" the requirements of the
representation.

iii.  Here is the tricky one.  Arrangements of biological control
systems, such as that of a frog.  Given a visual field with blue on
one side and green on the other, and a moving shadow up higher on the
green side, we notice that different regions of the neural matter
produce pulses which are connected to the motor system in a certain
way that makes the frog jump to the blue side.  We analysts might say
that the pulses "represent" water, land and predator, or, more
neutrally, blue, green and moving object, which information, we might
go on to say, is used by the frog's visual/motor system to send a
command to its muscles to jump in a certain direction.

With respect to i, none but the panpsychist (in the sense of Drew
McDermott's definition) would say that there are any qualia
whatsoever.  With respect to ii, I think the situation of "what it is
like to be" is identical to i, so I don't see how we can ascribe
*any* qualia to the rocks either.  With respect to iii, we need to
distinguish between two rather different senses of the word "symbol"
(or "representation") in talking about information processing in
biological systems -- perhaps this is what Mikhail Zeleny was trying
to get at recently in one of his postings, though I can't immediately
see a mapping between my distinction and his.
 
In one we have an intentional agent representing something with
something else for some purpose (as in "imagine this book is the
building and this pencil is me standing in front of the building").
When we as programmers write programs, we use this sense of
symbolizing to create streams of representations and methods of
manipulating them.  When the caveman in scenario 2 above built his
thermostat, he was engaging in this kind of representational behavior.
But the thermostat that he built is merely *being a bunch of rocks,
not being an information-processing agent*.  So the rocks are not
experiencing *what it is to be a thermostat*, the caveman is
experiencing *what it is to be one who thinks of a thermostat*.

Now consider the operation of the brain of the caveman who is thinking
of the thermostat.  Standing outside his brain but probing it, we may
be able to see pulses corresponding to the primal sketch, 2 1/2-d
sketch etc in his visual cortex, we may even see how his goal of
constructing a cool cave is represented as some representation in some
other part of the cortex, and how these representations are processed.
This information-processing view (and its representations) inheres in
the brain no more than the information-processing account that the
caveman constructed of how the cave stayed cold in scenario 1 above
inheres in the rocks.  Just as the representation of the rock as
something that sensed the heat, and of how that information is
involved in the control of the temperature, is a feature of the
deliberation of another intentional, representation-making and
-processing entity, so I claim that the information-processing account
that I have of your brain is an account in my representation-making
and -using mind.

In cognitive science and AI, these two notions of a symbol are munged
into just one.  I argue that computer science can get away with this,
since we have programmers who represent and who then organize matter
to mimic the representation.  In cognitive science and AI, however, we
need to make a distinction between the symbols that an intentional
agent A constructs and the symbols that we (other intentional agents)
construct in our description of the generation of A's behavior.


