From newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!ames!elroy.jpl.nasa.gov!sdd.hp.com!ux1.cso.uiuc.edu!usenet.ucs.indiana.edu!bronze.ucs.indiana.edu!chalmers Wed Sep 16 21:22:47 EDT 1992
Article 6865 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!ames!elroy.jpl.nasa.gov!sdd.hp.com!ux1.cso.uiuc.edu!usenet.ucs.indiana.edu!bronze.ucs.indiana.edu!chalmers
From: chalmers@bronze.ucs.indiana.edu (David Chalmers)
Subject: Re: Consciousness
Message-ID: <BuDr7y.1LA@usenet.ucs.indiana.edu>
Sender: news@usenet.ucs.indiana.edu (USENET News System)
Nntp-Posting-Host: bronze.ucs.indiana.edu
Organization: Indiana University
References: <1992Sep6.010048.1@watt.ccs.tuns.ca> <18eh2uINNt6v@agate.berkeley.edu>
Date: Thu, 10 Sep 1992 20:50:21 GMT
Lines: 140


In a pair of interesting articles, Edward Paul Faith writes:

>Therefore, an experience that involves manipulation of the
>environment requires the environment itself, or rather that part of the
>environment which is involved.

Well, I don't think this is strictly true.  All that's needed
for the experience is the boundary conditions -- whatever's
happening at the border between "brain" and "environment".
If a different environment provides exactly the same
sensory stimuli, then the experience will be exactly the same.
Thus your common-or-garden brain-in-a-vat thought-experiment.
In fact, it's probably the case that less is required: e.g.,
one could probably get rid of actual eyes and ears entirely,
and provide conscious experiences indistinguishable from the
ones you're having now, by direct stimulation of the
visual and auditory cortices, etc.  (This basically amounts
to redrawing the position of the "boundary".)  So, experience
only really requires the brain.  In practice the environment
is essential, but only because that's by far the easiest way
to get the boundary conditions right.

>The same thing holds true for elements of the brain.  Certain parts of
>the brain are involved in subjective experience more than others.
>The removal of certain parts results in greatly reduced ability to
>experience (loss of sight, or even death).  Meanwhile, the removal
>of other parts has very little effect on subjective experience as
>reported by the subject.  I conjecture that the important parts form a
>system, all parts of which are in constant contact with each other.  I
>further conjecture that the parts of the brain which do not much
>affect subjective experience do not interact much with that system.

Maybe so: perhaps many parts of the brain are like the eyes and the
ears, in that they only play an indirect role in supporting conscious
experience.

>The result is that an experience does not require simply the brain to
>exist, but rather requires a combination of the brain plus part of the
>outside world in order to exist.

This doesn't seem to follow to me.  If you get the brain-state right,
you get the conscious experience right, and the way the environment
is is irrelevant.  Of course, the environment usually plays a very
important role in getting the brain-state to be the way it is, but
importantly, its role is "screened off" by the state of the brain --
if the same state had been caused by a different environment (e.g.
by hallucination, direct brain stimulation, etc), the conscious
experience would almost certainly have been the same.  Note
the asymmetry here: it's certainly *not* the case that if one had
the same environment, but a different brain state, then the
conscious experience would have been the same.
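To put the screening-off point in concrete (if crude) terms, here's a toy sketch -- my own illustration with made-up names, not anything from the original exchange:

```python
# Toy model of the "screening off" claim (all names here are my own
# inventions, purely illustrative).  Experience is computed from the
# brain state alone; the environment matters only by shaping that
# state, never directly.

def experience(brain_state):
    # A stand-in for "what it is like": a function of brain state only.
    return "experience of " + repr(brain_state)

# Same brain state, reached via two different environments (the real
# world vs. a vat feeding identical sensory input at the boundary):
state = ("visual cortex", "red patch")
exp_real_world = experience(state)
exp_in_vat = experience(state)
assert exp_real_world == exp_in_vat   # environment is screened off

# The asymmetry: same environment but different brain states gives
# no such guarantee that the experiences match.
exp_a = experience(("visual cortex", "red patch"))
exp_b = experience(("visual cortex", "green patch"))
assert exp_a != exp_b
```

The point is only the shape of the dependence: `experience` never takes the environment as an argument, so two environments producing the same boundary input are indistinguishable from the inside.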

>It is widely accepted that it is conceivable to fool a person
>into thinking he is in the real world by hooking him up to a
>computer in the usual sf brain-in-a-vat scenario.  It is also
>accepted, I think, that if a piece of the brain is sliced out and
>replaced with a prosthesis that behaves *outwardly* in
>exactly the same way, then the resulting "entity" will continue
>claiming that nothing seems to have happened.  The new
>entity, if it is an entity at all, will be fooled into thinking that
>his inner experience is the same as it was before.
>
>So in this way the two thought experiments lead to results
>which are indistinguishable internally from the "normal 
>state of affairs", at least to the extent that there is an "inside".

OK, the brain-in-a-vat will probably give the same conscious
experiences if it's done right.  The prosthesis will certainly
produce the same outward behaviour.  It's not so obvious
that it will produce the same conscious experience, but I
think it's quite likely that it would, as long as not too much
of the brain is replaced at once.  If one replaced a big chunk,
e.g. by replacing the whole brain with a behaviourally-equivalent
lookup-table, then presumably consciousness would be affected
radically.  If one replaces just a small bit by something that
plays an identical functional role, it's very likely that
experience would remain just the same (by the good old "fading
qualia" and "dancing qualia" arguments).
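As a toy example of what "behaviourally equivalent but internally different" amounts to (again my own illustration, not from the post):

```python
# Two components with identical outward behaviour but different
# internal organization: one computes its answer, the other merely
# looks it up.

def adder_functional(x, y):
    # Does the work: a causal process that produces the sum.
    return x + y

# The same input/output behaviour frozen into a lookup table
# (restricted to single digits here so the table stays small).
ADD_TABLE = {(x, y): x + y for x in range(10) for y in range(10)}

def adder_lookup(x, y):
    return ADD_TABLE[(x, y)]

# Outwardly indistinguishable over their shared domain:
assert all(adder_functional(x, y) == adder_lookup(x, y)
           for x in range(10) for y in range(10))
```

Scaled up to a whole brain the table would be astronomically large, and on the view above, swapping the computing version for the table version would plausibly alter consciousness radically, even though no behavioural test could tell the two apart.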

>Where they differ is in the question of whether there is an
>objective change.  In the first case, where a brain is wired
>into a computer, we have no problem accepting that
>experience continues as before.  But when we imagine even a
>tiny part of the brain being substituted by a prosthesis which
>behaves outwardly in exactly the same fashion, we start to get
>worried about the objective continuance of normal
>experience.  The source of our worry is the notion that the
>consciousness takes place in the brain, and therefore it takes
>place in a certain region of the brain over a certain length of
>time.  Cut away even a small part from this region, and
>replace it with something which is internally different but
>externally identical, and there is a feeling that the objective
>experience has changed.

By "objective change" I take it that you're referring to states
of consciousness.  I can see why one would be worried that
consciousness *might* change if you change too much of the
brain, but your last sentence seems too strong to me: it doesn't
seem at all obvious that the experience *would* change, just
by e.g. replacing a few neurons with silicon, and in fact I
think it's quite implausible.

>But I would offer the counterargument that consciousness no
>more needs the internal structure of a certain piece of brain
>than it needs the environment to inherently be a certain way.
>As long as the environment, or the piece of brain, looks the
>same way to the rest of the brain, then experience is
>preserved completely.

I agree with this, subject to the proviso that only small
parts of the brain are replaced at once, preserving overall
functional organization.  If one is allowed to make big
changes that radically alter the causal organization of
the brain, then anything goes.

>In circular fashion, if this result is accepted, then my previous
>article is rather easy to accept as well.  (Recall again that 
>I attempted to put into doubt the notion that a specific 
>subjective experience occurs entirely in the brain.)  For if we
>agree that a sufficiently small piece of the brain can be seen
>correctly as having the same relation to consciousness
>as the environment, then it follows that consciousness no 
>more occurs in a region needing that piece of brain, than 
>it occurs in a region needing the environment.

Again, this doesn't follow.  All that follows is that what's
essential to consciousness is not the specific bits and
pieces that make up the brain, but the overall functional
organization.  Consciousness still supervenes on brain state,
in the sense that two identical brains will give rise to
identical conscious experiences, in a way that it doesn't
supervene on environmental states.  So one has a kind of
within-brain holism, but you can't extend this to a wider
environment-involving holism.
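The supervenience claim can be put in symbols; a rough sketch in my own notation (not the post's), where C(b, e) is the experience arising from brain state b in environment e:

```latex
% Experience supervenes on brain state: fixing b fixes the
% experience, whatever the environment.
\forall b \,\forall e_1, e_2 :\quad C(b, e_1) = C(b, e_2)
% But it does not supervene on the environment: fixing e leaves
% the experience open.
\exists b_1, b_2 \,\exists e :\quad C(b_1, e) \neq C(b_2, e)
```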

-- 
Dave Chalmers                            (dave@cogsci.indiana.edu)      
Center for Research on Concepts and Cognition, Indiana University.
"It is not the least charm of a theory that it is refutable."


