From newshub.ccs.yorku.ca!torn!cs.utexas.edu!wupost!spool.mu.edu!agate!sunkist.berkeley.edu!epfaith Wed Sep 16 21:21:56 EDT 1992
Article 6801 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!wupost!spool.mu.edu!agate!sunkist.berkeley.edu!epfaith
From: epfaith@sunkist.berkeley.edu (Edward Paul Faith)
Newsgroups: comp.ai.philosophy
Subject: Re: Consciousness
Date: 7 Sep 1992 05:00:09 GMT
Organization: U.C. Berkeley Math. Department.
Lines: 60
Message-ID: <18enkpINNqq@agate.berkeley.edu>
References: <1992Sep6.010048.1@watt.ccs.tuns.ca> <18eh2uINNt6v@agate.berkeley.edu>
NNTP-Posting-Host: sunkist.berkeley.edu

Subjective experience and its physical support.

In my previous article, I attempted to put into doubt the notion that
a specific subjective experience occurs entirely in the brain.  I
attacked the question in a certain way, by asking what physical
events were necessary in order for a specific experience to occur.
When we extend our hand and manipulate real objects, we also
extend the physical support for our experiences.  When we retract
our hand, or when we slice out a small portion of our brain, we
reduce the physical support.

I think that my conclusion has implications for the following
problem:

It is widely accepted that it is conceivable to fool a person
into thinking he is in the real world by hooking him up to a
computer in the usual sf brain-in-a-vat scenario.  It is also
accepted, I think, that if a piece of the brain is sliced out and
replaced with a prosthesis that behaves *outwardly* in
exactly the same way, then the resulting "entity" will continue
claiming that nothing seems to have happened.  The new
entity, if it is an entity at all, will be fooled into thinking that
his inner experience is the same as it was before.

So in this way the two thought experiments lead to results
which are indistinguishable internally from the "normal 
state of affairs", at least to the extent that there is an "inside".

Where they differ is in the question of whether there is an
objective change.  In the first case, where a brain is wired
into a computer, we have no problem accepting that
experience continues as before.  But when we imagine even a
tiny part of the brain being substituted by a prosthesis which
behaves outwardly in exactly the same fashion, we start to get
worried about the objective continuance of normal
experience.  The source of our worry is the notion that the
consciousness takes place in the brain, and therefore it takes
place in a certain region of the brain over a certain length of
time.  Cut away even a small part from this region, and
replace it with something which is internally different but
externally identical, and there is a feeling that the objective
experience has changed.

But I would offer the counterargument that consciousness no
more needs the internal structure of a certain piece of brain
than it needs the environment to inherently be a certain way.
As long as the environment, or the piece of brain, looks the
same way to the rest of the brain, then experience is
preserved completely.

In somewhat circular fashion, if this result is accepted, then my
previous article is rather easy to accept as well.  (Recall again
that I attempted to put into doubt the notion that a specific
subjective experience occurs entirely in the brain.)  For if we
agree that a sufficiently small piece of the brain can correctly
be seen as having the same relation to consciousness
as the environment, then it follows that consciousness no
more occurs in a region needing that piece of brain than
it occurs in a region needing the environment.
