From newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!olivea!uunet!van-bc!ubc-cs!unixg.ubc.ca!kakwa.ucs.ualberta.ca!access.usask.ca!ccu.umanitoba.ca!mona.muug.mb.ca!kenwolfe Wed Sep 16 21:23:24 EDT 1992
Article 6908 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!cs.utexas.edu!sun-barr!olivea!uunet!van-bc!ubc-cs!unixg.ubc.ca!kakwa.ucs.ualberta.ca!access.usask.ca!ccu.umanitoba.ca!mona.muug.mb.ca!kenwolfe
From: kenwolfe@muug.mb.ca (Ken Wolfe)
Newsgroups: comp.ai.philosophy
Subject: Re: Consciousness
Message-ID: <1992Sep12.054041.20257@muug.mb.ca>
Date: 12 Sep 92 05:40:41 GMT
References: <1992Sep6.010048.1@watt.ccs.tuns.ca> <18eh2uINNt6v@agate.berkeley.edu> <BuDr7y.1LA@usenet.ucs.indiana.edu>
Organization: Manitoba Unix User Group, Winnipeg, Manitoba, Canada
Lines: 76


>>It is widely accepted that it is conceivable to fool a person
>>into thinking he is in the real world by hooking him up to a
>>computer in the usual sf brain-in-a-vat scenario.  It is also
>>accepted, I think, that if a piece of the brain is sliced out and
>>replaced with a prosthesis that behaves *outwardly* in
>>exactly the same way, then the resulting "entity" will continue
>>claiming that nothing seems to have happened.  The new
>>entity, if it is an entity at all, will be fooled into thinking that
>>his inner experience is the same as it was before.
>>
>>So in this way the two thought experiments lead to results
>>which are indistinguishable internally from the "normal 
>>state of affairs", at least to the extent that there is an "inside".

>OK, the brain-in-a-vat will probably give the same conscious
>experiences if it's done right.  The prosthesis will certainly
>produce the same outward behaviour.  It's not so obvious
>that it will produce the same conscious experience, but I
>think it's quite likely that it would, as long as not too much
>of the brain is replaced at once.  If one replaced a big chunk,
>as e.g. replacing the whole brain with a behaviourally-equivalent
>lookup-table, then presumably consciousness would be affected
>radically.  If one replaces just a small bit by something that
>plays an identical functional role, it's very likely that
>experience would remain just the same (by the good old "fading
>qualia" and "dancing qualia" arguments).

>>Where they differ is in the question of whether there is an
>>objective change.  In the first case, where a brain is wired
>>into a computer, we have no problem accepting that
>>experience continues as before.  But when we imagine even a
>>tiny part of the brain being substituted by a prosthesis which
>>behaves outwardly in exactly the same fashion, we start to get
>>worried about the objective continuance of normal
>>experience.  The source of our worry is the notion that the
>>consciousness takes place in the brain, and therefore it takes
>>place in a certain region of the brain over a certain length of
>>time.  Cut away even a small part from this region, and
>>replace it with something which is internally different but
>>externally identical, and there is a feeling that the objective
>>experience has changed.

>By "objective change" I take it that you're referring to states
>of consciousness.  I can see why one would be worried that
>consciousness *might* change if you change too much of the
>brain, but your last sentence seems too strong to me: it doesn't
>seem at all obvious that the experience *would* change, just
>by e.g. replacing a few neurons with silicon, and in fact I
>think it's quite implausible.

>>But I would offer the counterargument that consciousness no
>>more needs the internal structure of a certain piece of brain
>>than it needs the environment to inherently be a certain way.
>>As long as the environment, or the piece of brain, looks the
>>same way to the rest of the brain, then experience is
>>preserved completely.

>I agree with this, subject to the proviso that only small
>parts of the brain are replaced at once, preserving overall
>functional organization.  If one is allowed to make big
>changes that radically alter the causal organization of
>the brain, then anything goes.
>Dave Chalmers                            (dave@cogsci.indiana.edu)      

I've been following this thread, and there is something I am wondering
about in your discussion of replacing sections of the brain with
"prosthetics".  You seem to suggest that if this is done all at once,
consciousness would be radically changed, but that if it were done gradually,
or rather one small part at a time, consciousness would be unaffected.
Would the end result of both of these hypothetical operations not be
exactly the same?  I am not clear on how the speed of these prosthetic
replacements, or their granularity if you prefer, would affect the final
state of consciousness.

Ken Wolfe