Newsgroups: talk.philosophy.misc,alt.philosophy.objectivism,comp.ai.philosophy,comp.ai,comp.ai.alife,udel.neuro,bionet.neuroscience
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!newsfeed.pitt.edu!gatech!newsfeed.internetmci.com!howland.reston.ans.net!ix.netcom.com!netcom.com!jqb
From: jqb@netcom.com (Jim Balter)
Subject: Re: Exact Duplicate?, also Dennet (was Re: Brain & Body)
Message-ID: <jqbDM0Dnx.56F@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <4cu3ql$mok@aladdin.iii.org.tw> <4diqul$20o@pheidippides.axion.bt.co.uk> <jqbDLr9LD.JoL@netcom.com> <4edioq$ef7@copland.udel.edu>
Date: Tue, 30 Jan 1996 19:19:09 GMT
Lines: 311
Sender: jqb@netcom19.netcom.com
Xref: glinda.oz.cs.cmu.edu comp.ai.philosophy:37245 comp.ai:36559 comp.ai.alife:5013

In article <4edioq$ef7@copland.udel.edu>,
Thomas R. Gregg <greggt@copland.udel.edu> wrote:
>In article <jqbDLr9LD.JoL@netcom.com>, Jim Balter <jqb@netcom.com> wrote:
>>>: See Daniel Dennett's "Consciousness Explained" for an extremely good argument
>>>(1) A person's brain is replaced, neuron by neuron, with electronic devices
>>>(with suitable electrochemical I/O), one for each neuron.   Would the
>>>resulting person possess the same identity?
>>>(2) A person's brain is copied, atom by atom.   His brain is then cut off
>>>where the brain stem is joined to the spinal column and replaced, by
>>>suitable microsurgery, with the exact replica.   Would the resulting person
>>>possess the same identity, or would the original (now disconnected, and
>>>probably dead) brain?
>>>Intuition suggests that identity would be preserved in case (1) but not
>>>case (2).   However, an external observer might not notice the difference
>>>in case (2), but would in case (1).   In both cases, the person would
>>>*think* their identity had survived, so asking them would be pointless.
>
>>*My* intuition suggests that your notion of identity is incoherent.
>
>How so?  What is unrealistic and/or self-contradictory ("incoherent")
>about thinking that each human brain has an identity?  In other words
>(IOW), each brain remembers the experiences that happened to that brain,
>and the experiences that happened to the body which is attached to that 
>brain. (Each body is separated from the outside environment by the body 
>surfaces, e.g., skin, membrane).

There is nothing incoherent in a notion of identity that equates it with the
information represented in a brain, but that is not how the word is normally
used, does not capture aspects of the word in use, and is not the same as the
intuitive notion of identity used above.  With the information ("remembers the
experiences") version, all these questions and paradoxes are readily answered.
People's "identities" are constantly changing as they accumulate experiences;
no one has quite the identity they had a moment ago.  Replacing a neuron
affects memory exactly to the degree that it affects the information held.
Copying a brain atom by atom creates two brains with identical "identities"
(the uniqueness of identity is what makes the notion incoherent, and is what
makes this "remembers the experiences" version not a suitable description of
the word as it is used).  Replacing a brain with a duplicate would leave the
person with the same "identity" iff the two brains were still
information-equivalent.  An intuition that sees (1) above as preserving
identity but (2) above as not preserving it is based on a notion of identity
that is inconsistent or malformed.  However, perhaps something else is hidden
in the claim that "However, an external observer might not notice the
difference in case (2), but would in case (1)".  What difference would be
noticed by virtue of replacing neurons with electronic devices?  If this
refers merely to the physical difference of having silicon/whatever in the
head, this would be observable to the individual too, and would thus give her
a markedly different set of experiences ("identity") from the original.  If
this difference is a difference in behavior, I don't see why we would expect
any.

>>There is so such thing as an identity, which is possessed.  The laws of
>>conservation of matter and energy do not apply to identities.
>>"Identity" is a human concept involving the notion of continuity.
>
>You seem to contradict yourself.  Does identity not ("sot"?) exist, or
>does it exist, though it is a human invention?

I said there is no such thing as "*an* identity".  Surely you can distinguish
between a thing and a concept of a thing?  Why create, via word games,
"apparent" contradictions where there are none?  I already said that this
concept of identity is incoherent, so by saying "is" I cannot be saying it
exists.  Suppose we use the word "fribble" to refer to primes greater than 2
that are successors of odd numbers.  There are no fribbles, but "fribble"
is a human concept.  Asking whether fribble exists is simply a misuse of the
word "exists".
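The emptiness of "fribble" can even be checked mechanically (a throwaway
sketch; the search bound is arbitrary).  Any prime greater than 2 is odd,
and any successor of an odd number is even, so the definition contradicts
itself:

```python
def is_prime(n):
    # Trial division -- fine for a toy search like this one.
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def is_fribble(n):
    # "Fribble": a prime greater than 2 that is the successor of an odd
    # number.  n = m + 1 with m odd means n is even, but primes > 2 are
    # odd, so no n can satisfy both conjuncts.
    return is_prime(n) and n > 2 and (n - 1) % 2 == 1

fribbles = [n for n in range(10_000) if is_fribble(n)]
print(fribbles)  # -> []
```

The predicate is perfectly well-defined as a concept; its extension just
happens to be empty, which is the point.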

>Perhaps your contradiction
>can be resolved in the following way, which incorporates developmental
>cognition: 
>
>A human, through experience, comes to think that s/he and other people
>each has a set of stable personality attributes, since s/he observes
>hirself and others acting in a predictable set of ways each day.  Later,
>this predictable set of responses (and the states of "mind" (brain)
>inferred to be present when those responses occur) comes to be known to
>that person as hir "identity".  This is a behaviorist model. 

But that's what people mean by "personality", not "identity".  If that were
"identity", then (2) above would have the same identity, and there would be no
teleporter identity paradoxes.  By attempting to "resolve" this "contradiction"
regarding existence, you seem to be insisting that, if people use a word,
there must be a coherent concept behind it.  The same argument has been made
here in re words like "consciousness".  But it just ain't so.

>>The Greeks played these games long ago.  If each plank of Theseus's ship is
>>replaced, one by one, is it still the same ship?  After you've thought about
>>it and debated it and come up with your answer, consider that the removed
>>planks have been reassembled as a separate ship.  Which is *really* Theseus's
>>ship?  
>
>Obviously, Theseus has entered a parallel universe.  ;) Seriously, this is
>a different case.  In the case of the ship I am talking about something
>that is outside the skin-boundary of my body.

Um, the discussion is about the meaning of "identity".  Not only personalities
are held to have identities.  It is silly to point out that there are
differences in an analogy when it is the similarities that are relevant.

>My brain (which contains my
>identity) is in my body.

My notebook contains Windows 95.  Is Windows 95 then my notebook's identity?
Can it be some other notebook's identity too?  Inherent to the notion of
identity is *uniqueness*.  What is it about the contents of your brain that is
*unique*, and that is different from the same sort of thing in an exact clone
of your brain?  *Those* are the issues at hand.

>The current argument has to do with whether
>human identity is contained in the body, the brain, or the mind.

Where is the mind?  Do cloned brains have the same mind, or different ones?
We can never answer these questions until we look at the structure of the
assumptions underlying these words like "mind" and "identity" that we bandy
about.

I argue that the notion of a human identity which has a location and thus acts
like a particle in classical physics is incoherent.  Certainly the set of
experiences of the brain and body do not satisfy this notion of locality.

>(I
>separate those three entities because some will claim that the brain is
>part of the body, and some will claim that the mind is connected with the
>brain.)
>
>In the case we are discussing, what would you say, JQB, if your whole
>brain and spinal cord (CNS) were suddenly to be removed and placed in,
>say, Rush Limbaugh's body, while his CNS were to be removed completely? 
>...  You would probably look down at yourself and think "gee, I look like
>Rush Limbaugh today.  I wonder why?"  Your identity would have been
>transplanted to Rush's body.
  
But this isn't the point of controversy, and thus isn't what is being
discussed.  If each of my neurons were replaced with an equivalent electronic
circuit, and the removed neurons were transplanted into Rush's body, then when the JQB body
awoke it would say "so, what's new?" and go on its merry way.  The question
is, which body contains my identity?  If identities are unique, then the
question is unanswerable because we haven't specified what we mean by identity
well enough to make the distinction.  Those who believe in some sort of
ethereal soul-identity that occupies the body might have a bit of trouble
locating it halfway through the operation.  If you want to say that identity
is stored experience, then both bodies have the same identity as long as you
restrict experience to that which is stored strictly in the brain; otherwise,
part of JQB's identity is in the JQB body and part is in the RL body, which of
course makes a mishmash of a concept of identity that isn't considered
divisible.  These are the real issues at hand.  

>If asked, this thing (which looks like Rush
>Limbaugh's body but has your CNS) would probably say that s/he is JQB,
>though somehow JQB's "identity", "mind", or "consciousness" has been
>transported to Rush's body.  "Rush", the newspapers would report, "is no
>longer doing his talk show.

The newspapers are thus *identifying* Rush by his body, rather than by his
behavior.  Which is the "correct" way to identify Rush?  The question is
confused; the notion of a unique identity is confused, insupportable,
incoherent.

>Rush appears to think his identity has been
>changed to that of JQB.

>Rush claims that he is therefore completely
>incompetent to do his talk show, since he cannot remember how to interview
>people in his unique style, and he now has liberal political views instead
>of conservative views.  Doctors are performing MRI scans on his brain on
>the off-chance that he is right and has not gone crazy." 

*Who* is making these claims?  *Who* is right?  *Who* hasn't gone crazy?  Does
this sack of bloated flesh have the identity "Rush Limbaugh" up until the MRI,
and thereafter has the identity "Jim Balter" (assuming, for sake of
discussion, that an MRI could distinguish)?  This isn't hard to deal with if
we take "identity" to just be a labeling process done by humans to help them
sort out their world.  My labeling a particular file "foo" doesn't make
"fooness" some unique attribute of that file; the label says nothing about
making copies of the file (which is "really" foo?), nothing about changes to
the file (is foo the same file it was yesterday?), and nothing about
transferring blocks of the file to another file and replacing them with other
blocks that serve the same purpose.
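The file analogy can be made concrete (a sketch; the table, names, and
contents are all illustrative).  A label is an entry in a table kept by the
labeler, not an attribute of the labeled bytes, so questions about copies
and changes are questions about the table, not the content:

```python
# A labeling table: the mapping is maintained by humans (or a filesystem);
# nothing in the bytes themselves records which label applies.
table = {"foo": bytearray(b"original contents")}

# Copying: which file is "really" foo?  The content doesn't say; only the
# labeling table does.
table["bar"] = bytearray(table["foo"])
assert table["foo"] == table["bar"]       # equal content...
assert table["foo"] is not table["bar"]   # ...distinct objects

# Changing: is foo the same file it was yesterday?  The label persists
# through the change, because labeling tracks continuity, not content.
table["foo"][0:8] = b"modified"
assert table["foo"] != table["bar"]
print(sorted(table))  # -> ['bar', 'foo']
```

Nothing paradoxical happens here, because nobody is tempted to treat
"fooness" as a conserved substance residing in the file.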

>This general kind of thing happens all the time in folktales (including
>non-Western ones)-- a person's identity (or sense of self-consciousness)
>is transplanted to another person or animal-- so I don't think that this way of
>thinking is an artifact of our "Western, scientific way" of thinking about
>the mind and the body.

No, it's a consequence of the human-universal illusion of self.

>>Even trying to give an answer to this is a result of an improper
>>reification of identity.
>><J Q B>
>
>"Identity" has been given reality by the people who talk (and write) about
>it.  We are not reifying the concept, we are merely talking about it 
>too.

Um, I think you should carefully consider what reification is.  If talking
about the existence of identity and transplantation of identities is not
reification, what would be?

reification: "The fallacy of taking abstractions and regarding them as
actually existing entities that are causally efficacious and ontologically
prior and superior to their referents."

Many of the debates here and in other philosophical arenas could be eliminated
with the elimination of reification.  But so it's been for a very long time.
Ah well.

>The "Theseus's ship" "paradox" can be resolved in the following ways:  1)
>both ships are Theseus's 2) neither is 3) each is partly his.

The paradox illustrates an incoherency in the notion of unique continuing
identity.  Neither (1) nor (3) is consistent with uniqueness (and (3) is
worded to imply that ownership is the issue, which it isn't).  (2) implies
that, when the first nail or rope was replaced on Theseus's ship, it was no
longer Theseus's ship, which is inconsistent with continuity.  This is no
resolution, it is mere avoidance.  The point of the paradox is that none of
these answers seems entirely right, none fits the intuitive notion of there
being such a thing as "Theseus's ship" (or, to avoid your confusion about
ownership, "the ship Theseus sailed from Thebes").  In order to resolve the
paradox, the notion of identity must yield.

>The
>difference between that argument and this one is that now we are talking
>about subjective viewpoints, not just material objects such as ships.

All the same points hold.  Are subjective viewpoints duplicable?
Can two entities have the same subjective viewpoints?  If not, yet the two
entities are functionally equivalent, what is it that's different?

>We
>have the idea that each person has only one viewpoint, one consciousness. 

This notion has been strongly criticised by Dennett and many others, including:

>One interesting relevant issue is split-brain research, in which each half
>of the split brain seems to take on a separate identity.

>Then there are 
>the folktales in which a person's identity can be in hir body at the same 
>time it is in the body of a bird or other animal.  

I think in most of these tales the body isn't conscious or the person is
aware of both bodies at once.  If the identity is in both bodies separately,
then "it" is having two separate sets of experiences and thus is, by your
definition, two different identities.

>Perhaps ESP exists? ;) In that case, maybe one part of a transplanted
>brain could communicate with the other part, and they would still share
>one identity.

You and I are communicating, but I don't think you would say we have the same
identity.  Why is identity unitary in one case but not the other?  Beware of
reification and incoherency.

>What if I were to transplant brain tissue from a fetal rat
>into an adult rat?  Would the adult rat's "inner child" reawaken ;)

Can't we even pretend to be slightly scientific, here?  To where would you
transplant the tissue?  What function would the tissue play in the adult?  If
the tissue carries information or memories, how are they represented and how
would they integrate into the adult?  You seem to be conflating newage (rhymes
with sewage) notions of "inner child" with childhood memories.  I see the
smile, but surely you are trying to say *something* here.  If you want to
induce childlike behavior, you might try messing with those structures that
develop into adulthood (damage to human frontal lobes can have this effect).

>How
>about from an ape to a human?  Would we suddenly have a "missing link"?  

Your ontology seems confused.

>Would this be proof to Peter Singer that animals DO possess language and 

Is language a possession?

>do have rights?

Rights are granted by society.  Animals have some.  What they *deserve* is
a normative question not subject to proof.

>My solution to this whole mess is that there is some brain area or areas
>that make a brain feel like a whole identity.  These areas are likely to
>be in the telencephalic cerebral cortex or telencephalic subcortical areas
>(and/or diencephalic thalamus), but not in the cortical language areas
>(evidence:  the split brain studies).

The hemisphere with language abilities professes to "feel like a whole
identity", so I don't see why you say that.  This certainly isn't contradicted
by the fact that the two hemispheres behave independently, any more than when
you and I "feel like a whole identity" but behave independently.  Reification
can have devastating effects upon one's logical coherency.

The problem posed by a concept of identity with uniqueness and locality is not
to be solved by examining the brain to find identity, except insofar as that
effort helps us understand that the problem is one of philosophical error.

>Computers could have consciousness
>in the future, since the mind and brain are aspects of physical reality,

Not for those to whom "consciousness" is some sort of ethereal essence
and isn't fully capturable by physical reality (no, I'm not one of those).

>and we could build a system similar to the brain (theoretically) but such
>computers are, say, 1,000,000 years too advanced for us. 

I think that is as absurd as saying we will have them in 10.


-- 
<J Q B>

