From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!thunder.mcrcim.mcgill.edu!snorkelwacker.mit.edu!usc!wupost!think.com!mips!pacbell.com!iggy.GW.Vitalink.COM!psinntp!psinntp!scylla!daryl Tue Mar 24 09:58:03 EST 1992
Article 4666 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!thunder.mcrcim.mcgill.edu!snorkelwacker.mit.edu!usc!wupost!think.com!mips!pacbell.com!iggy.GW.Vitalink.COM!psinntp!psinntp!scylla!daryl
From: daryl@oracorp.com (Daryl McCullough)
Newsgroups: comp.ai.philosophy
Subject: Re: aliens eat fading qualia
Message-ID: <1992Mar23.145207.7892@oracorp.com>
Date: 23 Mar 92 14:52:07 GMT
Organization: ORA Corporation
Lines: 84

jeff@aiai.ed.ac.uk (Jeff Dalton) writes:

> One of the problems with the [fading qualia] argument is that all
> it shows is that brain simulations could be conscious. If that's
> the best anyone can do against Searle, then I'm not impressed.

If brain simulations are conscious, then Searle's arguments are wrong
(since the Chinese Room argument presumably applies even when the
rules are for a brain simulation). Chalmers's fading qualia argument
was only advanced to show that Searle is wrong, not to be impressive.

> Let's suppose there's an alien life form that eats brains but, to keep
> from being detected, it duplicates at the interface between it and the
> rest of the brain the physical effects of the neurons it has eaten...
> Maybe there's a duplicate of you (or the you-system, that is) inside
> this creature at the end. But what happens to the non-duplicate you
> in the meantime?

I don't believe that there is a meaningful distinction between "you"
and "a functionally equivalent duplicate of you". A human being is not
a particular collection of atoms--the atoms in your body are
continually being replaced by new atoms even without hypothesizing an
alien brain-eater. What makes a collection of atoms "you" is the
organization, which is preserved in spite of replacement of atoms. If
the alien brain-eater replaces the original matter by a functional
duplicate, that will no more be your death than the replacement of
your atoms through natural processes would be. Of course, after eating
your brain, the alien could decide to end the charade and quit
maintaining a functionally equivalent "you" inside him. That is not a
good situation to be in, regardless of how you feel about functionalism.

> Well, let's just run this by the fading qualia argument:
> 
>    Now of course the question Searle has to answer is what happens to
>    the consciousness along the way.  At one end, we have full
>    consciousness; at the other end, if we believe Searle, we have
>    none, but what of the intermediate states?  Searle has to accept
>    either (a) that consciousness suddenly blinks off at some stage; or
>    (b) that it gradually fades out, with states of semi-consciousness
>    along the way -- but with full functional equivalence.  For various
>    reasons I don't think that either of these are too plausible.  If
>    one doesn't accept the possibility of these two phenomena (suddenly
>    disappearing consciousness or fading consciousness, with full
>    functional equivalence), then we're led into a reductio of the
>    original assumption that one end of the spectrum isn't conscious.
> 

> So either (a) you're still conscious at the end (even though _you_
> have no brain left), or (b) fading or blinking out isn't so
> implausible after all, or (c) it's impossible to duplicate
> functionality at the interface.  (a) is false.  Both (b) and (c) are
> fatal to the fading qualia argument.

Jeff, (c) is *not* fatal to the fading qualia argument; it implies
that the fading qualia argument is *true*. We've had similar
discussions before. The conclusion of the fading qualia argument is in
the form of an implication: *If* the functional properties of the
neurons are duplicated, *then* the qualia are duplicated. In other
words, it is impossible to have a being that is functionally
equivalent to a human being that does not also possess qualia. If it
happens to be the case that only human beings are functionally
equivalent to human beings, then functionalism for qualia becomes
vacuously true.

And your argument about brain-eating aliens has not helped to make (b)
any more plausible. *How* can qualia fade and make no functional
difference? Introspection would tell you that if your qualia became
less intense, then you would notice it; it would affect your thoughts
(and very likely your behavior, as well).

That leaves (a). As implausible as it may seem to you, it seems the
best alternative to some of us.

> Of course, all sorts of people will write to say (a) is true.
> But think about it. You want the argument against Searle
> to _depend_ on this? Give me the systems reply any day.

It seems to me that (a) and the systems reply are the same. Any system
functionally equivalent to you will *be* you, and will have all the same
mental properties.

Daryl McCullough
ORA Corp.
Ithaca, NY