From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!bu2.bu.edu!bu.edu!m2c!nic.umass.edu!dime!orourke Mon Mar  9 18:35:01 EST 1992
Article 4244 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!rpi!bu2.bu.edu!bu.edu!m2c!nic.umass.edu!dime!orourke
From: orourke@unix1.cs.umass.edu (Joseph O'Rourke)
Newsgroups: comp.ai.philosophy
Subject: Re: Definition of understanding
Message-ID: <44302@dime.cs.umass.edu>
Date: 4 Mar 92 14:45:04 GMT
References: <1992Feb28.165550.13014@psych.toronto.edu> <1992Mar2.031342.27459@bronze.ucs.indiana.edu> <44170@dime.cs.umass.edu> <1992Mar3.000217.18401@bronze.ucs.indiana.edu>
Sender: news@dime.cs.umass.edu
Reply-To: orourke@sophia.smith.edu (Joseph O'Rourke)
Organization: Smith College, Northampton, MA, US
Lines: 47

In article <1992Mar3.000217.18401@bronze.ucs.indiana.edu> 
	chalmers@bronze.ucs.indiana.edu (David Chalmers) writes:
  >In article <44170@dime.cs.umass.edu> 
		orourke@sophia.smith.edu (Joseph O'Rourke) writes:
  >>...a précis
  >>of your fading qualia argument as applied to the system memorizer
  >>might help raise the level of discussion.

>OK.  To make things easier, let the memorizer be a tiny little demon
>who runs around inside a human head, taking inputs from sensory organs,
>performing her memorized computations on these, and stimulating
>motor movements in the appropriate way according to the result of
>the computations.
>
>To construct the continuum from a normal brain to a memorizer, under,
>as always, the assumption that the functionality of a single neuron is
>computable: [...remainder of description deleted]
>
>The usual fading qualia arguments apply: Did the person's qualia
>gradually fade as the neurons got replaced?  It seems implausible
>that a conscious being could be so wrong about its qualia.  Did
>the qualia suddenly disappear once a certain threshold was reached?
>That seems more implausible.  The remaining alternative is that
>the qualia stay around, but whereas they were once based in a
>causally-connected network of neurons, they're now based in an
>isomorphic causal network of computations that happens to lie inside
>the demon's brain.

	Thanks for posting this nth version of your fading qualia
argument.  I'm not sure I can raise the level of discussion myself,
but let me make two comments.

1. Qualia are slippery, and no one seems to understand them, as you
have often pointed out.  Resting an argument on intuition about
qualia is akin to building on sand, at least for me.  Perhaps because
you have pondered qualia so thoroughly, your intuitions are stronger.

2. It seems conceivable that qualia depend on the neurotransmitters
passed across synapses.  Maybe our feeling of redness is just how it
feels to have certain chemicals released by neurons in certain
brain areas.  I know little of neurobiology, but suppose for the
moment that this were true.
	Then it could be that the functionality of a single neuron
is indeed computable, and silicon-substitutable.  But as more and
more neuron interconnections are replaced, fewer and fewer
neurotransmitters would be around to be experienced as qualia.  Then
the person would indeed experience a fading of qualia.


