From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!think.com!ames!agate!ocf.berkeley.edu!jvsichi Wed Feb  5 11:56:45 EST 1992
Article 3452 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!cs.utexas.edu!qt.cs.utexas.edu!yale.edu!think.com!ames!agate!ocf.berkeley.edu!jvsichi
From: jvsichi@ocf.berkeley.edu (John Sichi)
Newsgroups: comp.ai.philosophy
Subject: Re: Multiple Personality Disorder and Strong AI
Date: 4 Feb 1992 06:48:30 GMT
Organization: U.C. Berkeley Open Computing Facility
Lines: 35
Sender: jvsichi@ocf.berkeley.edu
Message-ID: <kosdpuINN4e6@agate.berkeley.edu>
References: <kokp5aINNiuu@agate.berkeley.edu> <1992Feb4.035646.11687@cs.yale.edu>
NNTP-Posting-Host: sandstorm.berkeley.edu
Summary: Critique of responses
Keywords: consciousness,functionalism

Here's an analysis of the responses I've seen to the question I posed
regarding multiple consciousnesses arising from a single network:

Dave Chalmers (in message 4371) and Drew McDermott (in 4442) gave the tried
and true reductionist response:  consciousness is not an emergent property,
it's purely organizational (as expressed neatly by DC's house/brick analogy).
I certainly can't argue with this, since that point of view is completely
alien to me, and will remain so no matter how much of Hofstadter, Dennett,
Minsky, et al. I try to read.  It just doesn't jibe with my own experience of
consciousness (and, never having been implemented in silicon, I don't have
anything else to go on).  It's especially difficult to grasp this
approach since its explanation is generally accompanied by a lot of
hand-waving (and ginger ale bubbles), but little explicit demonstration of
how objective physical events correspond with such a non-physical, subjective
mode of being.  I'm not sure how one goes about bridging this gap--both
parties think their opponents are deluding themselves.

Matthew Wiener (in msg 4398) suggested that the presence of the I/O
generated by the (conceptually) separated node(s) would in some way
inhibit consciousness in the rest of the subnetwork.  I suppose this could
be true, but it seems like a very strong assumption, considering that the
number of connections of a single neuron in the brain is at least six orders
of magnitude smaller than the total number of neurons (sorry to keep
focusing on the brain, but once again, there's not much else to go on at
this point).  Such a network would certainly not be robust in its support
of consciousness.

Gordon Joly (in 4396) didn't realize I was being facetious in my title; the
multiple personalities were the absurda in my reductio.  In any case, I don't
think one can equate personality and consciousness; certainly not
the kind of "personality" seen in MPD, which is behavioral.  I apologize for
the confusion.

--A dualist until further notice,
--JVS