Newsgroups: comp.ai.alife,comp.ai.philosophy,comp.ai,alt.consciousness
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel!news.mathworks.com!newshost.marcam.com!charnel.ecst.csuchico.edu!csusac!csus.edu!netcom.com!departed
From: departed@netcom.com (just passing through)
Subject: Re: Thought Question
Message-ID: <departedD4G07E.9Fq@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <vlsi_libD45qrE.zx@netcom.com> <3iehgk$jmu@news.u.washington.edu> <departedD4FA2s.M23@netcom.com> <3iggvt$r38@mp.cs.niu.edu>
Date: Thu, 23 Feb 1995 07:45:13 GMT
Lines: 87
Sender: departed@netcom20.netcom.com
Xref: glinda.oz.cs.cmu.edu comp.ai.alife:2550 comp.ai.philosophy:25712 comp.ai:27709

In article <3iggvt$r38@mp.cs.niu.edu>, Neil Rickert <rickert@cs.niu.edu> wrote:
>In <departedD4FA2s.M23@netcom.com> departed@netcom.com (just passing through) writes:
>>In article <3iehgk$jmu@news.u.washington.edu>,
>>Gary Forbis  <forbis@cac.washington.edu> wrote:
>>>In article <departedD4BCF2.6IF@netcom.com>, departed@netcom.com (just passing through) writes:
>
>>>|> I'd say an intuitive definition would be, "anything that can demonstrate
>>>|> subjectivity is conscious."
>
>>>I'm not sure how subjectivity can be demonstrated.  I pretty sure responsivity
>>>to internal states mapped (by an external agent) to subjective states can be
>>>demonstrated.  I'm not yet willing to accept hunger is any internal state
>>>which can be mapped to the subjective experience of hunger.
>
>>Hmm, so you're trying to say that subjectivity per se does not exist?
>
>I don't think Gary is saying that.  It is more likely that he is
>saying that subjectivity is, by definition, subjective and therefore
>not objectively demonstrable.

You may be right; as soon as we can nail it down, it's evidently
objective.  Nevertheless, there are certain traits we associate with
subjectivity.  When you say that someone is 'being subjective' you're
saying that they're relying on their own interpretation, speaking out
of their worldview -- in essence maintaining their own informational
realm, discrete from the world at large.  This is the vernacular, but
I think it points at what is meant by 'subjective'.

>>I would say that anything that maintains a flow of information decoupled
>>to some extent from the outside world is 'subjective' to that extent.
>
>Subjectivity arises from sensory deprivation?  I don't think so.
>Otherwise you would be conscious while asleep.  The Turing machine is
>decoupled from the outside world except when the input is written to
>its tape, and when the final output is read from its tape.  Does that
>make a Turing machine conscious?

Different kind of decoupling, sorry.  See above.  I'm talking about
decoupling more in terms of different levels of dealing with information,
rather than blocking input.

>>Hence, from the outside, if entity X were hungry and it said, "Let's
>>go to Burger King, it's cheap"  I would consider that a better indication
>>of subjectivity than its forcing its face into a pile of food.
>
>Entity A (= my automobile) from time to time flashes a red light on
>the dashboard which, in effect, says "let's go to the gas station
>and drink some gasoline."  Is it conscious?

Naw, your car isn't transforming the information in any interesting way.
The throughput is very direct -- gas low --> blinking light.  It is you
who is being conscious, in that via your internal realm you are translating
a blinking light into information about your car (a much more complicated
gestalt) and further transforming _that_ information (a desire not to
get stranded, some money on hand, and so on) into a plan to go to the
gas station.
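
The contrast can be sketched as a toy program (all names and thresholds
here are hypothetical, just to make the point concrete): the car's rule
is a single fixed mapping, while the driver runs the same one bit of
input through an internal model of money, risk, and plans.

```python
# Toy contrast, hypothetical names throughout: direct throughput
# versus a response mediated by an internal "world" of state.

def car_indicator(fuel_low: bool) -> str:
    # Direct throughput: one bit in, one fixed signal out.
    return "blink red light" if fuel_low else "no light"

def driver_plan(fuel_low: bool, money: float, stranded_fear: bool) -> str:
    # The same bit is transformed through internal state into a plan.
    if not fuel_low:
        return "keep driving"
    if money >= 20 and stranded_fear:
        return "detour to the gas station now"
    if money >= 20:
        return "fill up at the next convenient station"
    return "drive carefully and find an ATM first"

print(car_indicator(True))  # blink red light
print(driver_plan(True, money=30.0, stranded_fear=True))
print(driver_plan(True, money=5.0, stranded_fear=False))
```

The point of the sketch is only that the second function's output depends
on state the stimulus itself never mentions.
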

I think there's a lot more going on in consciousness than this kind of
script-following, but I think being able to take one bit of input and,
via a well-formed world, transform it into an elaborate construction is
pretty indicative of consciousness.

Thing is, any example I give can potentially be 'faked' arbitrarily well
by an unconscious system -- some program driven by scripts could handle
the above without difficulty.  (Which brings us back to the difficulty of
demonstrating subjectivity.)  I would suspect, though, that one of the
necessary qualities for subjectivity is a well-formed world, one
capable of almost any arbitrary transformation.

You could have a doll with a string attached so that when you yank it,
it emits the sounds, "I'm hungry, let's go to Burger King, it's cheap."
But over time, you begin to suspect that it's incapable of transforming
the string-yank input into anything different, and hence isn't conscious.

On the other hand, if your friend sometimes says, "I'm hungry, let's eat
at BK," sometimes says, "I'd rather be hungry for a while," and sometimes
says, "Let's eat later and catch the movie first," then you begin to
suspect that yes, he does have an inner life which is changing what hunger
means to him.
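
A minimal sketch of the two cases (the classes and their state variables
are invented for illustration): the doll's response is a constant, while
the friend's response to the identical stimulus varies with inner state.

```python
class Doll:
    # Fixed string-yank response: the input is never transformed.
    def pull_string(self) -> str:
        return "I'm hungry, let's go to Burger King, it's cheap."

class Friend:
    # Response to the same stimulus depends on changing inner state.
    def __init__(self, appetite: int, movie_starts_soon: bool):
        self.appetite = appetite
        self.movie_starts_soon = movie_starts_soon

    def on_hunger(self) -> str:
        if self.movie_starts_soon:
            return "Let's eat later and catch the movie first."
        if self.appetite > 7:
            return "I'm hungry, let's eat at BK."
        return "I'd rather be hungry for a while."

doll = Doll()
assert doll.pull_string() == doll.pull_string()  # always identical
print(Friend(appetite=9, movie_starts_soon=False).on_hunger())
print(Friend(appetite=3, movie_starts_soon=False).on_hunger())
print(Friend(appetite=9, movie_starts_soon=True).on_hunger())
```

Of course, the Friend class is itself just a script with more branches --
which is exactly the faking problem raised above.
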

One might conceive of machines which _will_ fail the Turing test, but only
after an arbitrarily long time -- consider a CD-ROM full of statements
and sentence fragments from which to compose possible answers.  It would
be rather easy to put together a program that passed the TT for at least
five seconds.
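
The CD-ROM idea can be caricatured in a few lines (the canned replies are
made up): a finite store of responses looks varied over a short quiz, but
any sufficiently long interrogation forces it to cycle.

```python
# Caricature of a canned-response machine: finite table, so it must
# eventually repeat, while the conversation can grow without bound.

CANNED = [
    "Interesting, tell me more.",
    "I hadn't thought of it that way.",
    "Why do you say that?",
]

def canned_reply(turn: int) -> str:
    # A finite store forces a cycle once turn >= len(CANNED).
    return CANNED[turn % len(CANNED)]

# The first few exchanges look varied...
for t in range(3):
    print(canned_reply(t))
# ...but a longer quiz exposes the repetition.
assert canned_reply(0) == canned_reply(3)
```

A real fake would be far cleverer, but the finiteness argument is the
same: it fails the test, just not in the first five seconds.
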

The only way to make sure that your intelligent machine is being conscious
would be to look inside its workings and see that they correspond to your
formal definition of consciousness.  Either that, or quiz it for an
arbitrarily long period of time.

-- Richard Wesson (departed@netcom.com)

