Newsgroups: comp.ai.alife,comp.ai.philosophy,comp.ai,alt.consciousness
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornell!travelers.mail.cornell.edu!news.kei.com!news.mathworks.com!news.duke.edu!convex!cs.utexas.edu!howland.reston.ans.net!ix.netcom.com!netcom.com!departed
From: departed@netcom.com (just passing through)
Subject: Re: Thought Question
Message-ID: <departedD4IzwM.665@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <vlsi_libD45qrE.zx@netcom.com> <3ijdad$4hu@mp.cs.niu.edu> <departedD4HoF4.Jnr@netcom.com> <3il380$as1@mp.cs.niu.edu>
Date: Fri, 24 Feb 1995 22:31:33 GMT
Lines: 92
Sender: departed@netcom13.netcom.com
Xref: glinda.oz.cs.cmu.edu comp.ai.alife:2581 comp.ai.philosophy:25749 comp.ai:27763

In article <3il380$as1@mp.cs.niu.edu>, Neil Rickert <rickert@cs.niu.edu> wrote:
>In <departedD4HoF4.Jnr@netcom.com> departed@netcom.com (just passing through) writes:
>>In article <3ijdad$4hu@mp.cs.niu.edu>, Neil Rickert <rickert@cs.niu.edu> wrote:
>>Nah.  (I knew somebody was going to pick up on 'interesting' as
>>anthropomorphic.)  What I mean by interesting way is able to translate
>>that information into a different domain.
>
>No, that is not what you mean at all.  After all, the blinking light
>takes information about the gas tank contents, and converts it to
>the domain of modulated light.

This is not a substantially different domain of information.  1 bit -> 1 bit.
As far as this system is concerned, what is the information domain of the
gas tank?  Higher than X or lower than X.  What is the domain of the light?
On, or off.  How is the mapping done?  Higher->off, lower->on.  Higher/lower
and on/off are very nearly the same domain.
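The whole mapping is small enough to write out in full.  A minimal sketch (the
threshold value and the function name are mine, purely illustrative):

```python
# The car's entire gas-tank "information space": one threshold comparison.
# LOW_FUEL_THRESHOLD is an arbitrary illustrative value, not a real spec.
LOW_FUEL_THRESHOLD = 0.125  # fraction of a full tank

def warning_light(fuel_level):
    """Map tank state to light state: lower -> on, higher -> off."""
    return fuel_level < LOW_FUEL_THRESHOLD  # True means the light is on

# One bit in, one bit out -- no richer remapping is possible here.
print(warning_light(0.05))  # nearly empty
print(warning_light(0.80))  # mostly full
```

However you vary the input, the only structure you can ever observe is this
single comparison -- which is the sense in which higher/lower and on/off are
essentially the same domain.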

>
>What you really mean is something that you -- a human -- will consider
>sufficiently different and complex to count.
>

No - it all counts -- some things just count very nearly not at all.

>>>>The throughput is very direct -- gas low --> blinking light.  It is you
>>>>who is being conscious,
>>>
>>>Of course, I agree.  That is why I introduced the blinking light.  I
>>>think it demonstrates that your proposed way of identifying
>>>consciousness doesn't work.
>
>>???  What's your point?  Does your car ever do anything about low gas
>>that indicates it's mapping that information through a complex space ??? 
>
>>Is the red light supposed to be a complex remapping since it 'means'
>>'low on gas, go to station'?  I think not; the information mapping
>>that your car is using (if examined) is dirt-simple.  Repeated experiments
>>won't show any other remapping for different inputs.  That's about all
>>you can do.
>
>Aha.  We are getting to the point.  It is complexity that counts.
>You understand the basic principles of the car, so it is not complex
>enough.  You don't understand the basic principles of the human, and
>therefore it is complex enough to make a human conscious.  If
>cognitive scientists ever work out how we work, then we will cease to
>be considered too complex and puff -- our consciousness will
>disappear in an instant.

No, I think the car's light/gas system IS 'conscious' -- to the very very
smallest possible degree.  One, quite uninteresting transformation.  Your
C-meter will barely budge.

It's certainly NOT conscious in the sense of "I'm thirsty, let's drink
some gas at a gas station" because it's not doing that kind of transformation.
Its information space is not complex enough to encompass 'thirsty'
'drink' or even 'fuel' -- i.e. it won't be able to perform any transformations
on any of these.  All it knows is high/low, which it CAN transform.  Very
slightly.  Hence, very slightly conscious.

I don't regard consciousness as a binary quality, and in fact if we wish to
study it, there are good reasons for hoping that it's not binary (i.e. just 
there or not there).  If it is [THERE] xor [NOT-THERE], then there's something 
funny going on with whatever engenders consciousness -- like the existence
of a soul, maybe.  (ech).

Why would I say that anything you can understand is not conscious?
I'm not being in the least mystical about it; I'd like to say that the
'complexity' and 'interestingness' of transformations through an
information space are objectively measurable.  There are some rough tests
one could suggest -- an interesting topic about which, unfortunately,
I've not thought much.

Like I said before (which you apparently neglected to read), different
processes will be more or less conscious.

[In a sense I would agree we are 'not conscious' -- that our consciousness
comes down to unconsciousness, ultimately.  Like a 'living' cell, which
is composed of 'dead' atoms.  I would hope so, or studying it shall be
very difficult.]

Anyhow, you appear to be defending a position that I probably mostly agree
with, against an attack that I'm not making.  Does this make it clearer?

-- Richard Wesson (departed@netcom.com)

Last point:  it's probably sort of a mistake to consider 'consciousness'
the property of an entity -- it should be considered the property of a
process.  This might clear up some confusion; i.e. is a human being
observed over a microsecond a 'conscious' entity?  You couldn't prove it.



