Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!pipex!uunet!swissbank!root
From: gerryg@il.us.swissbank.com (Gerald Gleason)
Subject: Re: Thought Question
Message-ID: <1995Jan16.152734.12430@il.us.swissbank.com>
Sender: root@il.us.swissbank.com (Operator)
Nntp-Posting-Host: ch1d264nwk
Organization: Swiss Bank Corporation CM&T Division
References: <1995Jan14.043829.29350@galileo.cc.rochester.edu>
Date: Mon, 16 Jan 1995 15:27:34 GMT
Lines: 57

Greg Stevens writes:

> Okay, you're missing the point again, I think.  Consider a machine which
> had no consciousness, but was programmed to behave EXACTLY as you do.  No,
> we don't have the technology, and possibly there is not enough memory
> capacity in the universe to do such a program without the kind of
> process that gives rise to consciousness, but this is a thought-
> experiment, right?

> I think what was being asked for us to consider was this: Consider a
> machine that was programmed to respond to stimuli the same as us, but
> had no consciousness.  There would be no evolutionary reason for it to
> be selected out, with us superior, if its behaviors were the same, and
> all it was lacking was subjectivity.  Thus, it seems that there is no
> evolutionary benefit to subjective experience per se.

I don't think this is how most people are looking at natural selection and
evolution.  In the past, there was a lot of talk about fitness as a scale
from most fit down to least fit, and thus the idea of evolutionary
"progress".  It doesn't really work that way.  There are forms that are
successful, and that success is always relative to an environment filled
with other creatures, plants, microbes, etc.  Asking whether consciousness
contributes to life is like asking whether living contributes to life.  It
survives and is fit because it could get started and it continues to
survive.  Consciousness, like life, is a fact.  When reductionism is applied
ruthlessly, it leads to the absurd conclusion that these are not facts.
Most of us know better.

> Now, you may find that subjective experience is something that coincides
> with our physical reactions, but many people have questioned whether
> this is necessarily so -- i.e., the artificial intelligence simulation
> debates about whether a computer that ACTED conscious would BE
> conscious.  It is exactly that debate that is being addressed.  Your
> comments about the evolutionary disadvantages of being a leper are
> interesting, but not finally the point.  We are talking about something
> that can and does equally respond to environmental stimuli, but has no
> subjective experience or "qualia."  Such a thing, we have no advantage
> (evolutionarily) over.  Procreation and survival do not depend on mind,
> but on behavior.

First tell me what actions are evidence of consciousness.  I can think of
a number of human behaviors that I cannot imagine a computer simulation
exhibiting, and passing a Turing-Test-like inspection, without also being
conscious.  Of course, many of these behaviors would be hard to test for
this way.  For example, humans create societies.  A group of conscious
computer systems would be capable of creating an AI society, which
probably wouldn't be very understandable to most (if not all) humans.

Is it possible that consciousness is a type of behavior, nothing more,
nothing less?  The range of conscious behavior would be strongly limited by
the hardware and software of the machines producing this behavior (whether
human or digital machine).  Perhaps computers can never be "conscious like
we are", but can be "conscious like an AI" instead.

> Greg Stevens

Gerry Gleason
