Newsgroups: comp.ai.alife,comp.ai.philosophy,comp.ai,alt.consciousness
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!usenet.ins.cwru.edu!news.ecn.bgu.edu!siemens!princeton!flagstaff.princeton.edu!schechtr
From: schechtr@flagstaff.princeton.edu (Joshua B. Schechter)
Subject: Re: Thought Question
Message-ID: <1995Jan14.225341.24541@Princeton.EDU>
Originator: news@hedgehog.Princeton.EDU
Sender: news@Princeton.EDU (USENET News System)
Nntp-Posting-Host: flagstaff.princeton.edu
Organization: Princeton University
References: <1995Jan12.184559.2530@galileo.cc.rochester.edu> <3f5nuu$mks@ixnews2.ix.netcom.com> <1995Jan14.043829.29350@galileo.cc.rochester.edu>
Date: Sat, 14 Jan 1995 22:53:41 GMT
Lines: 35
Xref: glinda.oz.cs.cmu.edu comp.ai.alife:1787 comp.ai.philosophy:24646 comp.ai:26484

In article <1995Jan14.043829.29350@galileo.cc.rochester.edu> stevens@prodigal.psych.rochester.edu (Greg Stevens) writes:
>Okay, you're missing the point again, I think.  Consider a machine which
>had no consciousness, but was programmed to behave EXACTLY as you do. No,
>we don't have the technology, and possibly there is not enough memory 
>capacity in the universe to do such a program without the kind of process
>that gives rise to consciousness, but this is a thought-experiment, right?
>
>I think what was being asked for us to consider was this: Consider a machine
>that was programmed to respond to stimuli the same as us, but had no
>consciousness.  There would be no evolutionary reason for it to be
>selected out, with us superior, if its behaviors were the same, and all
>it was lacking was subjectivity.  Thus, it seems that there is no
>evolutionary benefit to subjective experience per se.

I think a valid question is "Is such a mechanism possible?"  Thought
experiments are fine and dandy, but if the object of the experiment is
a logical impossibility (which I'm not necessarily saying this one is),
the experiment is useless.

Is it possible to have such a complex mechanism which is unconscious?
It seems (to me) that consciousness has to be part of the system somewhere...

But, to address the experiment: for argument's sake, taking the
possibility of such a mechanism for granted, wouldn't the program (a
variation on a humongous lookup table) have its "subjectivity" encoded
in the table itself?  It would be encoded there directly by its
programmer.  (This is a more advanced version of what would happen if a
chess master encoded every move he would make, in every possible
situation, in a program.  He would thereby encode his chess-playing
ability in the program.)  If it looks like a duck, and quacks like a
duck...
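To put the lookup-table picture in concrete (if anachronistic) terms, here is a minimal sketch in Python. The table entries and stimuli are hypothetical illustrations, not anything from the thought experiment itself; the point is just that every response is decided in advance by the programmer, so whatever "ability" the system displays lives in the table, not in any run-time reasoning:

```python
# A toy "giant lookup table" agent.  The programmer encodes, ahead of
# time, the response to every anticipated stimulus; at run time the
# program does nothing but look the answer up.
RESPONSES = {
    "1. e4": "1... c5",       # the programmer's chosen reply to 1.e4
    "1. d4": "1... Nf6",      # ... and to 1.d4
    "hello": "hi there",      # ... and to an ordinary greeting
}

def respond(stimulus):
    # Pure table lookup: no deliberation occurs here.  Everything was
    # settled when the programmer built the table.
    return RESPONSES.get(stimulus, "I have no entry for that.")

print(respond("1. e4"))    # -> 1... c5
print(respond("xyzzy"))    # -> I have no entry for that.
```

The table's author made every decision at build time, which is the sense in which his chess-playing ability (or, on the argument above, his "subjectivity") is encoded directly in the table.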

Then again, I am not sure exactly what is meant by "subjectivity".

Josh
