Newsgroups: comp.ai.alife,comp.ai.philosophy,comp.ai,alt.consciousness
Path: cantaloupe.srv.cs.cmu.edu!fs7.ece.cmu.edu!hudson.lm.com!godot.cc.duq.edu!news.duke.edu!eff!usenet.ins.cwru.edu!howland.reston.ans.net!pipex!sunic!news.chalmers.se!news.gu.se!gd-news!d6242
From: sa209@utb.shv.hb.se (Claes Andersson)
Subject: Re: Thought Question
Message-ID: <1995Jan20.152545.26556@gdunix.gd.chalmers.se>
Sender: usenet@gdunix.gd.chalmers.se (USENET News System)
Nntp-Posting-Host: d6242.shv.hb.se
Organization: Dept. of economy and computer science.
X-Newsreader: News Xpress Version 1.0 Beta #2.1
References: <1995Jan12.184559.2530@galileo.cc.rochester.edu> <3f5nuu$mks@ixnews2.ix.netcom.com> <1995Jan14.043829.29350@galileo.cc.rochester.edu> <D2KE10.42o@gremlin.nrtc.northrop.com>
Date: Fri, 20 Jan 1995 21:42:27 GMT
Lines: 41
Xref: glinda.oz.cs.cmu.edu comp.ai.alife:1891 comp.ai.philosophy:24851 comp.ai:26678

mcohen@charming.nrtc.northrop.com (Martin Cohen) wrote:
>In article <1995Jan14.043829.29350@galileo.cc.rochester.edu> stevens@prodigal.psych.rochester.edu (Greg Stevens) writes:
>>
>>Okay, you're missing the point again, I think.  Consider a machine which
>>had no consciousness, but was programmed to behave EXACTLY as you do. No,
>>we don't have the technology, and possibly there is not enough memory
>>capacity in the universe to do such a program without the kind of process
>>that gives rise to consciousness, but this is a thought-experiment, right?
>>
>>I think what was being asked for us to consider was this: Consider a machine
>>that was programmed to respond to stimuli the same as us, but had no
>>consciousness.  There would be no evolutionary reason for it to be
>>selected out, with us superior, if its behaviors were the same, and all
>>it was lacking was subjectivity.  Thus, it seems that there is no
>>evolutionary benefit to subjective experience per se.
>>
>>Greg Stevens
>>
>>stevens@prodigal.psych.rochester.edu
>>
>
>But, I believe that if such a machine existed, it would HAVE
>to be conscious in order to respond the way I do. My responses
>are affected (and, perhaps, dominated) by the part of me
>(whatever it is) that is conscious.

I don't think so; it would have to act in an advanced, elaborate
way, but what says that it has to be aware of its own presence?
A machine with an AI can act in a much more life-like way than
any simple organic organism, yet no one thinks that the
machine is conscious.

When I say unconscious beings, I don't mean zombies walking
around without a plan. I mean lifeforms acting exactly like we
do, but like "machines", simply because none of them have that
strange sparkle of self-awareness. What is strange about it is
the gradual advantage of an increasing, qualitative self-
awareness.

Claes Andersson. University of Borås, Sweden

