Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!ix.netcom.com!netcom.com!vlsi_lib
From: vlsi_lib@netcom.com (Gerard Malecki)
Subject: Re: Bag the Turing test (was: Penrose and
Message-ID: <vlsi_libD0F6sq.CBD@netcom.com>
Organization: VLSI Libraries Incorporated
References: <3c0vo1$si4@news1.shell> <3c2on7$geg@cantaloupe.srv.cs.cmu.edu>
Date: Wed, 7 Dec 1994 02:55:37 GMT
Lines: 47

In article <3c2on7$geg@cantaloupe.srv.cs.cmu.edu> hpm@cs.cmu.edu writes:
>
>>Hal Finney:
>>I still see a difference between the HLT and the wall.  Nobody ran an AI
>>simulation to create the wall; its existence does not constrain the
>>universe to contain a mind which has had certain thoughts.  All these
>>examples except the Turing test passers are like the wall.  A video game
>>character is not conscious.  Nobody feels pain when a Mortal Kombat
>>character gets his head ripped off.  Only the TT cases show us a mind.
>
>I really don't believe that.  I can sympathize with the pain of a
>character in a novel.  If there isn't something feeling pain, what am
>I sympathizing with?  Am I not sympathizing with a platonic entity
>feeling real pain?  The novel's characterization fuzzily defines this
>entity, and additional details about the character's life would
>sharpen the definition by chopping away at the fuzz of alternatives.

Sympathizing with a character in a novel or a movie doesn't lend 
consciousness to that character. The above argument is implausible,
to say the least. Imagine if it were true: I could just visualize
my enemies burning in hell, and lo, they would suffer!

Your sympathy or any other feeling towards any entity, real or 
imaginary, represents a state of your own consciousness. Nothing more,
nothing less.
>
>The real difference between the Turing test passer and the rock is
>that the TT passer defines a platonic mind very precisely to us, in
>our language, while the rock provides no help: to us it's all an
>undistinguished fuzz of possible alternative platonic entities,
>including a vast majority that are mindless.  In between, the
>characterization in the novel provides enough definition to eliminate
>the mindless alternatives.  Maybe enough, even, to compile the novel's
>character into an AI program that could pass a Turing test.  We do
>something very close to the latter, anyway, when we imagine or dream
>about interacting with a character we've read about.

Again, there is a question of algorithmic complexity. The amount of text
in a typical novel of about 1000 pages is insufficient to represent
an individual's complete personality. My guess is that a program compiled
from one might be about ten to 100 times as intelligent as Eliza. Novels
only reflect what the character in question did under a finite set of
circumstances. In general, one could not learn from the biography of a
real person his opinion on Scientology, unless the author specifically
mentions it.

Shankar Ramakrishnan
shankar@vlibs.com
