Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!news.sei.cmu.edu!cis.ohio-state.edu!magnus.acs.ohio-state.edu!usenet.ins.cwru.edu!howland.reston.ans.net!agate!darkstar.UCSC.EDU!news.hal.COM!decwrl!netcomsv!netcom.com!vlsi_lib
From: vlsi_lib@netcom.com (Gerard Malecki)
Subject: Re: Bag the Turing test (was: Penrose and
Message-ID: <vlsi_libD0o672.69p@netcom.com>
Organization: VLSI Libraries Incorporated
References: <vlsi_libD0F6sq.CBD@netcom.com> <3c59b8$dd@cantaloupe.srv.cs.cmu.edu> <vlsi_libD0IGtI.J8o@netcom.com>
Date: Sun, 11 Dec 1994 23:21:02 GMT
Lines: 89

I am reposting this article, since my original posting might not have
reached a large number of sites due to some Usenet gridlock. My
apologies if it has already reached you.

Shankar Ramakrishnan 



In article <3c59b8$dd@cantaloupe.srv.cs.cmu.edu> hpm@cs.cmu.edu writes:
>
>Shankar Ramakrishnan:
>>Sympathizing with a character in a novel or a movie doesn't lend 
>>consciousness to that character. The above argument is implausible,
>>to say the least. Imagine if it were true. I would just visualize
>>my enemies burning in hell, and lo, they would suffer!
>
>No, the character you are imagining would suffer.  Your enemy is a
>different character, living a different story.
>
>>Your sympathy or any other feeling towards any entity, real or 
>>imaginary, represents a state of your own consciousness. Nothing more,
>>nothing less.
>
>And your consciousness is itself a reality, like a simulation.
>Characters in a simulation are as real as characters anywhere.

But that depends on how detailed the simulation is. Someone raised
the question of whether a Mortal Kombat character really feels pain
when killed. The answer is an emphatic no, since the character is
just a graphic image with very limited intelligence (encoded in the
software running the game). The central nervous system, which is
primarily responsible for consciousness in human beings, is several
orders of magnitude more complex than typical video game software.
The same holds when we visualize someone suffering: we simulate only
the external physical attributes of the character (crying, writhing
in pain, and so on). Our cerebral cortices are incapable of
simulating something as complex as the human central nervous system,
so the claim that the imagined character really feels pain is highly
debatable.
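
A back-of-envelope comparison makes the gap concrete. The synapse
count below is a standard textbook estimate; the figure for a
video-game character's state is my own rough guess, so treat this as
a sketch rather than a measurement:

import math

# Standard textbook estimate for the human central nervous system;
# the game-character figure is my own rough guess.
SYNAPSES_IN_HUMAN_CNS = 1e14   # on the order of 100 trillion synapses
GAME_CHARACTER_STATE = 1e4     # position, health, animation frame, ...

gap = math.log10(SYNAPSES_IN_HUMAN_CNS / GAME_CHARACTER_STATE)
print("roughly %d orders of magnitude apart" % gap)

Even with generous assumptions for the game, the two systems are
separated by around ten orders of magnitude.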
>
>>Again, there is a question of algorithmic complexity. The amount of text
>>in a typical novel of about 1000 pages is insufficient to represent
>>an individual's complete personality. My guess is that it may be about
>>ten to 100 times as intelligent as Eliza. Novels only reflect what the
>>character in question did under a finite set of circumstances. In 
>>general, one could not know from the biography of a real person his
>>opinion on scientology, unless the author specifically mentions it.
>
>If the novel describes the working of the character's consciousness,
>then it certainly is defining a conscious character.  The low
>information merely fails to precisely pick all the details of that
>character from a host of similar characters.  So you have a range of
>characters defined, all conscious.  Often you may be satisfied with
>the indistinctness.  If you need more details (say, you are making a
>movie from the book, or constructing a Turing test passer based on the
>character) then you can arbitrarily fill in the undefined details.
>There are things in signal recovery called maximum entropy methods
>that fill in unspecified details in a way that introduces the least
>amount of spurious detail.  You might improve the resolution of a
>character in a book the same way, assuming it is pretty average in all
>ways not specified.  So, you can read the novel's characterization as
>the deviations from default values in an average character, and thus a
>pretty full, complete, description.
>
>		-- Hans Moravec   CMU Robotics
>
>
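To make the fill-in idea concrete, here is a rough Python sketch of
treating a character as a set of deviations from an average person.
This is my own illustration, not Moravec's construction, and every
attribute name and default value here is invented:

# A character as a set of deviations laid over an "average person"
# of default attributes.  All names and values are invented.
AVERAGE_PERSON = {
    "height_cm": 170,
    "fears_heights": False,
    "opinion_on_scientology": "no strong opinion",
}

def resolve_character(deviations):
    """Fill unspecified traits with population defaults, adding no
    detail beyond what the text actually pins down."""
    character = dict(AVERAGE_PERSON)
    character.update(deviations)
    return character

# The novel specifies only one trait; everything else defaults.
hero = resolve_character({"fears_heights": True})
print(hero["opinion_on_scientology"])   # -> no strong opinion
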
This kind of averaging or smoothing out of the "unnecessary" details
essentially erases the very factors that distinguish one person's
personality from another's. Things that seem trivial, such as a
crack in one's favorite coffee mug, can define personality. Suppose
we have two individuals who are identical in all respects, except
that one remembers the crack while the other doesn't. Now suppose we
replace the cracked mug with another that is identical except for
the crack. The individual who is unaware of the crack would sip his
coffee merrily, while the other would be startled to find the crack
mysteriously gone. One can devise litmus tests like this that
differentiate personalities based on the most trivial experiences of
their past. Of course, the two might pass as each other in a Turing
test, but a Turing test would be my last choice for distinguishing
personalities.
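
A minimal sketch of such a litmus test, using the cracked-mug
scenario above as the trivial memory (the two-person model and the
probe are invented purely for illustration):

# Two personalities identical in every respect except one trivial
# memory; the scenario mirrors the cracked-mug example above.
person_a = {"remembers_crack": True}
person_b = {"remembers_crack": False}

def react_to_pristine_mug(person):
    """Hand each of them an identical mug without the crack."""
    if person["remembers_crack"]:
        return "startled: the crack is mysteriously gone"
    return "sips coffee merrily"

for name, person in (("A", person_a), ("B", person_b)):
    print(name, "->", react_to_pristine_mug(person))

A generic Turing-style interrogation would likely never stumble on
this question, yet the targeted probe separates the two at once.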

Shankar Ramakrishnan
shankar@vlibs.com
