Newsgroups: comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!gatech!nntp.msstate.edu!olivea!news.hal.COM!decwrl!netcomsv!netcom.com!jqb
From: jqb@netcom.com (Jim Balter)
Subject: Re: Bag the Turing test (was: Penrose and Searle)
Message-ID: <jqbD0KMxu.Mw2@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <CzFr3J.990@cogsci.ed.ac.uk> <3c5nml$370@news1.shell> <3c68og$ql8@agate.berkeley.edu> <3c7dli$a5m@news1.shell>
Date: Sat, 10 Dec 1994 01:32:18 GMT
Lines: 53

In article <3c7dli$a5m@news1.shell>, Hal <hfinney@shell.portal.com> wrote:
>jerrybro@uclink2.berkeley.edu (Gerardo Browne) writes:
>
>>Hal (hfinney@shell.portal.com) wrote:
>
>>: I maintain that reasonable people will
>>: believe that such a beast is not conscious in the sense that you and I
>>: are, and that they are not necessarily confused or in error.
>
>>But why?  What is the basis for this judgement?  To me it seems at this
>>point a mere visceral discomfort.
>
>(I want to clarify that I did not mean that _all_ reasonable people would
>disbelieve in the conscious HLT, only that some would.)
>
>The HLT seems to have no representation of the richness of our internal
>mental life.  It is little more than a tape recording of all possible
>conversations.
>
>These internal mental states that are apparently missing from an HLT can
>be observed to some extent in the brain.  With electrical probes we can
>observe states of arousal, moments of decision, and other correlates of
>the subjective aspects of consciousness.  But I maintain that there is no
>way even in principle to observe these phenomena in the HLT, because they
>are not there.

If it is these phenomena that are essential to being conscious "in the sense
that you and I are" (I don't know, because no one seems to be willing to
say what aspects are essential; they just point to the whole and say "that's
it" and then claim that there's a fact of the matter as to whether some thing
or the other is "really" in that "natural category" or somesuch), then they
can be added simply by adding a bunch of intermediate states to the HLT:
entries that contain utterances or partial utterances but aren't final
output states; they just lead to states that are.  Perhaps we could throw
in some delay loops ("moments of decision").  Is that what's needed
to ascribe consciousness?  Would that satisfy *you*?  What would, short of
an exact replica of a human brain?
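The intermediate-state idea can be made concrete with a toy model.  This is
purely my own illustration (the table entries, names, and structure are
invented, not anything from the thread): a lookup table in which some
entries are non-final "partial utterance" states that merely chain to a
state that does emit output.

```python
# Toy HLT with intermediate states: each entry maps the conversation
# so far to (next_state_or_output, is_final).  Non-final entries are
# "partial utterances" that lead to a final output state -- the kind
# of internal structure suggested above.  All names are illustrative.

HLT = {
    ("hello",): ("pondering-1", False),            # intermediate state
    ("hello", "pondering-1"): ("hi there", True),  # final output state
}

def respond(utterance):
    """Walk the table through intermediate states until a final one."""
    key = (utterance,)
    while True:
        entry, is_final = HLT[key]
        if is_final:
            return entry
        # A "delay loop" / "moment of decision": no output yet,
        # just a transition to the next table entry.
        key = key + (entry,)

print(respond("hello"))
```

The point of the sketch is only that such intermediate states are trivial to
bolt onto a lookup table, which is why their absence seems like a weak
criterion for denying consciousness.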

>These missing internal mental states are what justify denying that
>the HLT has a mind.  The fundamental structural difference between the
>HLT and biological minds (which are the only things we really know to
>be conscious) gives reason to hesitate in extrapolating from our
>personal experiences of consciousness to the assumption that a
>recording of conversations could be conscious.

Yep.  "Humanism" all the way.  As long as consciousness is defined in terms
of artifacts of human mentation, "is it conscious" will be determined
by distance from human structure.  But is that really how we want to define
consciousness?


-- 
<J Q B>
