From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!uunet!mcsun!uknet!edcastle!aiai!jeff Tue Nov 26 12:30:41 EST 1991
Article 1413 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:1413 rec.arts.books:10166 sci.philosophy.tech:1005
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!bonnie.concordia.ca!uunet!mcsun!uknet!edcastle!aiai!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Newsgroups: comp.ai.philosophy,rec.arts.books,sci.philosophy.tech
Subject: Re: Daniel Dennett
Keywords: Godel, Turing, Dennett, Charlatan
Message-ID: <5657@skye.ed.ac.uk>
Date: 19 Nov 91 19:30:52 GMT
References: <11779@star.cs.vu.nl> <11785@star.cs.vu.nl>
Reply-To: jeff@aiai.UUCP (Jeff Dalton)
Organization: AIAI, University of Edinburgh, Scotland
Lines: 64

In article <11785@star.cs.vu.nl> peter@cs.vu.nl (Grunwald PD) writes:
>> Appeals to eminent authority fail to impress me. Defend the Turing Thesis.

[Was corrected to: Turing Test.]

>Why are you convinced that you are not the only conscious being in
>the world?

Not solely on the basis that other humans pass the Turing Test.

>The only evidence you have is the one you get by observing other people:

And from observing myself.

>they behave in a way that absolutely convinces you of them being
>conscious of them-selves, being aware of the world.

It doesn't _absolutely_ convince me.

>So if someday a computer would be just as convincing: what makes you 
>conclude that it is an imitation? that it is not conscious?

Just as convincing is not the same as passing the Turing Test.

One of the big problems with these Searle, TT, etc discussions is
that they go over the same ground again and again and again.  The
same behaviorist point you're making has been made many times.
Why are some people not convinced?  Because we've never heard
the argument?  [no]  Because we think we're the only conscious
entities?  [no]  Because we're dualists, or for some other reason
think mind can't be material in nature?  [sometimes; not always]

For me, the answer goes like this.  I'm conscious, and I have
good reasons to believe that other people work more or less in
the same way I do.  So when they seem to be conscious, I assume
they are.  But I don't think it's impossible to be fooled.
I don't know enough about consciousness, or about what the
possible "tricks" are to rule that out.  And neither, so far
as I can tell, does anyone else.

If a computer came along that could pass the Turing Test,
and do other interesting things, then I'd want to know how 
it worked.  I'd want to decide whether it was a clever trick,
or whether it wasn't, or whether I couldn't tell.

It's not clear right now what criteria I'd use to decide this.
But then there aren't any Turing-capable machines either.  When
there are such machines, when we know how they work, and when we
know more about how consciousness works in humans, we might
well have learned something that would let us distinguish real
consciousness from tricks.

The behaviorists want to say this is just impossible on principle.
The only good argument they have is that everything I'd use to
decide whether something was real consciousness or a trick would
have to be something "observable".  Exactly what will eventually
be observable is not clear.  Maybe we'll develop telepathy and
have good reasons for regarding it as a test of consciousness.
But even if we leave that out, even if we stick to what's 
observable right now, "everything observable" is not the Turing
Test.  It's much more like Steve Harnad's (hope I've spelled this
right) Total Turing Test.

-- jd


