From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!aunro!ukma!darwin.sura.net!europa.asd.contel.com!uunet!mcsun!uknet!edcastle!aiai!jeff Thu Feb 20 15:20:34 EST 1992
Article 3711 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.ecf!utgpu!csd.unb.ca!morgan.ucs.mun.ca!nstn.ns.ca!aunro!ukma!darwin.sura.net!europa.asd.contel.com!uunet!mcsun!uknet!edcastle!aiai!jeff
From: jeff@aiai.ed.ac.uk (Jeff Dalton)
Newsgroups: comp.ai.philosophy
Subject: Re: Intelligence Testing
Message-ID: <6184@skye.ed.ac.uk>
Date: 13 Feb 92 21:54:26 GMT
References: <1992Jan31.142711.17883@oracorp.com>
Sender: news@aiai.ed.ac.uk
Organization: AIAI, University of Edinburgh, Scotland
Lines: 161

In article <1992Jan31.142711.17883@oracorp.com> daryl@oracorp.com writes:
>Jeff Dalton writes (in response to Daryl McCullough):
>
>>>Whether or not this is a reasonable argument, it is certainly not
>>>anything like the reasoning *I* use. I came to the conclusion that
>>>other people were conscious long before I knew anything much about the
>>>brain's role in mental processes (I *still* don't know much about it).
>
>>I don't see anything wrong with improving the argument one uses.
>>Before I knew anything about brains, they had no part in it, of
>>course.
>
>I don't consider the argument from similarity of brains to be any
>improvement, since I had no doubts beforehand. What key fact about a
>brain do you think is of the most help in convincing you that a person
>is conscious? What brain deformity is so severe that you would doubt
>that a person with such a deformity could be conscious, in spite of
>the person's behaving normally?

"No doubts beforehand"?  So Searle and the problem of other minds and
all the rest come along, and you don't even think of reconsidering
your position?  "I had no doubt before, so why should any of these
arguments make any difference"?  Is that it?

The questions after that are almost beside the point.  The argument
is not "look! the brain has feature X; hence consciousness".  I don't
know nearly enough about the brain for that.  

>> So when other people behave like they're conscious (for instance),
>> I conclude they are.  But I'm quite willing to decide someone is
>> conscious before they've passed the Turing Test, or done much of
>> anything in the way of verbal behavior, and I rather suspect you
>> do as well.
>
>Jeff, the discussion is always about the *sufficiency* of the Turing
>Test, not its necessity. 

You seemed to be saying you relied entirely on behavior.  I don't
think that's so.  I think that you, like me, conclude people are
conscious before they pass any Turing Test.  What basis do you
use then?  It's certainly not the Turing Test.

>>>In a hypothetical situation where a frog (something I wouldn't
>>>normally consider intelligent) starts acting intelligently, I would
>>>eagerly adjust my opinion of the frog. 
>
>>That's interesting. A robot came up to me once in Harvard Square
>>and started talking to me. I didn't conclude it was intelligent.
>>I concluded that someone was controlling it, perhaps by radio.
>>I think I was right. Perhaps you'd say I was wrong.
>
>You didn't conclude that *it* was intelligent, but you concluded that
>you were communicating with an intelligent being!

I wouldn't conclude _the frog_ was intelligent.  Not without 
further investigation, at least.  That was my point, I think
I said it clearly, and it even looks like you understood me
("didn't conclude *it* was intelligent").  You, on the other hand,
said "I would eagerly adjust my opinion of the frog".

I'm sorry, but I happen to think my reaction is a more reasonable
one.  And I think you'd probably agree with me if you weren't so
concerned with justifying the Turing Test.  Indeed, you at least 
say (later on) "and I assure myself that it isn't a trick (a hidden
speaker, or ventriloquism)".

>I am claiming that facts about brains are irrelevant for why people
>believe that other beings are conscious, even though it may play a
>role in justifying those beliefs.

Well, _something_ has to be relevant, apart from passing the TT,
because people conclude that other people are conscious without
giving them Turing Tests.

>You're missing the point. Suppose that we meet a race of talking frogs
>with brains sufficiently different from ordinary frogs to allow them
>to be able to talk, but with brains still significantly different from
>humans. Would you doubt that they were conscious? 

It depends on what I knew about them.  If I knew that they had some
rules that they were following in order to produce answers, I might
wonder whether they actually understood English or whether they
were, in effect, using a sophisticated phrase book.

It depends, indeed, on what other explanations are available.
No speaker, no ventriloquism, no phrase book -- pretty soon there
are only a few explanations left; and then inference to the best
explanation might say: consciousness. 

You seem to think that all cases with the right behavior (and
no obvious tricks) have to be treated the same, so you think
you can substitute any example for any other.  But because I
don't agree that all cases with the right behavior are the
same, I may give different answers to different examples.
I consider much more than the Turing Test when answering
(unless, of course, nothing else is available).

>> A number of things we thought were true would have to be false.
      [for frogs to be able to engage us in conversation]
>> Evidently you have greater faith in the Turing Test than in those
>> other things. I would be more cautious.
>
>Why do you think that denying that a being is conscious is the
>*cautious* approach? Isn't it worse to treat a conscious being as
>unconscious, than vice-versa?

I think it would be incautious, unreasonable, and generally a
bad idea to think the Turing Test was more reliable than all
those other things, at least without some pretty strong additional
evidence.

>>>So, I actually *do* use the Turing Test in judging whether others are
>>>conscious, so it would take a pretty strong argument as to why I can't
>>>use it in judging a machine, or why a machine should be judged any
>>>differently than humans.
>
>>I suspect that if you use it at all you do so very seldom.  I
>>doubt that, before you conclude some random person on the street
>>is conscious, has intentionality, etc, you make sure they can
>>discourse on poetry, make sure they're not a table lookup machine
>>by asking them about current events, or try to see if they repeat
>>themselves too often.
>
>Do you routinely examine the brains of people on the street to
>determine whether their brains are sufficiently like yours? The above
>paragraph is silly, since it supports your brain analogy theory even
>less than the Turing Test theory.

My original point was physical similarity.  Brains are a part of
that.  And yes, given what I know of the present state of science,
the experience of doctors, etc, when I see a human being, I rather
do tend to conclude that they have a brain.  Do you seriously
want to suggest this is an unreasonable conclusion to draw?

>Of course, I can reason from past
>experience that most human beings have certain behavioral
>characteristics, so I don't need to check for those characteristics
>each time. However, I claim that I am reasoning from past experience
>of human behavior, not from past experience with human brain
>structure.

You have to identify them as human beings, and that's where physical
similarity comes in.

Moreover, we don't ever require that humans be able to pass difficult
Turing Tests (like being able to discourse interestingly on poetry)
before judging them conscious.

>I still think that your "reasoning by analogy" is fishy. What,
>precisely, do you look for in a person's brain to know whether it is
>similar enough to yours for you to consider the person conscious?

I have never in all our discussion, in news or e-mail, said there
was some precise thing about brains I looked for.  It would be a
lot easier for you if I had made everything depend on that, but
I didn't.  The only reason brains get a specific mention is that
changes to the brain do affect mental life, so it's clearly a 
relevant part of the body (or as clear as these things get in
medicine).

-- jd


