From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael Tue Mar 24 09:55:30 EST 1992
Article 4441 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!psych.toronto.edu!michael
From: michael@psych.toronto.edu (Michael Gemar)
Subject: Re: Definition of understanding
Organization: Department of Psychology, University of Toronto
References: <1992Mar10.204754.1137@gpu.utcs.utoronto.ca> <1992Mar11.164816.18444@psych.toronto.edu> <1992Mar12.162059.28643@gpu.utcs.utoronto.ca>
Message-ID: <1992Mar13.000150.9603@psych.toronto.edu>
Keywords: meaning, understanding
Date: Fri, 13 Mar 1992 00:01:50 GMT

In article <1992Mar12.162059.28643@gpu.utcs.utoronto.ca> pindor@gpu.utcs.utoronto.ca (Andrzej Pindor) writes:
>In article <1992Mar11.164816.18444@psych.toronto.edu> michael@psych.toronto.edu (Michael Gemar) writes:
>>In article <1992Mar10.204754.1137@gpu.utcs.utoronto.ca> pindor@gpu.utcs.utoronto.ca (Andrzej Pindor) writes:
>>>>
>>>The problem, which I have tried to point out in the past, is in the content of
>>>the database for the Chinese squiggles. The English word 'hamburger' correlates
>>>in the English speaker's mind, for instance, with a mental picture of a
>>>hamburger - the person has seen a hamburger in the past and knows that this
>>>object is 'a hamburger'. If the database for the Chinese squiggles had a
>>>picture of a hamburger correlated with the corresponding squiggle (and the
>>>same for the other squiggles), would you still maintain that the person would
>>>not understand what he/she is doing?
>>
>>Yes, I would still maintain it, if the picture were encoded, say, as pictures
>>are on a laser disc (a perfectly reasonable coding scheme for a computer to
>>use).  Can *you* understand .GIF files by "looking" at them?
>>
>Aren't you getting confused? If this is the way the info for Chinese squiggles
>is coded, then there is no connection between the man's mind and the CR system
>whether part of his brain is used to house the database, the rule book and 
>'scraps of paper' or not. The man is just a part of the system and demanding
>that he understands is just as sensible as demanding that any single part 
>of the brain understands. 

To be honest, I *am* getting confused, as I thought the point you were making
was that the *computer* didn't have the appropriate information in its
database to understand what a hamburger was.  The point I was (perhaps
poorly) trying to make above is that pictures have no more *inherent*
meaning than any of the other symbols that the computer is shuffling.  All
of it just gets encoded as marks.
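The point can be made concrete with a toy sketch (Python, purely illustrative; the byte values are made up): the very same sequence of marks can be "decoded" as text or as image pixels, and nothing in the marks themselves picks out which interpretation is the right one.

```python
# Toy illustration (made-up bytes): one and the same sequence of marks
# can be read as characters or as RGB pixel triples. The marks carry no
# inherent meaning; the interpretation is imposed from outside.
data = bytes([72, 105, 33, 72, 105, 33])

as_text = data.decode("ascii")                 # read the marks as characters
as_pixels = [tuple(data[i:i + 3])              # read the same marks as RGB triples
             for i in range(0, len(data), 3)]

print(as_text)    # -> Hi!Hi!
print(as_pixels)  # -> [(72, 105, 33), (72, 105, 33)]
```

A .GIF file is the same story at scale: bytes that count as a picture only relative to a decoding scheme.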

As far as a connection between the man's mind and the CR system goes, I don't
see why the situation above changes things.

>>However, I don't think that this response addresses the problem, which is
>>*still* that syntax can't yield semantics.  Adding some kind of enriched
>>sensory signal will not solve this, as Searle argued in his response to the
>>"Robot Reply". 
>> 
>I find this all just talk as long as you don't tell me what this 'semantics'
>is that cannot arise from processing sensory input.

I would suggest that you re-read Searle's response to the Robot Reply for
the full account.  Briefly, the input signals you get from the sensors are
just like any other signals in the system, namely, just more marks being
shuffled.  They do not carry any inherent meaning with them, and manipulating
them is no different than manipulating the other marks in the system.   

>>>If you insist that the person has to `understand` what the squiggles represent,
>>>you have to provide him/her with the same info about the squiggles as he/she
>>>has about English words.
>>
>>I do not claim to have a complete answer as to how people attach meaning to
>>symbols.  *No one* claims to have solved this problem in a satisfying manner.
>>However, what we seem to have is a pretty good idea of how it *can't*
>>be solved, and one of these ways is by invoking pure syntax.  
>>
>You mean you are sure that it (semantics) cannot be produced by processing
>sensory input? If it can, isn't it then in the final analysis reducible to 'pure
>syntax'? If it cannot, then what do you suggest?

Well, *we* certainly obtain meaning from something that happens in our
brains to sensory input.  In that sense, meaning *does* arise, at least in
part, from sensory stimulation.  But this is not to say that whatever happens
in the brain is syntactic processing of symbols.  As I noted above, I
*don't* know how the brain produces meaning.  But I know it does, *and* I
know that one way it *doesn't* is by purely syntactic manipulations (if
it's any comfort, I also know it doesn't produce it by Searle's suggested
"causal powers," whatever the heck *those* might be).

Andrzej, I'll make the same suggestion to you that I've made to others.  I
think that we have come to loggerheads over foundational premises.  I believe
that syntax can't yield semantics.  You seem to believe otherwise.  If I
haven't convinced you that you're mistaken, and you haven't convinced me, I'm
not sure if it's worth pursuing this particular line of discussion without
some novel argument on one side or the other.  Why don't we just
leave things like this, and have a go at something else...

- michael
