From newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!utgpu!pindor Tue Mar 24 09:54:41 EST 1992
Article 4379 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!ists!helios.physics.utoronto.ca!news-server.csri.toronto.edu!utgpu!pindor
From: pindor@gpu.utcs.utoronto.ca (Andrzej Pindor)
Subject: Re: Definition of Understanding
Message-ID: <1992Mar10.184546.24673@gpu.utcs.utoronto.ca>
Organization: UTCS Public Access
References: <8diu35K00WBNE3KbNS@andrew.cmu.edu>
Date: Tue, 10 Mar 1992 18:45:46 GMT

In article <8diu35K00WBNE3KbNS@andrew.cmu.edu> fb0m+@andrew.cmu.edu (Franklin Boyle) writes:
>Andrzej Pindor writes (in response to my post):
>
>>So humans are *non-physical* pattern matching systems? How do we do it?
>>The existing experimental (as opposed to speculative) evidence seems to
>>suggest that the information the brain has is a combination of high and low
>>voltages and perhaps arrangement of molecules (like parts of computer
>>memory). Do you have any other suggestions?
>
>No, I didn't say they were non-physical, just that the physical process
>of pattern matching which occurs in digital computers, for example, is
>not the physical way that humans recognize patterns.  *How* the voltage
>combinations are causal is different.
>
Very good! Then how do humans recognize patterns? Are you saying that the actual
_physical_ process is crucial to the functioning of the mind? How do you know
that? Neural nets are presumably closer to the way the brain functions; would
you accept that a neural-net computer might duplicate the mind? If I understand
them correctly, anti-AI crusaders wouldn't accept this either.
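To pin down what "pattern matching in a digital computer" means in this
exchange, here is a minimal sketch (the function name is my own, hypothetical)
of a matcher that succeeds only when a stored pattern structurally fits the
input, bit for bit:

```python
def structural_match(pattern: str, data: str) -> bool:
    """Return True when pattern occurs as an exact substring of data."""
    n, m = len(data), len(pattern)
    # The matcher is indifferent to what the bits *mean*; only the
    # structural fit between pattern and data decides the outcome.
    return any(data[i:i + m] == pattern for i in range(n - m + 1))

print(structural_match("1011", "0010110"))   # exact fit found -> True
print(structural_match("1111", "0010110"))   # no fit -> False
```

Whether this kind of purely structural fit is, or is not, what brains do is
exactly the question at issue.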
 
>>Very hard to say, in particular as we do not know how a mental image arises
>>in our mind. Or perhaps you know, and that is why you are so sure that the
>>hamburger image can not come to the "computer's mind" in a way analogous to
>>the way we experience it? Why don't you share this knowledge with us?
>
>Let's just say that, in a computer, I can have any structures I want
>for the pattern/matcher pairs, since their particular structures, because
>they physically fit, have nothing to do with the I/O behavior. Moreover,

I am not sure I understand you correctly. Why is it important whether the
structures for the pattern/matcher pairs have anything to do with the I/O
_behaviour_? And of course you exaggerate in saying 'any structures'. Only
structures which are rich (or flexible) enough to do a sufficiently
discriminating job will do. I agree that this presumably still leaves us a lot
of alternatives.

>different algorithms will produce the same I/O behavior.  Therefore, you have

So what? Different ways of analysing input may produce the same output in
humans as well, as when you have several different justifications for a
certain action; not an unusual occurrence!

>arbitrary physical structures as well as a large choice of process structures
>that yield the same behavior.  However, insofar as there is structural
>information about what a hamburger looks like, the actual physical structures
>in a computer can be anything, implying that there is not necessarily
>information about the appearance of hamburgers in the computer, or anything
>else, for that matter.  Yet the computer can behave as if it "knows" about
>hamburgers.
>
I do not understand your point at all. We can express the same things using
English, Chinese or a sign language, right? Does that mean no information is
exchanged when using these languages?

>>Why are you sure that a computer could not have the same *sensation* when
>>accessing a picture of hamburger in memory and when actually seeing one?
>>I am really puzzled!
>
>What does it mean for a computer to "actually see one"?  If the image from

A very good question! And what does it mean for *you* to "actually see a
hamburger"? Anything other than a pattern of high and low voltages entering
your brain from the optic nerves, causing a particular pattern of neuron
firings in the brain, leading perhaps to some molecular rearrangements (beyond
those involved in conducting electric signals along the brain's pathways)?
Maybe something more, but the present state of brain research says that's it
(if I am not up to date, someone please correct me).

>a camera is encoded as arbitrary bit strings (arbitrary because as long
>as there are matchers that physically fit it, it can be a bitmap, prop., etc.),
>then what is the computer "seeing"? 
>
See my comment above about different languages. We can encode the same
information in an arbitrary sequence of sounds, and for the information to be
exchanged we just need a matcher suited to the encoding used; do you agree?
So what is your argument about?

>>Is there a reason that they could not have? (if I understand you correctly,
>>you mean computer-based 'pattern matching systems', right?)
>
>Any pattern matching system, as long as the physical process is a structural
>fit between pattern and matcher.
>
>>Sorry, but I do not see these reasons. If this information is physically
>>encoded then *in principle* we can give it to CR and CR can process it. Of
>>course, you might escape into this 'non-physical' stuff, but unless you have
>>any evidence that such a thing exists, it is religion, not science.
>>BTW, I have nothing against religion (at least a personal one; organized is a
>>different thing), but it is good not to confuse the two.
>
>No religion here.
>
I am relieved. Discussion about religious views can only be concluded by 
resorting to a 'holy war' :-(.

>-Frank


-- 
Andrzej Pindor
University of Toronto
Computing Services
pindor@gpu.utcs.utoronto.ca
