From newshub.ccs.yorku.ca!torn!utcsri!rpi!usc!sdd.hp.com!ux1.cso.uiuc.edu!mp.cs.niu.edu!rickert Wed Aug 12 16:52:03 EDT 1992
Article 6540 of comp.ai.philosophy:
Newsgroups: comp.ai.philosophy
Path: newshub.ccs.yorku.ca!torn!utcsri!rpi!usc!sdd.hp.com!ux1.cso.uiuc.edu!mp.cs.niu.edu!rickert
From: rickert@mp.cs.niu.edu (Neil Rickert)
Subject: Re: Defining Intelligence
Message-ID: <1992Aug1.120921.11694@mp.cs.niu.edu>
Organization: Northern Illinois University
References: <1992Jul24.023513.25326@mp.cs.niu.edu> <1992Aug1.043453.6538@ntuix.ntu.ac.sg>
Date: Sat, 1 Aug 1992 12:09:21 GMT
Lines: 29

In article <1992Aug1.043453.6538@ntuix.ntu.ac.sg> eoahmad@ntuix.ntu.ac.sg (Othman Ahmad) writes:

>The retrieval system has similarities to human memory in that it stores and
>retrieves information. Information theory is very exact because it does not
>specify what the information looks like; it only defines it as a pattern store.
>What the pattern looks like is immaterial. It is an abstract concept but very
>exact. I'm not sure if you understand it, but I would certainly like to know
>why you do not understand.

 and

>The example is the problem of flight:
>
>The first problem is defining flight. Is a kite a flying machine? A glider?

  I'm not quite sure what your point is.

  Perhaps you are saying that in my comments I am just quibbling, the
equivalent of saying that planes don't fly.

  If that is your point, it is certainly not mine.  That sort of argument
gets you nowhere.

  Memory and learning are tremendously important.  If you get them wrong,
you won't get the right type of intelligence.  What my comments in this
thread have been trying to say is that if you base your theory on a strict
store/retrieve model of memory, you will get it wrong.  To use your
analogy, it would be like basing a theory of flight on the flapping of
wings.
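  For concreteness, the "strict store/retrieve model" under criticism can
be reduced to something like the following sketch (Python; the names here
are hypothetical, not from anyone's actual proposal).  Its defining
property is that whatever is stored comes back verbatim:

```python
# A minimal sketch of a strict store/retrieve model of memory:
# a pattern store with exact, literal recall.  Hypothetical names,
# for illustration only.

class StoreRetrieveMemory:
    """Memory as a literal pattern store: what goes in comes out unchanged."""

    def __init__(self):
        self._patterns = {}

    def store(self, key, pattern):
        # Storage is literal: the pattern is kept exactly as given.
        self._patterns[key] = pattern

    def retrieve(self, key):
        # Retrieval is exact reproduction -- no reconstruction, no
        # generalization, no influence from other stored patterns.
        return self._patterns.get(key)


m = StoreRetrieveMemory()
m.store("bird", "has wings, flies")
print(m.retrieve("bird"))   # exact copy back: "has wings, flies"
print(m.retrieve("kiwi"))   # nothing stored -> nothing retrieved (None)
```

  The point of the analogy is that this exactness is precisely what human
memory lacks: recall is reconstructive and shaped by everything else the
system knows, so a theory built on literal store/retrieve misses the
phenomenon, much as a theory of flight built on wing-flapping would.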
