Newsgroups: comp.ai,comp.ai.edu,comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!rochester!cornellcs!newsstand.cit.cornell.edu!news.kei.com!uhog.mit.edu!news!minsky
From: minsky@media.mit.edu (Marvin Minsky)
Subject: Re: Expert Systems, AI and Philosophy
Message-ID: <1995Dec5.173446.15149@media.mit.edu>
Sender: news@media.mit.edu (USENET News System)
Cc: minsky
Organization: MIT Media Laboratory
References: <49jj75$2g3@charm.magnus.acs.ohio-state.edu> <49siiv$asr@news.ox.ac.uk> <c57cb$f3421.34f@news.dma.be>
Date: Tue, 5 Dec 1995 17:34:46 GMT
Lines: 37
Xref: glinda.oz.cs.cmu.edu comp.ai:35151 comp.ai.edu:2993 comp.ai.philosophy:35483

In article <c57cb$f3421.34f@news.dma.be> MASTER TWIT <SCE@smiley.be> writes:
>Humans don't understand the symbols of a language either, but 
>they learn.
>Words are references to objects, actions, etc.

No.  This is where virtually all philosophers have screwed up.  Words
are involved with brain processes primarily (but not at all
exclusively) in the language areas of the brain--and they generally
refer to structures and activities in other parts of the brain.
These, in turn, may be related to external "real-world" events, but
not usually, and only indirectly at best.

>For a computer to understand words it must be able to know 
>what they are referring to.
>It has to know what to do with them.

Agreed.

>To be able to make up a sentence, it needs to know the 
>language rules.

I doubt this.  What it needs is to have processes that can convert
representations into the right serial forms.  These processes surely
do indeed involve knowing adequate rules, but I doubt that they will
much resemble the kinds of rules that contemporary language folks
still use.

>If it knows synonyms it can search for a specific meaning in 
>a text or for specific information.

Absolutely.  The speech production process often involves a lot of
reasoning, thinking, and testing.