Newsgroups: comp.ai,comp.ai.edu,comp.ai.philosophy
Path: cantaloupe.srv.cs.cmu.edu!bb3.andrew.cmu.edu!nntp.sei.cmu.edu!news.psc.edu!hudson.lm.com!news.math.psu.edu!chi-news.cic.net!usc!howland.reston.ans.net!nntp.coast.net!zombie.ncsc.mil!news.mathworks.com!uunet!in2.uu.net!EU.net!peer-news.britain.eu.net!bcc.ac.uk!zcacsst
From: zcacsst@cs.ucl.ac.uk (Simone Stumpf)
Subject: Re: Expert Systems, AI and Philosophy
Message-ID: <zcacsst.818159367@cs.ucl.ac.uk>
Date: Tue, 5 Dec 1995 10:29:27 GMT
Distribution: inet
References: <498thr$jit@charm.magnus.acs.ohio-state.edu> <Pine.SOL.3.91.951127235109.14424D-100000@amor.rz.hu-berlin.de> <199511292203.RAA02021@syr.edu> <Pine.SOL.3.91.951205002441.14631B-100000@amor.rz.hu-berlin.de>
Organization: University College London
Lines: 62
Xref: glinda.oz.cs.cmu.edu comp.ai:35140 comp.ai.edu:2988 comp.ai.philosophy:35472

Oliver Seidel <h0444xmd@rz.hu-berlin.de> writes:


[some lines snipped]
>about "measuring" real understanding we can apply it to the turing test.
>second, maybe you know the "church-turing" thesis, which turns your 
>question around: "a turing machine can compute everything any kind of 
>machine can do - with respect to the physical laws governing all machines; 
>the human brain is also such a machine and thus can be computed by 
>the theoretical turing machine." you see, "attaining real understanding" 
>is not necessary for this thesis, but i think it's still an interesting 
>question.

Quick interjection: some problems are not Turing-machine computable --
the Post Correspondence Problem, for instance, is undecidable, so no
single algorithm solves every instance -- yet humans do solve particular
instances of it. How do you explain that with the above thesis?
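To make the interjection concrete: a PCP instance is a set of "dominoes"
(top string, bottom string), and the question is whether some sequence of
dominoes makes the concatenated tops equal the concatenated bottoms. Any
particular instance can be attacked by bounded search; undecidability only
means no algorithm handles *every* instance with unbounded sequence length.
A minimal sketch (the instance and the bound are illustrative):

```python
from itertools import product

# A small PCP instance: each pair is (top string, bottom string).
dominoes = [("a", "baa"), ("ab", "aa"), ("bba", "bb")]

def solve_pcp(dominoes, max_len=6):
    """Brute-force search for a matching sequence up to max_len dominoes.
    This decides nothing in general -- it only checks bounded sequences,
    which is exactly why PCP's undecidability is not contradicted."""
    for length in range(1, max_len + 1):
        for seq in product(range(len(dominoes)), repeat=length):
            top = "".join(dominoes[i][0] for i in seq)
            bottom = "".join(dominoes[i][1] for i in seq)
            if top == bottom:
                return seq  # indices of a matching domino sequence
    return None  # no match within the bound (says nothing beyond it)

print(solve_pcp(dominoes))  # -> (2, 1, 2, 0): "bba"+"ab"+"bba"+"a" on both rows
```

The human "solving" of an instance looks much like this: insight prunes
the search, but there is no guarantee of an answer for arbitrary inputs.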

>i think the generation of computers we are dealing with today is not 
>able to achieve "real understanding" in any kind of "cognitive" sense.
>maybe we'll never know whether a computer "really" understands something 
>or is just pretending to.

Understanding = Consciousness?

>ugh, "language" as a program is, in my understanding, a wrong thought.
>in the early years of mankind, people had no language, but they wanted to 
>communicate. so over many thousands of years they evolved representations of 
>optical impressions (what they had seen...), and language became more and more 
>specialized for defining things precisely.
>note that language is just a "tool" created by ourselves. you sometimes 
>have the impression that you "think" in language, but that's because from 
>birth on we are taught combinations of "speech patterns" and optical 
>impressions (like - pointing at your father - "this is daddy" ..), thereby 
>associating "speech patterns" (language elements) with "real elements".

This raises the question of how we ever managed to learn words for
emotions, mental activities, etc. It could be argued that this is due
to social conditioning; however, one must be able to associate
different internal states (which are not visible from outside) with
language. Does this mean that we have to understand first, before we
speak?

>the last part of your question, "understanding is limited by language", i 
>would rename "communication is limited by language" to make it a true statement.
>language is not fixed; it's still evolving. today we cannot convey our 
>exact feelings, or the "range" of some impressions we exchange with 
>other humans. there's a german proverb, "ein bild sagt mehr als tausend 
>worte" (in english, something like "a picture says more than a thousand 
>words").
> 
>language today is just a tool made by mankind for exchanging a small 
>fraction of the "world of understanding".

Considering that memory storage might be visual and the translation
into language inadequate, what does that say about understanding?

Simone
--
***********************************************************************
S. Stumpf               "The avalanche has already started,
<zcacsst@cs.ucl.ac.uk>   it is too late for the pebbles to vote"
http://www.cs.ucl.ac.uk/students/zcacsst/zcacsst.html
