From newshub.ccs.yorku.ca!torn!utcsri!rutgers!jvnc.net!darwin.sura.net!mips!sdd.hp.com!cs.utexas.edu!uunet!munnari.oz.au!metro!grivel!loki!rmurali Thu Jul  9 16:20:26 EDT 1992
Article 6414 of comp.ai.philosophy:
Path: newshub.ccs.yorku.ca!torn!utcsri!rutgers!jvnc.net!darwin.sura.net!mips!sdd.hp.com!cs.utexas.edu!uunet!munnari.oz.au!metro!grivel!loki!rmurali
From: rmurali@loki.une.edu.au (Murali Ramakrishnan)
Newsgroups: comp.ai.philosophy
Subject: Re: Emotional Frogs?
Message-ID: <2905@loki.une.edu.au>
Date: 3 Jul 92 05:06:02 GMT
Organization: University of New England - Northern Rivers, (Lismore)
Lines: 26

> It was asked if there has been AI research which involved emotions in any way.
> One example I know of is the work of Michael Dyer, who builds a model
> of emotions to aid in interpreting narrative text, as presented in his book
> "In-Depth Understanding", MIT Press, 1983.

                  Yes, it might be possible to build such systems, and
  even to program them to "react" to emotions, feelings, instincts, and so on.
  Such "concepts" can be "translated" into "machine-understandable codes",
  and a "system" can be made to react to those codes.

                  But the point is: once such "mental states" are
  "translated", can they still be categorised as "mental states"?
  As far as the "system", the computer, is concerned, they are only a
  bunch of binary codes, only electrical pulses. This is, again,
  nothing but the Chinese Room argument, of course.

                  There have been articles on this topic, e.g. the
  essay "Why You Can't Make a Computer That Feels Pain" in
  Dennett's book "Brainstorms". But it does not answer many of the
  fundamental questions; it seems to be built on logic rather than
  reason.

-- 
Ramakrishnan Murali      	Internet: rmurali@loki.une.edu.au
Phone:	+61 (066) 203634
     University of New England, Northern Rivers, Lismore NSW Australia
