From newshub.ccs.yorku.ca!torn!utcsri!rpi!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!att!pacbell.com!decwrl!world!eff!ibmpcug!pipex!unipalm!uknet!warwick!mrccrc!daresbury!icdoc!syma!edmunds Thu Jul  9 16:20:31 EDT 1992
Article 6421 of comp.ai.philosophy:
Xref: newshub.ccs.yorku.ca comp.ai.philosophy:6421 sci.psychology:3758 comp.ai:3587
Path: newshub.ccs.yorku.ca!torn!utcsri!rpi!zaphod.mps.ohio-state.edu!pacific.mps.ohio-state.edu!linac!att!pacbell.com!decwrl!world!eff!ibmpcug!pipex!unipalm!uknet!warwick!mrccrc!daresbury!icdoc!syma!edmunds
From: edmunds@syma.sussex.ac.uk (Edmund Shing)
Newsgroups: comp.ai.philosophy,sci.psychology,comp.ai
Subject: Re: Emotional Frogs?
Keywords: Emotion,Computational Modelling
Message-ID: <1992Jul6.083903.14066@syma.sussex.ac.uk>
Date: 6 Jul 92 08:39:03 GMT
Organization: University of Sussex
Lines: 116

On 25 Jun 1992 18:53:52 GMT, way@gothamcity.jsc.nasa.gov (Bob Way) said:


BW> A while back there was an interesting discussion of levels 
BW> of intelligence. Specifically, the one related to rocks, 
BW> frogs, mice, apes & humans.

BW> An interesting (to me) similarity among the non-mineral 
BW> categories is that they all seem to exhibit behaviors which 
BW> suggest that they may possess certain levels of *Feelings* 
BW> and *Emotions*.

BW> For this discussion let me briefly draw a distinction between 
BW> *Feelings* and *Emotions*. When I say *Feelings* I mean things 
BW> specific to the physical operation of the body. (i.e. hunger, 
BW> fatigue, pain, etc.) When I refer to *Emotions* I mean 
BW> psychologically(?) related states. (i.e. fear, anticipation, 
BW> happiness, curiosity, confusion, etc.)

There are many problems in trying to define what emotions and feelings
are, probably because everyday definitions of these terms are vague and
inadequate. Any definition that rests on the notion of consciousness is
open to the charge of vagueness (handwaving), because the term
consciousness is itself ill-defined. I personally would say that your
definitions of feelings and emotions are almost certainly too weak to
base any argument on, although it could well be argued that this is
true of virtually all definitions of emotions and emotional states.

BW> I speculate that even the low level frog (from the previous 
BW> thread) at least possesses the feeling pain and the emotion fear.

BW> From my own experience I believe that changes in my own *Feelings* 
BW> and *Emotions* have a great deal to do with my associative memory. 
BW> When I'm happy I usually begin to remember previous times I was 
BW> happy. The same is often true of fear or confusion. These strange 
BW> associations, based only on emotions, often help me to find my way 
BW> in novel circumstances.

BW> *End Monologue*


BW> Has there been any AI research which involves emotions in any way?

There is AI/cognitive science research on the computational modelling
of emotions. Several interesting models have been put forward, such as
those of Oatley & Johnson-Laird (1987), Sloman (1987), Ortony, Clore &
Collins (1988) and Frijda (1986). Computational models of the latter
two theories have been implemented by Elliott (1991) and Swagerman
(1987) respectively, and models based on expansions of Sloman's
design-based approach to emotion and motivation are being constructed
here at Birmingham (I am one of his PhD students). In addition,
Pfeifer (1988) has written an overview of AI models of emotion,
including models such as Colby's PARRY (a model of neurosis).

The basic premise of most of these theories is that emotions are
functional and play a crucial role in the everyday functioning of
human beings. For instance, Oatley & Johnson-Laird's theory suggests a
dual communicative role for emotions, the first role being one of
global signalling within the entire cognitive system of "important"
stimuli (such as an approaching tiger in the jungle). In this case the
global signalling system would bring this potentially life-threatening
stimulus to one's attention and trigger the fight-or-flight mechanism. 
The second function is that of social communication between
intelligent agents such as people. In many cases the types of plans
that people wish to carry out (based on mutual goals) depend on
cooperation between people which in turn requires communication between
people. Emotions (which are communicated between people by facial
expression, bodily posture, tone of voice etc.) can then give valuable
information about what people are thinking/planning to do even when
they don't wish to give this information away (e.g. when someone who
is upset cannot control their tone of voice, and their wavering of
tone indicates to other people that this person is upset).
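The first of these two roles can be caricatured in a few lines of code. This is my own toy sketch, not anything from Oatley & Johnson-Laird's paper; the class, the importance threshold and the emotion-to-mode table are all invented for illustration:

```python
# Toy sketch of a global emotion signal: an "important" stimulus raises
# a content-poor signal that resets the processing mode of the whole
# system at once, rather than being handled by one local subsystem.

class Agent:
    def __init__(self):
        self.mode = "foraging"  # current global processing mode
        # Hypothetical mapping from emotion signals to new modes.
        self.handlers = {"fear": "flee", "happiness": "explore"}

    def perceive(self, stimulus, importance):
        # Ordinary stimuli would be handled locally; sufficiently
        # important ones (an approaching tiger) raise a global signal.
        if importance > 0.8:
            self.broadcast("fear")

    def broadcast(self, emotion):
        # The signal carries little content: it simply switches the
        # agent's global mode, interrupting whatever was going on.
        self.mode = self.handlers.get(emotion, self.mode)

agent = Agent()
agent.perceive("tiger", importance=0.95)  # mode switches to "flee"
```

The point of the caricature is only that the signal is global and cheap: it does not describe the tiger, it just re-prioritises everything at once.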

BW> What are the possibilities for (effects of) using emotions like 
BW> these in a memory based reasoning system?

Based on cognitive studies of state-dependent memory recall and
recognition (e.g. Teasdale et al. on faster recall of
negatively-valenced memories by clinical depressives when depressed,
but not when not depressed), there have been models of this kind
suggested by people like Bower & Cohen (1982). I suppose that
state-dependent memory can be useful in selecting memories relevant to
the situation at hand in the same way that context-dependent memory is
(the context cues the recall of the associated memories); this may be
one way to order memories efficiently in the brain so as to facilitate
easier recall. 
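To make the cueing idea concrete, here is a minimal sketch (my own, loosely in the spirit of Bower-style network models rather than any published implementation): memories carry a valence tag, and recall is scored by both a content probe and a bonus for matching the current mood. The data and weights are invented:

```python
# Mood-congruent recall: rank memories by overlap with a content probe,
# boosted when a memory's valence matches the agent's current mood.

def recall(memories, probe, mood, mood_weight=0.5):
    """Return memories ranked by keyword overlap plus a mood bonus."""
    def score(m):
        overlap = len(set(m["keywords"]) & set(probe))
        mood_bonus = mood_weight if m["valence"] == mood else 0.0
        return overlap + mood_bonus
    return sorted(memories, key=score, reverse=True)

memories = [
    {"event": "exam failed", "keywords": ["exam"], "valence": "negative"},
    {"event": "exam passed", "keywords": ["exam"], "valence": "positive"},
]

# In a negative mood, the negatively-valenced memory is cued first,
# mirroring the Teasdale-style findings mentioned above.
first = recall(memories, probe=["exam"], mood="negative")[0]
```

Mood here plays exactly the role that context plays in context-dependent recall: an extra retrieval cue that breaks ties between otherwise equally relevant memories.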

BW> What is the possibility for low level intelligence emerging from a 
BW> robotic system based on emotions and associated memory?

What type of behaviour are you thinking of when you refer to low-level
intelligence? A robot that accomplishes simple functions in a limited
domain might be aided in its plan selection by choosing plans partly
on the basis of the context or "state" it is in. 
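As a hypothetical sketch of that last suggestion (the plans, states and weights below are all made up for illustration), state-biased plan selection can be as simple as:

```python
# State-biased plan selection: among the plans that apply in the
# robot's current "state", pick the one with the highest weight
# for that state.

def select_plan(plans, state):
    """Choose the applicable plan weighted most highly for this state."""
    applicable = [p for p in plans if state in p["weights"]]
    return max(applicable, key=lambda p: p["weights"][state])

plans = [
    {"name": "retreat", "weights": {"fear": 0.9, "calm": 0.1}},
    {"name": "inspect", "weights": {"fear": 0.2, "calm": 0.8}},
]

plan = select_plan(plans, state="fear")  # the "retreat" plan wins
```

The state acts as a cheap filter over the plan library, in the same way that mood acts as a retrieval cue over memories: it narrows the options before any expensive deliberation starts.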

BW> Obviously too much time on my hands...

BW> ----------------------------------------------------------------------
BW> Bob Way
BW> Computer Sciences Corporation
BW> NASA / Johnson Space Center


---------------------------------------------------------------------------
Edmund Shing			|  School of Computer Science
E-Mail: exs@cs.bham.ac.uk	|  University of Birmingham
'Chacun ses gouts'		|  Edgbaston
				|  Birmingham B15 2TT  UK.
---------------------------------------------------------------------------


