Newsgroups: comp.ai,comp.ai.philosophy,sci.cognitive
Path: cantaloupe.srv.cs.cmu.edu!rochester!rutgers!news.sgi.com!news.corp.sgi.com!enews.sgi.com!ix.netcom.com!jqb
From: jqb@netcom.com (Jim Balter)
Subject: Re: Emotional computers
Message-ID: <jqbE44w03.9v7@netcom.com>
Organization: Netcom
References: <32DBBF95.64C0@esumail.emporia.edu> <5bjiq1$1b0@usenet.srv.cis.pitt.edu> <jqbE43Hv8.GvJ@netcom.com> <5blv51$mkq@lastactionhero.rs.itd.umich.edu>
Date: Fri, 17 Jan 1997 03:50:27 GMT
Lines: 97
Sender: jqb@netcom.netcom.com
Xref: glinda.oz.cs.cmu.edu comp.ai:43539 comp.ai.philosophy:51140 sci.cognitive:14488

In article <5blv51$mkq@lastactionhero.rs.itd.umich.edu>,
Greg Stevens <gregs@umich.edu> wrote:
>Jim Balter (jqb@netcom.com) wrote:
>: Anders N Weinstein <andersw+@pitt.edu> wrote:
>: >Ian P Wright  <ipw@cs.bham.ac.uk> wrote:
>
>: >>                   . Herbert Simon (1967), in his seminal paper on
>: >>motivation and emotion, describes how a resource bound agent, operating
>: >>in a dynamic environment presenting many threats and opportunities to
>: >>the agent's goals, will require an `attention' interrupt mechanism that
>: >>speedily replaces current goals with new goals. [...]
>
>: >I thought the question was something like: why couldn't this goal
>: >reordering be completely dispassionate (free of emotion).
>
>This (above) is the question, if you were reading carefully, Jim.

This is what Anders "thought the question was something like"; I have
no idea who or what he purports to be paraphrasing here.

All I can see here is all sorts of implicit anthropomorphic assumptions
about what emotion is: that it would be detectable as "passion", etc.
I think the notion of "affect" needs to be more carefully treated
in order to play any sort of explanatory role.
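
For concreteness, here is a minimal sketch (my own toy construction, not
Simon's, with every name in it hypothetical) of the kind of interrupt-driven
goal replacement Ian Wright quotes Simon as requiring.  Note that no notion
of affect appears anywhere in it:

    import heapq

    class ResourceBoundedAgent:
        """Toy agent: pursues one goal at a time; an "attention" interrupt
        can preempt the current goal with a more urgent one, as in the
        Simon description quoted above."""

        def __init__(self):
            self.pending = []        # heap of (-urgency, goal)
            self.current = None      # (urgency, goal) being pursued

        def post(self, urgency, goal):
            heapq.heappush(self.pending, (-urgency, goal))

        def interrupt(self, urgency, goal):
            # Speedily replace the current goal if the new one is more
            # urgent; otherwise just queue it for later.
            if self.current is None or urgency > self.current[0]:
                if self.current is not None:
                    self.post(*self.current)   # shelve the preempted goal
                self.current = (urgency, goal)
            else:
                self.post(urgency, goal)

        def step(self):
            if self.current is None and self.pending:
                neg_u, goal = heapq.heappop(self.pending)
                self.current = (-neg_u, goal)
            return self.current              # ... pursue this goal here ...

    # Hypothetical usage:
    agent = ResourceBoundedAgent()
    agent.post(1, "finish the sector survey")
    agent.step()                             # pursuing the survey
    agent.interrupt(9, "hull breach")        # survey preempted
    agent.step()                             # now dealing with the breach

Whether implementing something like this amounts to implementing emotion,
or merely one necessary piece of whatever emotion turns out to be, is
exactly the question that talk of "passion" and "calm" leaves untouched.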

>: > Mr. Spock
>: >presumably perceives certain threats -- to his person, to the crew,
>: >whatever it is that moves him --  and reorders his goals too, calmly
>: >seeing what needs to be done to survive and taking appropriate action.
>
>This (above) is an illustration, which is what you responded to.

The above is misdirection about what it is to be emotional or unemotional,
by equating it with calmness.  I have no idea what "whatever it is that moves
him" could be, if not affect, or emotion.  Why is "surviving" his priority?
Whose survival?  His person?  His crew?  "Whatever it is that moves him"?
This just seems to beg the question.  Why isn't it rational of Spock to
just breathe deeply and allow it all to happen?  After all, he and the rest
of the crew are mortal; what is gained by prolonging life?  Why does it matter?

>: So calmness == lack of emotion.  How incredibly naive.
>
>Rather than sidetracking on this strawman, why not answer
>the question?

Because the question was incoherent, and I pointed that out.

> I could even word it differently for you:
>
>If all of the following count as "behavioral/attentional
>interrupt signals for self-preservation":
>
>1) a reflex reaction not accompanied by arousal or any phenomenological
>   experience of emotion.
>2) the thought, "Gee... I should do X, or I will die" accompanied
>   by a change of behavior, but in the absence of feelings of or
>   physiological changes associated with emotions.
>3) Running away because you feel really, really scared.

These are all informal descriptions that carry all sorts of assumptions
and lack any sort of *model* by which they can be clearly delineated.
I am quite convinced that the philosophical zombie notion
is incoherent; (1) may be unrealizable, and I am quite dubious about (2).
(3) involves unelaborated notions of causation; the "reason" here may be mere
attribution.  I have no particular reason to think that any of these are
coherent descriptions of anything actually going on.

>Then why do you think simply implementing such an interrupt-signal
>would be sufficient to implement emotion?  It may be necessary,
>but since it does not differentiate 1 - 3 above, it must not
>be sufficient.

I never said anything about anything being sufficient for anything.
Where did you get such a bizarre idea?  I was commenting on Anders' naive
notions of "lack of emotion", certainly not proposing some naive notion
of emotion = interrupt signal.

And if your point here is that this is a paraphrase of Anders' question,
I am really lost, since Herbert Simon, as reported by Ian Wright above,
referred to a "requirement" for such an interrupt signal, which is
a necessary condition, not a sufficient one.  Who thinks that emotion
is just an attention signal?  Certainly not I.  Yet Anders seems to
think that it is the presence of "passion", non-"calm"ness, some sort of
naive common association with "emotion".  To me this is reminiscent
of the distance between folk notions of survival of the fittest,
as a matter of lions roaming the jungle, and the subtle niche-filling
of Darwin's theory.

Star Trek can be a good source of insight into such issues primarily in a
*negative* way, in how *wrong* its premises are.  The characters of Spock and
Data, despite their "calmness" and denials, are chock full of emotion:
affection, concern, curiosity, haughty superiority, etc. etc.  All sorts of
things that serve to *differentiate* among goals in various complex situations.

-- 
<J Q B>

