Newsgroups: comp.ai,comp.ai.philosophy,sci.cognitive
Path: cantaloupe.srv.cs.cmu.edu!rochester!udel-eecis!netnews.com!howland.erols.net!worldnet.att.net!ix.netcom.com!jqb
From: jqb@netcom.com (Jim Balter)
Subject: Re: Emotional computers
Message-ID: <jqbE43Hv8.GvJ@netcom.com>
Organization: NETCOM On-line Communication Services (408 261-4700 guest)
References: <32DBBF95.64C0@esumail.emporia.edu> <5bgeab$o2n@lastactionhero.rs.itd.umich.edu> <32DCCC23.41C6@cs.bham.ac.uk> <5bjiq1$1b0@usenet.srv.cis.pitt.edu>
Date: Thu, 16 Jan 1997 09:47:32 GMT
Lines: 33
Sender: jqb@netcom.netcom.com
Xref: glinda.oz.cs.cmu.edu comp.ai:43510 comp.ai.philosophy:51098 sci.cognitive:14474

In article <5bjiq1$1b0@usenet.srv.cis.pitt.edu>,
Anders N Weinstein <andersw+@pitt.edu> wrote:
>In article <32DCCC23.41C6@cs.bham.ac.uk>,
>Ian P Wright  <ipw@cs.bham.ac.uk> wrote:
>>                   . Herbert Simon (1967), in his seminal paper on
>>motivation and emotion, describes how a resource bound agent, operating
>>in a dynamic environment presenting many threats and opportunities to
>>the agent's goals, will require an `attention' interrupt mechanism that
>>speedily replaces current goals with new goals. For example, an animal
>>may be feeding, only to notice a predator, which causes the interruption
>>of its current goal and its replacement by a goal of flight, with
>>accompanying physiological arousal. 
>
>I thought the question was something like: why couldn't this goal
>reordering be completely dispassionate (free of emotion). Mr. Spock
>presumably perceives certain threats -- to his person, to the crew,
>whatever it is that moves him --  and reorders his goals too, calmly
>seeing what needs to be done to survive and taking appropriate action.
>He might exert himself to move very quickly if that is what is needed,
>but, ex hypothesi without evident emotion.

So calmness == lack of emotion.  How incredibly naive.

>I.e. a thought experiment: can't we imagine goal reordering without
>emotion? Of course these are always controversial, but that was the question.
>
>(I always wonder: Why would Mr. Spock bother? Does he have a Kantian purely
>rational will?)

This is exactly the point: emotion == bother.
-- 
<J Q B>

