Operator: attend

Problem space: top-ps
Operator Overview:
Attention operators, patterned after Mark Wiesmeyer's thesis work (An Operator-Based Model of Human Covert Visual Attention) but greatly extended, allow the model to select one of a number of mostly undifferentiated inputs. As Newell says in Unified Theories of Cognition: "Attend is the operator that brings cognition to focus on the stimulus element."

Operator Proposal:
In this domain, attention proposals are primarily learned. Both the scan-page and the scan-step problem spaces can lead to attention-operator proposals. These preferences are returned to the top space by the production generic*ao*subspace-ACTI, which is in file generic.soar6.

The attention operator can also be proposed spontaneously from the top space as a result of a new auditory input that falls outside the current focus of attention. Because the proposed operator may never be selected, the input may never produce an attention shift and can therefore be lost.
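The spontaneous proposal described above might be sketched as a Soar production of roughly the following shape. This is an illustrative sketch only, not a production from attend.soar6: the attribute names (^io, ^audition, ^status, ^stimulus) and the exact goal/state structure are assumptions.

```soar
;; Sketch: propose attend from the top space when a new auditory
;; stimulus appears.  All attribute names are illustrative.
sp {top-ps*propose*attend*new-audition
    (goal <g> ^problem-space <p> ^state <s>)
    (<p> ^name top-ps)
    (<s> ^io <io>)
    (<io> ^audition <stim>)
    (<stim> ^status new)
    -->
    (<g> ^operator <o> +)
    (<o> ^name attend ^stimulus <stim>)}
```

Because this only creates an acceptable preference, the proposal can sit unselected indefinitely, which is how the input can be lost as noted above.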

Operator Application:
The operator application generally results in output to the attention-link, which is implemented in the I/O code for the simulation in the file attention.c. Its only other effect is to remove any existing information about the satisfaction of task criteria, since the shift of attention may invalidate that result.
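The two application effects (sending the stimulus to the attention-link, and retracting the stale satisfaction result) might look roughly like the following pair of productions. This is a hedged sketch: the attribute names (^attention-link, ^criteria-satisfied) and the placement of the attention-link under ^io are assumptions, not the actual contents of attend.soar6.

```soar
;; Sketch: apply attend by writing the stimulus to the attention-link,
;; where the I/O code (attention.c) picks it up.
sp {attend*apply*send-attention-link
    (goal <g> ^operator <o> ^state <s>)
    (<o> ^name attend ^stimulus <stim>)
    (<s> ^io <io>)
    -->
    (<io> ^attention-link <stim>)}

;; Sketch: reject any recorded criteria-satisfaction result, since the
;; attention shift may invalidate it.
sp {attend*apply*remove-satisfied
    (goal <g> ^operator <o> ^state <s>)
    (<o> ^name attend)
    (<s> ^criteria-satisfied <val>)
    -->
    (<s> ^criteria-satisfied <val> -)}
```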

Operator Reconsider:
Automatic when output is sent to the attention-link.
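The automatic reconsideration might be expressed as a production that issues a reconsider (@) preference once the attention-link output is present; again, this is an illustrative sketch rather than the actual production.

```soar
;; Sketch: reconsider attend as soon as its output appears on the
;; attention-link.  Attribute names are illustrative.
sp {attend*reconsider*after-output
    (goal <g> ^operator <o> ^state <s>)
    (<o> ^name attend)
    (<s> ^io <io>)
    (<io> ^attention-link <stim>)
    -->
    (<g> ^operator <o> @)}
```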
Productions are in file: attend.soar6