Algorithms that alert on constraint violations and threats in a straightforward manner inundate the user with alerts in dynamic domains. Unwanted alerts were a problem in both our domains and in many other domains as well, such as medical monitoring. An aid that gives alerts every second will quickly be discarded by the user in stressful situations (if not immediately). To be useful, an execution aid must produce high-value, user-appropriate alerts. Alerts and their presentation may also have to be adjusted to the situation, including the user's cognitive state (or the computational state of a software agent). For example, in high-stress situations, tolerances could be increased, or certain types of alerts might be ignored or postponed. In this section, we provide a conceptual framework for the alerting algorithms in our monitoring framework and our domain-specific EAs.
Our approach is grounded in the concept of determining the value of an alert. First, the system must estimate the value of new information to the user. Information theory derives from communication theory and the work of Shannon. In this theory, the value of information refers to the reduction in uncertainty resulting from the receipt of a message, not to the meaning that the message (or the uncertainty reduction) has for the receiver. We use the term value of information (VOI) in a different sense, namely, the pragmatic import the information has relative to its receiver. (Of course, the reduction of uncertainty often has pragmatic import.) Like Weinberger, we assume that the practical value of information derives from its usefulness in making informed decisions.
However, alerting the user to all valuable information could have a negative impact in certain situations, such as when an alert distracts the user from more important tasks or when too many alerts overwhelm the user. We therefore introduce the concept of value of an alert (VOA), which is the pragmatic import (for making informed decisions) of taking an action to focus the user's attention on a piece of information. VOA takes VOI into account but weighs it against the costs and benefits of interrupting the user. If the user is busy doing something significantly more important, then issuing an alert might not be valuable, even when VOI is high. Computing VOA generally requires estimating the user's cognitive state and current activities. VOA will generally determine the modality and other qualities of alert presentation (e.g., whether to flash red text on a computer display or issue a loud audible warning).
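The weighing of VOI against interruption cost, and the mapping from VOA to a presentation modality, can be sketched as follows. This is a minimal illustration under our own simplifying assumptions: the workload model, the multiplicative cost term, and the modality thresholds are hypothetical, not part of our implemented monitoring framework.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    """Coarse model of the user's cognitive state (both fields in [0, 1])."""
    workload: float                # how busy the user currently is
    current_task_importance: float # how important the interrupted task is

def value_of_alert(voi: float, user: UserState,
                   interruption_cost: float = 1.0) -> float:
    """VOA: the value of the information, discounted by the estimated cost
    of interrupting the user at this moment (illustrative formula)."""
    cost = interruption_cost * user.workload * user.current_task_importance
    return voi - cost

def choose_modality(voa: float) -> str:
    """Map alert value to a presentation modality (thresholds illustrative)."""
    if voa <= 0.0:
        return "suppress"  # not worth interrupting the user
    elif voa < 0.5:
        return "visual"    # unobtrusive on-screen cue
    else:
        return "audible"   # demand immediate attention
```

Under this sketch, the same high-VOI piece of information can yield a suppressed alert when the user is saturated with an important task, matching the behavior described above.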
VOI and VOA are highly correlated in most situations, and most general comments about VOI apply to VOA as well. However, VOA may be low while VOI is high if the user is highly stressed or preoccupied with more important tasks. It is also possible to have a high VOA and low VOI. For example, mission-specific monitors might alert the user to information that has been known for some time (and thus has little or no value as information) because the information is crucial to an upcoming decision and the user may have forgotten it, or may be behaving in a way that indicates a lack of awareness.
Weinberger gives a quantitative definition of pragmatic information, assuming a finite set of alternatives that lead to well-defined outcomes, each of which has some value to the decision maker. In realistic domains like ours, alternatives and outcomes are not precisely defined. Furthermore, information and decision theories (including Weinberger's) assume that the decision maker is aware of (or has processed) previous information and can devote sufficient resources to analyzing the current information. Under such assumptions of unlimited processing power, VOA and VOI are the same. In most realistic domains, these assumptions do not hold. Humans are resource bounded; during fast-paced operations, alerts and information may be ignored, and the user may not realize the implications of new information for a complex plan that coordinates many team members.
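Under the idealized assumptions just described (a finite set of alternatives with well-defined, valued outcomes), the pragmatic value of new information can be sketched as the gain in achievable expected payoff after a belief update. The function names and the payoff encoding below are our own illustration, not Weinberger's exact formulation:

```python
def expected_values(beliefs, payoff):
    """Expected payoff of each alternative under a probability distribution
    over world states; payoff[action][state] is the outcome's value."""
    return {a: sum(p * outcomes[s] for s, p in beliefs.items())
            for a, outcomes in payoff.items()}

def pragmatic_value(prior, posterior, payoff):
    """Gain from updating beliefs: compare the best action under the
    posterior with the action that was optimal under the prior, both
    evaluated against the posterior (updated) beliefs."""
    ev_prior = expected_values(prior, payoff)
    ev_post = expected_values(posterior, payoff)
    prior_choice = max(ev_prior, key=ev_prior.get)
    return max(ev_post.values()) - ev_post[prior_choice]
```

A message that does not change which alternative is best has zero pragmatic value under this definition, which is exactly the gap between Shannon-style uncertainty reduction and the decision-oriented VOI we use.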