The principal contribution of this work lies in the explicit representation of decision steps and the implications this has for the handling of knowledge goals. Cassandra is, we believe, the first planner in which decisions are represented as explicit actions in the plans that it constructs. Cassandra's knowledge goals arise specifically from the need to decide between alternative courses of action, as preconditions of the decision actions. Cassandra is thus consistent with the view that planning is the process of making decisions in advance. In this view, contingency plans are plans that defer some decisions until the information on which they depend becomes available [Pryor 1995]. Different plan branches correspond to different decision outcomes.
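This arrangement can be sketched as follows. The data structures and predicate names here (Step, know-if, and so on) are illustrative assumptions for exposition, not Cassandra's actual representation; the point is only that the decision is an ordinary plan step whose precondition is a knowledge goal, achievable by planning as any other precondition would be.

```python
# A minimal sketch (hypothetical names, not Cassandra's data structures)
# of a decision represented as an explicit plan step.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    preconditions: list = field(default_factory=list)
    effects: list = field(default_factory=list)

# The decision step chooses between two continuations; its precondition
# is a knowledge goal: the agent must know whether the road is open.
decide = Step(
    "decide(route)",
    preconditions=["know-if(road-open)"],   # knowledge goal
    effects=["chosen(route)"],
)

# A sensing action achieves the knowledge goal in the same way that any
# ordinary action achieves an ordinary precondition.
look = Step("look-at-road", effects=["know-if(road-open)"])

# The decision is deferred until after the sensing step; each outcome of
# the decision would introduce a different plan branch.
plan = [look, decide]
```

Because the decision appears as a step in its own right, the planner can subgoal on its knowledge-goal precondition exactly as it subgoals on physical preconditions.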
Through its use of explicit decision steps, Cassandra distinguishes between sensing or information-gathering actions on the one hand, and decision making on the other. One important reason for making this distinction is that a decision may depend on more than one piece of information, each obtainable by performing a different action. In addition, separating information-gathering from decision-making provides a basis for introducing alternative methods for making decisions. For example, the extension to Cassandra described in Section 6.5.5 introduces a type of decision that directs the executing agent to perform all branches resulting from a given source of uncertainty. This allows the construction of plans that can succeed even when there is no way of telling what the actual outcome is (e.g., the bomb-in-the-toilet problem). We believe that the explicit representation of different methods for making decisions is an important direction for future research.
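The contrast between the two decision methods can be illustrated schematically. The function and names below are hypothetical, not Cassandra's implementation; they show only that when no information source exists, a decision of the second type directs the agent to execute every branch, as in the bomb-in-the-toilet problem.

```python
# A hedged sketch contrasting an ordinary contingent decision with the
# "execute all branches" decision type. Names are illustrative.
def make_decision(outcomes, can_sense):
    """Return (decision_type, branches the agent is directed to run)."""
    if can_sense:
        # Ordinary contingent decision: sense the outcome at execution
        # time, then run the single branch that matches it.
        return "choose-one", ["branch-for-" + o for o in outcomes]
    # No way of telling the actual outcome: direct the executing agent
    # to perform the branch for every outcome.
    return "do-all", ["dunk(" + o + ")" for o in outcomes]

# Bomb-in-the-toilet: either package may hold the bomb, and nothing can
# be sensed, so the plan dunks both packages and succeeds either way.
decision, branches = make_decision(["package-1", "package-2"],
                                   can_sense=False)
```

Here the "do-all" decision succeeds whichever package holds the bomb, precisely because the plan does not require the outcome to be known.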
Because knowledge goals arise as preconditions of decisions in Cassandra, the need to know whether a particular plan branch will work is distinguished from the need to know the actual outcome of an uncertainty. Cassandra does not plan to determine outcomes unless they are relevant to whether or not its goals are achieved. Moreover, Cassandra does not treat knowledge goals as special cases: plans to achieve them may be as complex as plans to achieve any other goals. As well as planning to achieve knowledge goals that arise as preconditions of decisions, Cassandra can also produce plans for top-level knowledge goals.
Two other features of Cassandra are worth noting: the flexibility afforded by its labeling scheme; and the potential for learning and adaptation afforded by its representation of uncertainty.
Cassandra's labeling scheme, although complex, allows the agent executing the plan to distinguish between three classes of action: those that must be executed in a given contingency; those that must not; and those whose execution will not affect the achievement of the goal in that contingency. This feature paves the way for the extension described above that allows Cassandra to build plans requiring the execution of all branches resulting from a source of uncertainty.
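The three-way classification that the labeling scheme supports can be illustrated as follows. The representation here, in which each step simply carries the set of contingencies in which it must run and the set in which it must not, is a deliberate simplification of Cassandra's actual labels.

```python
# A hedged illustration (not Cassandra's actual labeling scheme) of the
# three classes of action an executing agent can distinguish.
def classify(step_labels, contingency):
    """Classify a step's status in a given contingency."""
    must, must_not = step_labels
    if contingency in must:
        return "execute"      # must be executed in this contingency
    if contingency in must_not:
        return "skip"         # must not be executed
    return "irrelevant"       # execution does not affect the goal

# A step belonging to branch A only:
open_door = ({"A"}, {"B"})
# A step shared by both branches, and harmless in any other contingency:
walk = ({"A", "B"}, set())
```

The third class, "irrelevant", is what makes room for the extension described above: steps from other branches can be executed without endangering goal achievement in the current contingency.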
Cassandra's representation makes no assumptions as to the intrinsic nature of uncertainty. An unknown precondition simply denotes that the planner lacks the information needed to determine in which context an action will produce a particular effect. It may be that this information is in principle unknowable (in domains involving quantum effects, for example); it is much more likely, however, that the uncertainty results from the limitations of the planner or of the information available to it. In general, an agent operating in a real-world domain will be much more effective if it can learn to improve its performance and adapt to changing conditions. The use of unknown preconditions to represent uncertainty means that in some circumstances it would be relatively simple to incorporate the results of such learning and adaptation into the planner's domain knowledge. For example, the planner might discover how to predict certain outcomes: it could then change the unknown preconditions into ones reflecting the new knowledge. If, on the other hand, it discovered that predicted effects were consistently failing to occur, it could change the relevant preconditions into unknown ones.
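The two directions of update just described can be sketched concretely. The operator representation below is hypothetical; it serves only to show that because uncertainty lives in a precondition slot, learning amounts to replacing an unknown precondition with a concrete one, and repeated prediction failure amounts to the reverse replacement.

```python
# A sketch (hypothetical representation, not Cassandra's) of updating
# uncertainty expressed as unknown preconditions.
operator = {
    "action": "flip-switch",
    "effect": "light-on",
    # The context that yields the effect is not predictable:
    "precondition": ("unknown", None),
}

def learn_context(op, condition):
    """Learning has revealed the context that produces the effect."""
    op["precondition"] = ("known", condition)

def forget_context(op):
    """Predicted effects keep failing: revert to an unknown precondition."""
    op["precondition"] = ("unknown", None)

learn_context(operator, "bulb-working")   # outcome now predictable
forget_context(operator)                  # predictions failed; revert
```

In both directions the rest of the operator is untouched, which is why incorporating the results of learning is relatively simple under this representation.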