Robustness analysis employs sets of distributions to model perturbations in the parameters of a probability distribution [Berger1985, Berger1990, Huber1980, Kadane1984, Wasserman1992b]. Robust Bayesian inference is the calculation of bounds on posterior values under such perturbations. This paper focuses on perturbations that can be modeled by finitely generated convex sets of distributions and presents the first convergent approximations for the calculation of posterior bounds in Bayesian networks.
Models that attempt to combine Bayesian networks with probability intervals have faced serious difficulties. There has been no general model of probability intervals with the same style of efficient propagation used in Bayesian networks. The inefficiency comes from the fact that successive applications of Bayes rule can create an exponentially large number of vertices in a polytope representation. Two possibilities arise. One is to focus on particular classes of intervals that admit efficient algorithms [Chrisman1996b]. The other is to settle for approximations of inference bounds [Breese & Fertig1991, Tessem1992]. So far, no approximation with convergence guarantees or provably small error bounds has been found.
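The source of the blow-up can be sketched in a few lines. Suppose each of n local credal sets is a polytope with k vertices; combining one vertex per local set yields up to k^n candidate extreme points for the joint set, so naive enumeration of posterior bounds is exponential in n. The following Python fragment is only an illustration of this counting argument; all names in it are hypothetical.

```python
# Illustration of vertex blow-up: combining one vertex per local
# credal set gives a number of candidate joint vertices equal to the
# product of the local vertex counts (k**n when all counts equal k).
from itertools import product

def candidate_joint_vertices(local_vertex_sets):
    """Enumerate every combination of one vertex per local credal set."""
    return list(product(*local_vertex_sets))

# Three binary variables, each with a 2-vertex interval for P(X = 1):
local_sets = [[0.2, 0.4], [0.1, 0.3], [0.6, 0.9]]
combos = candidate_joint_vertices(local_sets)
len(combos)  # 2**3 == 8 candidate extreme points
```

With 30 such variables the count is already 2^30, which is why the paper turns to approximation schemes with convergence guarantees.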
This paper derives exact solutions and convergent approximations for robustness analysis of Bayesian networks associated with finitely generated sets of distributions. Exact algorithms which generalize previous ideas [Cano, Delgado, & Moral1993] are presented and their complexity is characterized. We describe the situations that lead to high complexity in exact algorithms. Two classes of algorithms for numeric approximations are developed. The first class of algorithms reduces the robust inference problem to the estimation of probabilistic parameters in a Bayesian network. The second class of algorithms uses Lavine's bracketing method [Lavine1991] to simplify the optimization procedures required for robust inferences.
We use the theory of convex sets of probability distributions, referred to as Quasi-Bayesian theory [Giron & Rios1980]. Other names for the mathematical theory are theory of lower expectations [Huber1980] and theory of lower previsions [Walley1991]. In this theory, the imprecision in probability assessments can be due either to difficulties in eliciting information from experts, or to difficulties in processing or combining data.
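Since expectation is linear in the distribution, its extrema over a convex set of distributions are attained at the vertices of that set; for a finitely generated set, lower and upper expectations can therefore be computed by checking finitely many vertex distributions. A minimal sketch of this fact in Python, with purely illustrative names and data:

```python
# Sketch: lower/upper expectations over a finitely generated convex
# set of distributions. Linearity of expectation means the extrema
# are attained at vertex distributions, so a finite scan suffices.

def expectation(f, p):
    """Expected value of f (a list of values) under distribution p."""
    return sum(fi * pi for fi, pi in zip(f, p))

def lower_expectation(f, vertices):
    """Minimum expectation over the vertex distributions of the set."""
    return min(expectation(f, p) for p in vertices)

def upper_expectation(f, vertices):
    """Maximum expectation over the vertex distributions of the set."""
    return max(expectation(f, p) for p in vertices)

# A convex set over three outcomes, given by two vertex distributions:
K = [[0.5, 0.3, 0.2], [0.4, 0.4, 0.2]]
f = [1.0, 0.0, -1.0]
lower_expectation(f, K)  # 0.2
upper_expectation(f, K)  # 0.3
```

The interval [0.2, 0.3] is the range of expected values consistent with the imprecise assessment encoded by the set K.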
Section 2 discusses the importance of robustness analysis in Bayesian networks. We then introduce some basic results and notation in Section 3, and briefly introduce Quasi-Bayesian theory in Section 4. Section 5 defines the concept of finitely generated convex sets of distributions and their network representations. Section 6 presents exact algorithms for robust inferences and the trade-offs between them; it also presents a transformation that reduces a Quasi-Bayesian network to a Bayesian network, which is central to this paper. Section 7 discusses the complexity of such inferences and some generic strategies to circumvent it, suggesting a simulation scheme that yields approximate bounds. Section 8 shows how to use the transformed network to reduce robust inference to a problem of parameter estimation, and how to use gradient descent techniques to solve it. Section 9 presents algorithms for the calculation of posterior bounds based on Lavine's bracketing algorithm. A series of classes of distributions is considered in Section 10: the epsilon-contaminated class (Section 10.1), the lower density bounded class (Section 10.2), the belief function class (Section 10.3), the sub-sigma class (Section 10.4), the density bounded class (Section 10.5), the total variation class (Section 10.6), and the density ratio class (Section 10.7). Inference algorithms are derived by reducing each of these classes to the class of finitely generated sets of distributions. Section 11 discusses the generalization of the results to expected utility and variance problems.