
Complexity and generic approximations


As more and more variables are associated with credal sets, algorithmic complexity increases, since the number of vertices M grows exponentially with the number of credal sets. For exact algorithms, this fact leads either to exponential time complexity (first algorithm above) or to exponential memory requirements (second algorithm above). To tackle the most complicated problems, we must resort to approximations.
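To make this growth explicit, here is a small worked bound; the symbols n (number of credal sets) and v_i (vertex count of the i-th credal set) are notation introduced only for illustration. If every credal set has at most v vertices, the number of joint vertex configurations an exact algorithm must examine is

    M = \prod_{i=1}^{n} v_i \le v^{n},

which is exponential in n.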

One general strategy to produce approximations is to use optimization algorithms that maximize a generic non-linear function defined over a discrete set. There are heuristic algorithms developed in the learning community and a variety of genetically-inspired algorithms that can be used as optimization tools for this purpose. One class of algorithms that can be useful for discrete optimization is the class of simulated annealing algorithms. The idea is to visit a set of configurations for the artificial variables and select only the configurations that are either better than the current configuration, or that are accepted based on a Gibbs distribution [Geman & Geman 1984, Press et al. 1992]. The Gibbs distribution is regulated through a "temperature schedule", which is set heuristically. Casting the problem of robust inference as a general optimization problem is an interesting perspective for future research, but this paper concentrates on more specific approximations which use the structure of the networks for efficiency.
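As a minimal sketch of this strategy, the Python fragment below implements simulated annealing with a Gibbs acceptance rule and a geometric temperature schedule. The objective score, the move generator random_neighbor, and all parameter values are hypothetical placeholders, not anything prescribed here.

    import math
    import random

    def simulated_annealing(initial, score, random_neighbor,
                            t_start=1.0, t_end=1e-3, alpha=0.95,
                            steps_per_t=100):
        # Maximize `score` over a discrete configuration space.
        current, current_score = initial, score(initial)
        best, best_score = current, current_score
        t = t_start
        while t > t_end:                  # geometric "temperature schedule"
            for _ in range(steps_per_t):
                candidate = random_neighbor(current)
                delta = score(candidate) - current_score
                # Accept improvements outright; accept worse configurations
                # with Gibbs probability exp(delta / t), so the search can
                # escape local maxima while the temperature is high.
                if delta >= 0 or random.random() < math.exp(delta / t):
                    current, current_score = candidate, current_score + delta
                    if current_score > best_score:
                        best, best_score = current, current_score
            t *= alpha                    # cool down heuristically
        return best, best_score

In the robust-inference setting, a configuration would fix one vertex per credal set, and score would evaluate the resulting posterior value with a standard Bayesian network inference algorithm.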


