Conclusion

Computational complexity remains a major obstacle to the application of probability theory and decision theory in knowledge-based systems. It is important to develop schemes that improve the performance of updating algorithms -- even though the worst case remains NP-hard, many practical cases may become tractable.

In this paper, we studied importance sampling in Bayesian
networks. After reviewing the most important theoretical results
related to importance sampling in finite-dimensional integrals, we
proposed a new algorithm for importance sampling in Bayesian
networks that we call *adaptive importance sampling* (AIS-BN).
While learning the optimal importance function for the AIS-BN
algorithm is computationally intractable, we proposed, based on the
theory of importance sampling in finite-dimensional integrals,
several heuristics that seem to work very well in
practice. We proposed heuristic methods for initializing the
importance function that we have shown to accelerate the learning
process, a smooth learning method for updating the importance function
using the structural advantages of Bayesian networks, and a
dynamic weighting function for combining samples from different
stages of the algorithm. All these methods help the AIS-BN
algorithm to get fairly accurate estimates of the posterior
probabilities in a limited time. Of the two heuristics applied,
adjustment of small probabilities seems to lead to the greater
improvement in performance, although the largest decrease in
*MSE* is achieved by combining the two heuristics in the
AIS-BN algorithm.
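The interplay of these components (an adaptive importance function, smooth updating between stages, adjustment of small probabilities, and weighted combination of stage estimates) can be illustrated on a deliberately tiny toy problem. The sketch below is not the AIS-BN algorithm itself: all constants, the update rule, and the two-valued target distribution are illustrative assumptions, whereas AIS-BN learns conditional probability tables of a Bayesian network rather than a single binary variable.

```python
import random

# Toy illustration (NOT AIS-BN itself): estimate P(X = 1) for a rare
# event under a known target distribution p over {0, 1}, sampling from
# an importance function q that is re-learned between stages.
p = {0: 0.98, 1: 0.02}  # target distribution (known here for illustration)

def estimate(stages=5, samples_per_stage=2000, eta=0.5, seed=0):
    rng = random.Random(seed)
    q = {0: 0.5, 1: 0.5}  # heuristic initialization of the importance function
    stage_estimates = []
    for k in range(stages):
        weighted = {0: 0.0, 1: 0.0}
        for _ in range(samples_per_stage):
            x = 0 if rng.random() < q[0] else 1
            weighted[x] += p[x] / q[x]  # importance weight p(x)/q(x)
        total = weighted[0] + weighted[1]
        stage_estimates.append(weighted[1] / total)  # self-normalized estimate
        # Smooth update: move q toward the normalized weighted counts.
        q = {x: (1 - eta) * q[x] + eta * (weighted[x] / total) for x in q}
        # Adjustment of small probabilities: keep q bounded away from zero.
        q = {x: max(q[x], 0.05) for x in q}
        s = q[0] + q[1]
        q = {x: q[x] / s for x in q}
    # Combine stages, weighting later (better-tuned) stages more heavily.
    weights = [2 ** k for k in range(stages)]
    return sum(w * e for w, e in zip(weights, stage_estimates)) / sum(weights)

print(round(estimate(), 3))
```

The down-weighting of early stages mirrors the dynamic weighting function described above: samples drawn before the importance function has adapted contribute less to the final estimate.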

The AIS-BN algorithm can lead to a dramatic improvement in
convergence rates in large Bayesian networks with evidence,
compared to existing state-of-the-art algorithms. We compared
the performance of the AIS-BN algorithm to the performance of
likelihood weighting and self-importance sampling on a large
practical model, the CPCS network, with evidence as unlikely as
5.54×10^{-42} and typically
7×10^{-24}. In our
experiments, the AIS-BN algorithm always outperformed
likelihood weighting and self-importance sampling, and
in over 60% of the cases it achieved more than two orders of
magnitude improvement in accuracy. Tests performed on the other
two networks, PATHFINDER and ANDES, yielded similar results.

Although other approximate algorithms may prove superior to AIS-BN in networks with special structures or distributions, the AIS-BN algorithm is simple and robust for general evidential reasoning problems in large multiply-connected Bayesian networks.