A density ratio class consists of all probability densities r(A) such that, for any event A [DeRobertis & Hartigan 1981]:

\Gamma^{R}_{l,u}(p(A)) = \left\{ r(A) : l(A) \le \alpha\, r(A) \le u(A) \text{ for some } \alpha > 0 \right\},

where l and u are nonnegative bounding functions with l \le u.
A global neighborhood can be constructed with a sub-class of the density ratio class. Take a base Bayesian network and a constant k > 1. Consider the set of all joint distributions r(x) such that, for some \alpha > 0:

\Gamma^{R}_{k}(p(x)) = \left\{ r(x) : \prod_{i} p_{i} \le \alpha\, r(x) \le k \prod_{i} p_{i} \right\},

where p_{i} denotes the conditional distribution p(x_{i} | \mathrm{pa}(x_{i})) in the base network.

Every distribution r in this class satisfies, for any events A and B:

\frac{r(A)}{r(B)} \le k^{2}\, \frac{p(A)}{p(B)} .

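As an illustration (not part of the original formulation), membership of a joint distribution r in \Gamma^{R}_{k}(p(x)) can be checked numerically over an enumerated state space: a suitable \alpha exists exactly when \max_x p(x)/r(x) \le k \min_x p(x)/r(x). A minimal sketch, with a hypothetical helper name:

```python
import numpy as np

def in_density_ratio_class(r, p, k):
    """Check whether distribution r lies in the global class Gamma^R_k(p):
    there must exist alpha > 0 with p(x) <= alpha*r(x) <= k*p(x) for all x.
    Such an alpha exists iff max_x p(x)/r(x) <= k * min_x p(x)/r(x)."""
    ratio = np.asarray(p, dtype=float) / np.asarray(r, dtype=float)
    # Small tolerance so the base distribution itself passes for k = 1.
    return ratio.max() <= k * ratio.min() + 1e-12
```

For example, the uniform distribution on two states is within a factor-3 neighborhood of p = (0.3, 0.7), but not within a factor-2 neighborhood.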
Since the class is invariant under marginalization and conditionalization, the first step is to obtain p(x|e), the posterior marginal for the base distribution of the class. Now we can set up a linear programming problem: maximize (or minimize) the objective

\sum_{x} u(x)\, r(x|e)

subject to the constraints, for all pairs of values x and y,

r(x|e) \le k^{2}\, \frac{p(x|e)}{p(y|e)}\, r(y|e),

together with \sum_{x} r(x|e) = 1 and r(x|e) \ge 0.

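When the state space of x is small enough to enumerate, this linear program can be solved with an off-the-shelf solver. The following is a sketch under that assumption, using scipy.optimize.linprog (the function name and the use of scipy are mine, not the paper's):

```python
import numpy as np
from scipy.optimize import linprog

def upper_expectation_lp(u, p, k):
    """Upper expectation of u over the posterior density ratio class.

    u : utility values u(x), one per state x
    p : base posterior p(x|e), one entry per state
    k : density ratio constant, k >= 1
    """
    n = len(p)
    # Pairwise ratio constraints: r[x] - k^2 * (p[x]/p[y]) * r[y] <= 0
    A_ub, b_ub = [], []
    for x in range(n):
        for y in range(n):
            if x == y:
                continue
            row = np.zeros(n)
            row[x] = 1.0
            row[y] = -k**2 * p[x] / p[y]
            A_ub.append(row)
            b_ub.append(0.0)
    # Normalization: sum_x r[x] = 1, with r[x] >= 0 via bounds.
    A_eq, b_eq = [np.ones(n)], [1.0]
    # linprog minimizes, so negate u to maximize sum_x u(x) r(x|e).
    res = linprog(-np.asarray(u, dtype=float), A_ub=np.array(A_ub), b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
    return -res.fun
```

With k = 1 the constraints force r to coincide with the base posterior, so the result is the ordinary expectation; the lower expectation is obtained by minimizing instead.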
For expected value calculations, maximization/minimization of the expectation of u(x) = x_{q} can be performed directly. For calculation of posterior marginals, take u(x) = \delta_{a}(x_{q}), where \delta_{a}(x_{q}) is one if x_{q} = a and zero otherwise. In this case \overline{E}[u] = \overline{p}(x_{q} = a | e), the upper posterior probability for x_{q}. The linear programming problem can be solved in closed form [Seidenfeld & Wasserman 1993]:

\overline{r}(x_{q} = a|e) = \frac{k\, p(x_{q} = a|e)}{k\, p(x_{q} = a|e) + p(x_{q} = a^{c}|e)} ,

\underline{r}(x_{q} = a|e) = \frac{p(x_{q} = a|e)}{p(x_{q} = a|e) + k\, p(x_{q} = a^{c}|e)} .

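These closed-form bounds are straightforward to compute. A minimal sketch (the function name is mine), taking the base posterior probability of the event {x_q = a} and the constant k:

```python
def posterior_bounds(p_a, k):
    """Lower and upper posterior probability of {x_q = a} under the
    density ratio class with constant k, using the closed-form solution
    attributed above to Seidenfeld & Wasserman.

    p_a : base posterior probability p(x_q = a | e)
    k   : density ratio constant, k >= 1
    """
    p_c = 1.0 - p_a                      # p(x_q = a^c | e)
    upper = k * p_a / (k * p_a + p_c)    # \overline{r}(x_q = a | e)
    lower = p_a / (p_a + k * p_c)        # \underline{r}(x_q = a | e)
    return lower, upper
```

For k = 1 both bounds collapse to the base posterior probability, and the interval widens monotonically as k grows.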
The linear programming problem above can be intractable if x has too many variables. In this case a Gibbs sampling procedure can be applied to the problem. Consider a sample X_{1}, \dots, X_{N} from the posterior distribution p(x|e), which can be produced through Gibbs sampling techniques [York 1992]. The following expression converges to the upper expectation of a function u(\cdot) [Wasserman & Kadane 1992]:

\max_{1 \le j \le N} \; \frac{(k-1)\, Z_{j}/N + Z_{0}/N}{(k-1)\,(N-j+1)/N + 1} ,

where

Z_{0} = \sum_{j} u(X_{j}) ,

Z_{j} = \sum_{l \ge j} u_{(l)} ,

and u_{(1)} \le \dots \le u_{(N)} denote the values u(X_{j}) sorted in increasing order.

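Given the sampled values u(X_j), the estimator amounts to sorting, forming tail sums, and taking a maximum over the N candidate thresholds. A sketch of this computation (function name mine, assuming the sample values are already available):

```python
import numpy as np

def upper_expectation_mc(u_vals, k):
    """Estimate the upper expectation of u over the density ratio class
    with constant k from posterior samples X_j (Monte Carlo estimator
    in the style attributed above to Wasserman & Kadane).

    u_vals : u(X_j) evaluated at the N posterior samples
    k      : density ratio constant, k >= 1
    """
    u_sorted = np.sort(np.asarray(u_vals, dtype=float))  # u_{(1)} <= ... <= u_{(N)}
    N = len(u_sorted)
    Z0 = u_sorted.sum()                                  # Z_0 = sum_j u(X_j)
    # Z_j = sum_{l >= j} u_{(l)}: tail sums of the ordered values, j = 1..N
    Zj = np.cumsum(u_sorted[::-1])[::-1]
    j = np.arange(1, N + 1)
    num = (k - 1.0) * Zj / N + Z0 / N
    den = (k - 1.0) * (N - j + 1) / N + 1.0
    return np.max(num / den)
```

With k = 1 the expression reduces to the sample mean Z_0/N; for k > 1 the maximization over j shifts weight toward the largest sampled values of u.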