We now consider the class of (multidimensional) Markov chains to which DR can be applied. We will start with a simpler (limited) class of 2D Markov chains, which we refer to as the foreground-background (FB) process. The FB process includes the Markov chain that models an M/PH/2 queue with two priority classes (Figure 3.4(c)) as well as the Markov chains in Figure 3.1. We will then discuss two generalizations of the FB process: the recursive FB process (RFB process) and the generalized FB process (GFB process); see Figure 3.7. Both processes can be analyzed via DR. The RFB process includes Markov chains of higher ($\geq 2$) dimensions, and the GFB process includes 2D Markov chains with fewer restrictions on their structure. In this section, we will provide only an intuitive idea of the structure of the FB, RFB, and GFB processes. Precise definitions will be given in Section 3.4.

Roughly speaking, an FB process consists of a foreground process and a
background process, where the behavior of the foreground process can
depend on the state (e.g., number of jobs) of the background process.
For example, in the case of an M/M/2 queue with two priority classes,
the Markov chain for the high priority jobs
(Figure 3.2(a)) can be seen as the background process,
and the Markov chain for the low priority jobs
(Figure 3.2(b)-(d)) can be seen as the foreground
process. In general, we assume that a
foreground process is a QBD process as defined in
Section 3.2, but the behavior of the foreground
process can depend on the ``level'' (e.g., number of jobs) of the
background process^{3.1}.
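To make the dependence of the foreground process on the background level concrete, the following is a minimal simulation sketch of a two-class preemptive-priority M/M/2 queue, the canonical FB example above. All numerical rates here are illustrative assumptions, not values from the text; the key point is that the low-priority (foreground) service rate is a function of the high-priority (background) level, since high-priority jobs claim servers first.

```python
import random

def simulate_fb(lam_h=0.6, mu_h=1.0, lam_l=0.5, mu_l=1.0,
                n_servers=2, t_end=10_000.0, seed=1):
    """Event-driven simulation of a toy FB process: an M/M/2 queue with
    two preemptive priority classes.  The background process is the number
    of high-priority jobs; the foreground (low-priority) service rate
    depends on how many servers the background level leaves free."""
    rng = random.Random(seed)
    t, h, l = 0.0, 0, 0           # time, background level, foreground level
    area_l = 0.0                  # time-integral of l (for the time-average)
    while t < t_end:
        # transition rates out of the current state (h, l)
        srv_h = mu_h * min(h, n_servers)      # high-priority service rate
        free = max(0, n_servers - h)          # servers left for low priority
        srv_l = mu_l * min(l, free)           # low-priority service rate
        total = lam_h + srv_h + lam_l + srv_l
        dt = rng.expovariate(total)
        t += dt
        area_l += l * dt
        # pick which event fires
        u = rng.random() * total
        if u < lam_h:
            h += 1                            # high-priority arrival
        elif u < lam_h + srv_h:
            h -= 1                            # high-priority departure
        elif u < lam_h + srv_h + lam_l:
            l += 1                            # low-priority arrival
        else:
            l -= 1                            # low-priority departure
    return area_l / t

mean_l = simulate_fb()
print(f"time-average number of low-priority jobs: {mean_l:.2f}")
```

Note that the background evolution (`h`) never consults `l`, while the foreground rate `srv_l` is determined by `h` through `free`; this one-way dependence is exactly the FB structure.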

The RFB process generalizes the FB process to higher ($m \geq 2$) dimensions. Roughly speaking, an RFB process consists of $m$ processes, where the behavior of the $i$-th process can depend on the state (e.g., number of jobs) of the $(i-1)$-th process for $i = 2, \ldots, m$. We will see that DR can be applied recursively to the RFB process.
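The chain of one-way dependencies in an RFB process can be sketched in the same style. In the toy simulation below (all rates and the fast/slow rule are illustrative assumptions, not from the text), each of $m$ birth-death processes is served at a fast rate only while its predecessor is empty, so process $i$ depends only on the level of process $i-1$, and process $1$ is autonomous.

```python
import random

def simulate_rfb(lams, mu_fast=2.0, mu_slow=0.5, t_end=5_000.0, seed=1):
    """Toy RFB sketch: len(lams) birth-death processes in which the
    service rate of process i depends only on the level of process i-1;
    process 0 is autonomous.  All rates are illustrative."""
    rng = random.Random(seed)
    m = len(lams)
    levels = [0] * m
    t = 0.0
    areas = [0.0] * m             # time-integrals of each level
    while t < t_end:
        rates = []
        for i in range(m):
            # process i is served fast only while process i-1 is empty
            srv = mu_fast if (i == 0 or levels[i - 1] == 0) else mu_slow
            rates.append(lams[i])                        # birth (arrival)
            rates.append(srv if levels[i] > 0 else 0.0)  # death (service)
        total = sum(rates)
        dt = rng.expovariate(total)
        t += dt
        for i in range(m):
            areas[i] += levels[i] * dt
        # select the event proportionally to its rate
        u = rng.random() * total
        acc = 0.0
        for j, r in enumerate(rates):
            acc += r
            if u < acc:
                levels[j // 2] += 1 if j % 2 == 0 else -1
                break
    return [a / t for a in areas]

means = simulate_rfb([0.8, 0.4, 0.2])
print([round(x, 2) for x in means])
```

Each process consults only its immediate predecessor, which is what allows DR to be applied recursively: process $1$ is analyzed first, then used as the background for process $2$, and so on.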

The GFB process allows a background process to also depend on a
foreground process. Recall the 2D Markov chain (FB process) in
Figure 3.3(c) and the 1D Markov chain in
Figure 3.3(d) that we obtain via DR.
Observe that the top two rows of the 2D Markov chain in Figure 3.3(c)
need not have any special structure
for the chain to be reduced to a 1D Markov chain as in Figure 3.3(d).
In the GFB process, we assume that there exists a level $\ell$
in the *background* process such that
the background and foreground processes behave independently of each other
while the *background* process is in levels $\geq \ell$,
but the background and foreground processes can have complex interactions
while the background process is in levels $< \ell$.