
Multidimensional Markov chains

One of the major achievements in computational probability is the development of algorithmic methods known as matrix geometric methods [136] and matrix analytic methods [111]. These methods allow us to efficiently analyze a Markov chain whose state space is infinite in one dimension and finite in the other (a 1D Markov chain), provided that the Markov chain has a certain structure (Figure 1.3 shows an example of such a 1D Markov chain). This contrasts with classical approaches that seek exact, explicit expressions: for general 1D Markov chains, those expressions are typically very complicated and difficult to evaluate numerically.
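The core idea behind these methods can be sketched briefly. For a quasi-birth-death (QBD) chain like the one in Figure 1.3, the stationary probabilities of successive "levels" are matrix geometric: $\pi_{n+1} = \pi_n R$, where $R$ solves the matrix quadratic equation $A_0 + R A_1 + R^2 A_2 = 0$ built from the chain's repeating generator blocks. Below is a minimal sketch, not from the text, using the simplest possible instance (an M/M/1 queue, where the blocks reduce to scalars and the rates $\lambda$, $\mu$ are illustrative choices):

```python
# Sketch of the matrix-geometric idea for a 1D (QBD) Markov chain:
# the stationary probabilities of successive levels satisfy
# pi_{n+1} = pi_n * R, where R solves A0 + R*A1 + R^2*A2 = 0.
# Here the blocks A0 (up), A1 (local), A2 (down) are 1x1 scalars,
# i.e., an M/M/1 queue; in general they are matrices of the same size.

def qbd_rate_matrix(lam, mu, tol=1e-12, max_iter=10_000):
    """Iterate R <- -(A0 + R^2 A2) A1^{-1} until convergence."""
    a0, a1, a2 = lam, -(lam + mu), mu   # generator blocks of the M/M/1 chain
    r = 0.0
    for _ in range(max_iter):
        r_next = -(a0 + r * r * a2) / a1
        if abs(r_next - r) < tol:
            return r_next
        r = r_next
    raise RuntimeError("R iteration did not converge")

# For M/M/1 the fixed point is R = lam/mu (the utilization rho), so
# pi_n = (1 - rho) * rho**n, the familiar geometric distribution.
r = qbd_rate_matrix(lam=1.0, mu=2.0)
print(r)  # converges to 0.5
```

In the general (matrix) case the same fixed-point iteration applies with matrix inverses in place of the scalar division, and the chain's stationary distribution follows level by level from $R$.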

Figure 1.3: A 1D Markov chain.
\includegraphics[width=.5\linewidth]{fig/MC/1D.eps}

The difficulty in analyzing many multiserver systems stems from the fact that the Markov chain modeling the system's behavior often has a state space that is infinite in multiple dimensions (a multidimensional Markov chain). Most multidimensional Markov chains cannot be analyzed efficiently via matrix analytic methods or other computational methods. Unfortunately, multidimensional Markov chains arise quite frequently in the analysis of multiserver systems with resource sharing or job prioritization, since these systems involve multiple classes/types of jobs.

Figure 1.4: A multidimensional Markov chain that appears in a multiserver analysis. Here, the beneficiary jobs arrive according to a Poisson process with rate $\lambda_B$, and the donor jobs arrive according to a Poisson process with rate $\lambda_D$. The service demands of these jobs have exponential distributions. The beneficiary jobs are usually served by the beneficiary server with rate $\mu_B$, and the donor jobs are served by the donor server with rate $\mu_D$. When there are no donor jobs and there are at least two beneficiary jobs, the donor server helps the beneficiary server (i.e., the donor server processes a beneficiary job), and hence the beneficiary jobs are completed with rate $2\mu_B$.
\includegraphics[width=.8\linewidth]{fig/model/CSloadbalance.eps}
$\stackrel{\mbox{modeling}}{\Longrightarrow}$
\includegraphics[width=.8\linewidth]{fig/MC/CS2D.eps}

For example, consider two servers, Betty and Dan, each serving its own queue, where Dan (the donor) allows Betty (the beneficiary) to steal his idle cycles. That is, when Dan has no jobs in his queue (and Betty has more than one job in hers), Dan processes one of Betty's jobs. We can model this simple cycle stealing system by a Markov chain, but the chain has a state space that is infinite in two dimensions (a 2D Markov chain), as illustrated in Figure 1.4. Because there is an inherent dependency between the number of Dan's jobs and the number of Betty's jobs, the 2D Markov chain cannot simply be decomposed into two 1D Markov chains. As a result, the performance analysis of this cycle stealing system requires analyzing a multidimensional Markov chain. This raises a fundamental problem in the performance analysis of multiserver systems.

How can we analyze Markov chains on multidimensionally infinite state spaces?


Takayuki Osogami 2005-07-19