
Analysis of priority M/M/2 queue via DR

In this section, we introduce the basic idea behind DR by sketching an analysis of an M/M/2 queue with two priority classes via DR. Specifically, we reduce the 2D Markov chain in Figure 3.1(b) to a 1D Markov chain that closely approximates it and can be analyzed efficiently via existing approaches.

First, observe that the performance of high priority jobs can be analyzed independently of the low priority jobs, since the behavior of the high priority jobs is not affected by the number of low priority jobs in the system. Specifically, Figure 3.2(a) shows the Markov chain that exactly models the number of high priority jobs in the system, and an analysis of this Markov chain allows us to calculate the mean response time of high priority jobs.
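
Since the chain in Figure 3.2(a) is an ordinary M/M/2 birth-and-death chain, this analysis is standard. As a minimal numerical sketch (in Python, writing lam_h and mu_h for the high priority arrival and service rates, with illustrative values that are not taken from the text), one can solve a truncated copy of the chain for its stationary distribution and obtain the mean response time of high priority jobs via Little's law, checking the result against the familiar M/M/2 closed form $E[N_H] = 2\rho/(1-\rho^2)$ with $\rho = \lambda_H/(2\mu_H)$:

\begin{verbatim}
import numpy as np

# Illustrative rates (not taken from the text); stability needs lam_h < 2*mu_h.
lam_h, mu_h = 1.0, 1.0

# Truncated copy of the birth-and-death chain in Figure 3.2(a):
# state n = number of high priority jobs; arrivals at rate lam_h,
# departures at rate min(n, 2)*mu_h.
N = 200
Q = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = lam_h                 # high priority arrival
    if n > 0:
        Q[n, n - 1] = min(n, 2) * mu_h      # high priority service completion
    Q[n, n] = -Q[n].sum()

# Stationary distribution: pi Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(N + 1)])
b = np.zeros(N + 2); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

EN_h = np.arange(N + 1) @ pi                # mean number of high priority jobs
ET_h = EN_h / lam_h                         # mean response time (Little's law)

rho = lam_h / (2 * mu_h)
print(ET_h, 2 * rho / (1 - rho ** 2) / lam_h)   # numerical vs. closed-form M/M/2
\end{verbatim}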

By contrast, the behavior of the low priority jobs is affected by the number of high priority jobs in the system. Specifically, when there are no high priority jobs in the system, two servers are available for the low priority jobs (see Figure 3.2(b) for the behavior of the low priority jobs in this case); when there is one high priority job in the system, one server is available for the low priority jobs (see Figure 3.2(c)); and when there are two or more high priority jobs in the system, no server is available for the low priority jobs (see Figure 3.2(d)).

Figure 3.2: Markov chains for an M/M/2 queue with two preemptive priority classes. (a) Markov chain for the number of high priority jobs. (b)-(d) Markov chains for the number of low priority jobs conditioned on the number of high priority jobs: (b) no high priority jobs; (c) one high priority job; (d) $\geq 2$ high priority jobs.
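
The three conditional chains in Figure 3.2(b)-(d) differ only in the rate at which low priority jobs complete service. The following small helper (a sketch with hypothetical names, writing mu_l for the low priority service rate) makes this dependence on the number of high priority jobs explicit:

\begin{verbatim}
def low_priority_departure_rate(n_low, n_high, mu_l):
    """Total rate at which low priority jobs complete service.

    With 0 high priority jobs both servers may serve low priority jobs
    (Figure 3.2(b)); with 1 high priority job only one server may
    (Figure 3.2(c)); with >= 2 high priority jobs none may (Figure 3.2(d)).
    """
    free_servers = max(0, 2 - n_high)
    return min(n_low, free_servers) * mu_l
\end{verbatim}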

Observe that the 2D Markov chain in Figure 3.1(b) consists of the Markov chains in Figure 3.2. Specifically, each column in the 2D Markov chain corresponds to the Markov chain in Figure 3.2(a); the first row (respectively, the second row) in the 2D Markov chain corresponds to the Markov chain in Figure 3.2(b) (respectively, Figure 3.2(c)), and each of the other rows in the 2D Markov chain corresponds to the Markov chain in Figure 3.2(d).
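
To make this row/column structure concrete, the following sketch (with the same illustrative rates as above, not values from the text) assembles a truncated copy of the 2D Markov chain directly from these pieces and solves for its stationary distribution; the resulting mean response time of low priority jobs serves later as a baseline for the 1D approximation:

\begin{verbatim}
import numpy as np

lam_h, mu_h = 1.0, 1.0      # illustrative high priority rates
lam_l, mu_l = 0.6, 1.0      # illustrative low priority rates

NH, NL = 40, 40             # truncation levels for the two dimensions
def idx(n_low, n_high):
    return n_low * (NH + 1) + n_high

n_states = (NL + 1) * (NH + 1)
Q = np.zeros((n_states, n_states))
for nl in range(NL + 1):
    for nh in range(NH + 1):
        i = idx(nl, nh)
        if nh < NH:   # high priority arrival: movement within a column (Figure 3.2(a))
            Q[i, idx(nl, nh + 1)] += lam_h
        if nh > 0:    # high priority departure
            Q[i, idx(nl, nh - 1)] += min(nh, 2) * mu_h
        if nl < NL:   # low priority arrival: movement within a row (Figure 3.2(b)-(d))
            Q[i, idx(nl + 1, nh)] += lam_l
        if nl > 0:    # low priority departure: rate depends on the number of high priority jobs
            Q[i, idx(nl - 1, nh)] += min(nl, max(0, 2 - nh)) * mu_l
        Q[i, i] = -Q[i].sum()

A = np.vstack([Q.T, np.ones(n_states)])
b = np.zeros(n_states + 1); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

EN_l = sum(nl * pi[idx(nl, nh)] for nl in range(NL + 1) for nh in range(NH + 1))
print(EN_l / lam_l)          # mean response time of low priority jobs (Little's law)
\end{verbatim}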

Our goal is to reduce this 2D Markov chain to a 1D Markov chain that closely approximates it. Toward this end, observe that the behavior of the low priority jobs is determined only by whether there are 0, 1, or $\geq 2$ high priority jobs in the system. As long as there are $\geq 2$ high priority jobs in the system, the precise number of high priority jobs does not affect the behavior of the low priority jobs. Also, observe that the infinite number of rows in the 2D Markov chain stems from the infinite number of states (levels) in the Markov chain for the high priority jobs (Figure 3.2(a)).

This motivates us to construct a Markov chain, for the high priority jobs, that differentiates only between 0, 1, or $\geq 2$ high priority jobs (see Figure 3.3(b)). In Figure 3.3(b), the sojourn time in states with $\geq 2$ high priority jobs is approximated by a two-phase Coxian$^+$ PH distribution, which we introduce in Section 2.2. Below, we refer to this sojourn time in states with $\geq 2$ high priority jobs as the ``busy period'' (of the high priority jobs), since all the servers are busy serving high priority jobs during this period. Specifically, a busy period is the time from when both servers become busy serving high priority jobs until only one of them remains busy. We will use the moment matching algorithm developed in Chapter 2 to well-represent (see Definition 1) the duration of the busy period by a PH distribution.
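
The busy period has a convenient characterization: while $\geq 2$ high priority jobs are present, their number increases at rate $\lambda_H$ and decreases at rate $2\mu_H$, so its duration is distributed as the busy period of an M/M/1 queue with arrival rate $\lambda_H$ and service rate $2\mu_H$. Its first three moments therefore follow from the standard M/G/1 busy period moment formulas, as in the sketch below (illustrative rates again; the moment matching algorithm of Chapter 2, which is not reproduced here, would then produce the Coxian$^+$ fit from these moments):

\begin{verbatim}
# First three moments of the high priority busy period, i.e. of the busy
# period of an M/M/1 queue with arrival rate lam_h and service rate 2*mu_h,
# via the standard M/G/1 busy period moment formulas.
lam_h, mu_h = 1.0, 1.0                 # illustrative rates (not from the text)
mu = 2 * mu_h                          # both servers work on high priority jobs
rho = lam_h / mu

ES1, ES2, ES3 = 1 / mu, 2 / mu**2, 6 / mu**3   # moments of the exp(mu) service time
EB1 = ES1 / (1 - rho)                          # E[B]
EB2 = ES2 / (1 - rho)**3                       # E[B^2]
EB3 = ES3 / (1 - rho)**4 + 3 * lam_h * ES2**2 / (1 - rho)**5   # E[B^3]

print(EB1, EB2, EB3)
\end{verbatim}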

Figure 3.3: Dimensionality reduction of the 2D Markov chain. (a) Markov chain for the number of high priority jobs on an infinite state space $\Longrightarrow$ (b) Markov chain on a finite state space. (c) 2D Markov chain for the numbers of high priority and low priority jobs $\Longrightarrow$ (d) 1D Markov chain.

Using Figure 3.3(b) as the Markov chain for the high priority jobs, we obtain a 1D Markov chain that models the M/M/2 queue with two priority classes (see Figure 3.3(d)). Observe that the 1D Markov chain tracks the number of low priority jobs exactly, but differentiates only between 0, 1, and $\geq 2$ high priority jobs. Note that if the duration of the busy period were exactly matched by the PH distribution, the behavior of the low priority jobs in the 1D Markov chain (Figure 3.3(d)) would be exactly the same as that in the 2D Markov chain (Figure 3.3(c)). We will see that the 1D Markov chain in Figure 3.3(d) can be analyzed efficiently via matrix analytic methods (see Section 3.2), which allows us to compute the mean response time of low priority jobs.
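
As a rough numerical sketch of this computation (not the implementation of Section 3.2), the following code builds the 1D Markov chain with the level being the number of low priority jobs and the phase indicating 0 high priority jobs, 1 high priority job, or a busy period, and solves it with the basic matrix analytic fixed-point iteration for the rate matrix $R$. For brevity the busy period is represented here by a single exponential matching only its mean, rather than the Coxian$^+$ distribution produced by the algorithm of Chapter 2, and the rates are the same illustrative values as before:

\begin{verbatim}
import numpy as np

lam_h, mu_h = 1.0, 1.0      # illustrative high priority rates
lam_l, mu_l = 0.6, 1.0      # illustrative low priority rates

# Phase process of Figure 3.3(b): phases = (0 high, 1 high, busy period),
# with the busy period crudely replaced by exp(beta), beta = 1/E[busy period].
beta = 2 * mu_h - lam_h
T = np.array([[-lam_h, lam_h, 0.0],
              [mu_h, -(mu_h + lam_h), lam_h],
              [0.0, beta, -beta]])

I = np.eye(3)
A0 = lam_l * I                        # low priority arrival (level up)
A2 = np.diag([2 * mu_l, mu_l, 0.0])   # low priority departure, levels >= 2
A1 = T - lam_l * I - A2               # local transitions, levels >= 2

# Rate matrix R: minimal solution of A0 + R A1 + R^2 A2 = 0 (fixed-point iteration).
A1_inv = np.linalg.inv(A1)
R = np.zeros((3, 3))
for _ in range(10000):
    R_new = -(A0 + R @ R @ A2) @ A1_inv
    if np.max(np.abs(R_new - R)) < 1e-12:
        R = R_new
        break
    R = R_new

# Boundary levels 0 and 1 (fewer low priority jobs than servers, so rates differ).
D1 = np.diag([mu_l, mu_l, 0.0])
B00 = T - lam_l * I
B11 = T - lam_l * I - D1
Z = np.zeros((3, 3))
M = np.block([[B00, lam_l * I, Z],
              [D1, B11, lam_l * I],
              [Z, A2, A1 + R @ A2]])
inv_IR = np.linalg.inv(I - R)
norm = np.concatenate([np.ones(3), np.ones(3), inv_IR @ np.ones(3)])

# Solve (pi_0, pi_1, pi_2) from the boundary balance equations plus normalization;
# pi_l = pi_2 R^(l-2) for levels l >= 2.
A_full = np.hstack([M, norm[:, None]])
b = np.zeros(10); b[-1] = 1.0
pi = np.linalg.lstsq(A_full.T, b, rcond=None)[0]
pi1, pi2 = pi[3:6], pi[6:9]

# E[N_L] = sum_l l * (pi_l 1); Little's law gives the mean response time.
EN_l = pi1.sum() + pi2 @ (2 * inv_IR + R @ inv_IR @ inv_IR) @ np.ones(3)
print(EN_l / lam_l)
\end{verbatim}

Any gap between this output and the truncated 2D chain solved earlier reflects the one-moment busy period simplification (and the truncation); as this chapter argues, matching more moments of the busy period with a Coxian$^+$ distribution makes the 1D chain a much closer approximation.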

