Next, we introduce the full range of ideas behind DR by sketching an analysis of an M/PH/2 queue with two priority classes via DR. We will see that the approach we followed in the case of an M/M/2 queue cannot simply be applied to an M/PH/2 queue. In particular, we will see that a collection of PH distributions is needed to approximate the duration of the busy period. We assume that the service demand of high priority jobs has the two-phase Coxian PH distribution shown in Figure 3.4(a), and the service demand of low priority jobs has an exponential distribution with rate $\mu_L$. As before, the low (respectively, high) priority jobs arrive according to a Poisson process with rate $\lambda_L$ (respectively, $\lambda_H$).
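Concretely, a two-phase Coxian PH distribution can be represented by an initial probability vector $\tau$ and a subgenerator matrix $T$, from which all moments follow in closed form via $E[X^k] = k!\,\tau(-T)^{-k}\mathbf{1}$. The sketch below uses illustrative parameters (the rates `mu1`, `mu2` and branching probability `p` are assumptions, not the values shown in Figure 3.4(a)):

```python
import numpy as np

# Illustrative two-phase Coxian PH distribution (parameters are assumptions,
# not taken from Figure 3.4(a)): service starts in phase 1 with rate mu1; on
# completing phase 1, the job moves to phase 2 (rate mu2) w.p. p, else exits.
mu1, mu2, p = 2.0, 1.0, 0.5
tau = np.array([1.0, 0.0])          # initial phase distribution
T = np.array([[-mu1, p * mu1],      # subgenerator of the Coxian PH
              [0.0, -mu2]])

# k-th moment of a PH(tau, T) distribution: E[X^k] = k! * tau (-T)^{-k} 1
ones = np.ones(2)
M = np.linalg.inv(-T)
m1 = tau @ M @ ones                 # mean service demand
m2 = 2.0 * tau @ M @ M @ ones       # second moment
c2 = m2 / m1**2 - 1.0               # squared coefficient of variation
print(m1, c2)
```

With these particular rates the mean works out to 1 and the squared coefficient of variation to 1; varying `p` and the rates lets the Coxian capture a wide range of variability.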
As in the case of an M/M/2 queue, the mean response time of high priority jobs can be analyzed independently of low priority jobs, since the behavior of the high priority jobs is not affected by the (number of) low priority jobs in the system. Specifically, Figure 3.4(b) shows the Markov chain that exactly models the number of high priority jobs in the system. Here, the states with two or more high priority jobs are differentiated by the ``phases'' of the jobs in service (i.e., phases of the corresponding Coxian PH distributions). In the figure, the numbers shown in states (circles) denote these phases. That is, 11 denotes that the two jobs in service are both in phase 1; 12 denotes that two jobs are in service, where one job is in phase 1 and the other is in phase 2; 22 denotes that the two jobs in service are both in phase 2.
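This chain can be solved numerically once it is truncated at some large number of high priority jobs. The sketch below builds the generator of such a truncated version and solves for its stationary distribution; the rates `lam` (high priority arrival rate), `mu1`, `mu2` and branching probability `p` are illustrative assumptions, as is the truncation level `N`:

```python
import numpy as np

# Truncated sketch of the chain in Figure 3.4(b): states (n, phases in service),
# where the phase string for n >= 2 is one of '11', '12', '22'.
# All rates below are illustrative assumptions, not the values in the figure.
lam, mu1, mu2, p = 0.5, 2.0, 1.0, 0.5
N = 40                                      # truncation level

states = [(0, ''), (1, '1'), (1, '2')]
for n in range(2, N + 1):
    states += [(n, '11'), (n, '12'), (n, '22')]
idx = {s: i for i, s in enumerate(states)}
Q = np.zeros((len(states), len(states)))

def add(src, dst, rate):
    Q[idx[src], idx[dst]] += rate

add((0, ''), (1, '1'), lam)                 # arrival to an empty system
add((1, '1'), (2, '11'), lam)
add((1, '2'), (2, '12'), lam)
add((1, '1'), (1, '2'), p * mu1)            # Coxian: phase 1 -> phase 2
add((1, '1'), (0, ''), (1 - p) * mu1)       # departure from phase 1
add((1, '2'), (0, ''), mu2)                 # departure from phase 2
for n in range(2, N + 1):
    for cfg in ('11', '12', '22'):
        if n < N:
            add((n, cfg), (n + 1, cfg), lam)        # arrival joins the queue
        down = lambda rem: (n - 1, ''.join(sorted(rem + '1'))) if n > 2 else (1, rem)
        k1, k2 = cfg.count('1'), cfg.count('2')
        if k1:                                      # a phase-1 job finishes phase 1
            add((n, cfg), (n, ''.join(sorted(cfg.replace('1', '2', 1)))), k1 * p * mu1)
            add((n, cfg), down('1' if cfg == '11' else '2'), k1 * (1 - p) * mu1)
        if k2:                                      # a phase-2 job departs
            add((n, cfg), down('1' if cfg == '12' else '2'), k2 * mu2)

np.fill_diagonal(Q, -Q.sum(axis=1))
# stationary distribution: pi Q = 0, with pi summing to one
A = np.vstack([Q.T, np.ones(len(states))])
b = np.zeros(len(states) + 1); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]
EN = sum(s[0] * pi[i] for i, s in enumerate(states))   # mean number in system
print(f"P(empty) ~ {pi[idx[(0, '')]]:.4f}, E[N_H] ~ {EN:.4f}")
```

From the stationary distribution, the mean number of high priority jobs (and hence, via Little's law, their mean response time) follows directly.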
As in the case of an M/M/2 queue, the behavior of the low priority jobs is determined by whether there are 0, 1, or $\geq 2$ high priority jobs in the system. By conditioning on the number of high priority jobs in the system, the number of low priority jobs can be modeled by the same Markov chains as in the case of an M/M/2 queue (i.e., Figures 3.2(b)-(d)).
Figure 3.4(c) shows a portion of the 2D Markov chain that models our M/PH/2 queue with two priority classes. Observe that each column of the 2D Markov chain corresponds to the Markov chain in Figure 3.4(b), and each row of the 2D Markov chain corresponds to one of the three Markov chains in Figures 3.2(b)-(d).
Again, our goal is to reduce the 2D Markov chain to a 1D Markov chain that closely approximates the 2D Markov chain. In the case of an M/M/2 queue, this dimensionality reduction is made possible by approximating the duration of the busy period (of the high priority jobs) by a single PH distribution (Figure 3.3(a) to Figure 3.3(b)). Unfortunately, the same approach does not work in the case of the M/PH/2 queue, since the duration of the busy period in the M/PH/2 queue depends on the ``phase'' of the job in service both at the beginning and end of the busy period (more precisely, immediately before the busy period and immediately after the busy period). In the case of the two-phase Coxian PH distribution, there are four different types of busy periods, depending on the phase of the job in service immediately before the busy period starts (phase 1 or 2) and the phase of the job in service immediately after the busy period ends (phase 1 or 2).
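That the four types of busy periods genuinely differ in duration can be seen by simulation. The Monte Carlo sketch below (with illustrative rates `lam`, `mu1`, `mu2` and branching probability `p`, which are assumptions rather than the values behind Figure 3.4) samples busy periods conditioned on the starting phase and buckets them by the ending phase:

```python
import random

# Illustrative rates (assumptions): high priority arrival rate and the
# two-phase Coxian service parameters.
lam, mu1, mu2, p = 0.5, 2.0, 1.0, 0.5

def busy_period(start_phase, rng):
    """Simulate one busy period, which begins when a second high priority job
    arrives while the job already in service is in `start_phase`.  Returns
    (duration, end_phase), where end_phase is the phase of the job left in
    service when the busy period ends."""
    t = 0.0
    queued = 0                       # high priority jobs waiting for a server
    phases = [start_phase, 1]        # phases of the two jobs in service
    while True:
        # competing exponential clocks: one arrival clock, one per server
        rates = [lam] + [mu1 if ph == 1 else mu2 for ph in phases]
        total = sum(rates)
        t += rng.expovariate(total)
        u = rng.random() * total
        if u < lam:
            queued += 1              # new arrival has to wait
            continue
        i = 0 if u < lam + rates[1] else 1      # which server fired
        if phases[i] == 1 and rng.random() < p:
            phases[i] = 2            # Coxian: move from phase 1 to phase 2
        elif queued > 0:
            queued -= 1
            phases[i] = 1            # next waiting job starts in phase 1
        else:
            return t, phases[1 - i]  # a server idles: busy period is over

rng = random.Random(1)
for start in (1, 2):
    samples = {1: [], 2: []}
    for _ in range(5000):
        d, end = busy_period(start, rng)
        samples[end].append(d)
    for end in (1, 2):
        mean = sum(samples[end]) / len(samples[end])
        print(f"type ({start},{end}): mean duration ~ {mean:.3f}")
```

The four estimated conditional means differ, which is exactly why a single PH distribution no longer suffices.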

Figure 3.5 shows a Markov chain on a finite state space for the high priority jobs, where the four types of busy periods are replaced by four two-phase Coxian PH distributions, respectively. The right half of the Markov chain represents the busy period, during which there are $\geq 2$ high priority jobs in the system. Specifically, the two circles on the right side labeled ``busy period of type $(i,j)$'' represent the busy period in which the job in service at the beginning of the busy period is in phase $i$ and the job in service at the end of the busy period is in phase $j$, for $i,j \in \{1,2\}$.
Using Figure 3.5 as the Markov chain for the high priority jobs, we obtain a 1D Markov chain that models the M/PH/2 queue with two priority classes (see Figure 3.6 for a portion of the 1D Markov chain). Observe that the 1D Markov chain tracks the number of low priority jobs exactly, but differentiates only between 0, 1, and $\geq 2$ high priority jobs. When there is one high priority job in the system, the 1D Markov chain also tracks the phase of the high priority job in service.
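Each of the four replacement PH distributions is chosen to match moments of the corresponding conditional busy-period distribution. As a sketch of the flavor of such a fit (a standard two-moment match, valid when the squared coefficient of variation satisfies $C^2 \geq 1/2$; the actual moment matching used in DR may differ), a two-phase Coxian with rates `mu1`, `mu2` and branching probability `p` can be fit as follows:

```python
def coxian2_fit(m1, c2):
    """Return (mu1, p, mu2) for a two-phase Coxian matching mean m1 and
    squared coefficient of variation c2 (requires c2 >= 1/2 so that p <= 1).
    This is a standard two-moment fit, shown here for illustration only."""
    assert m1 > 0 and c2 >= 0.5
    mu1 = 2.0 / m1
    p = 1.0 / (2.0 * c2)
    mu2 = 1.0 / (c2 * m1)
    return mu1, p, mu2

# sanity check: recover the target moments from the fitted parameters
mu1, p, mu2 = coxian2_fit(1.0, 2.0)
mean = 1.0 / mu1 + p / mu2                                    # E[X]
m2 = 2.0 / mu1**2 + 2.0 * p / (mu1 * mu2) + 2.0 * p / mu2**2  # E[X^2]
print(mean, m2 / mean**2 - 1.0)   # recovers m1 = 1 and c2 = 2
```

The check uses the closed-form Coxian moments $E[X] = 1/\mu_1 + p/\mu_2$ and $E[X^2] = 2/\mu_1^2 + 2p/(\mu_1\mu_2) + 2p/\mu_2^2$, confirming that the fitted parameters reproduce the target mean and variability.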

Note that if the following conditions were satisfied, the behavior of the low priority jobs in the 1D Markov chain (Figure 3.6) would be exactly the same as that in the 2D Markov chain (Figure 3.4(c)).