CONTINUOUS TIME MARKOV CHAINS

In all the previous sections, we assumed that the time parameter t was discrete (that is, t = 0, 1, 2, . . .). Such an assumption is suitable for many problems, but there are certain cases (such as for some queueing models considered in Chap. 17) where a continuous time parameter (call it t') is required, because the evolution of the process is being observed continuously over time. The definition of a Markov chain given in Sec. 29.2 also extends to such continuous processes. This section focuses on describing these "continuous time Markov chains" and their properties.

Steady-State Probabilities

Just as the transition probabilities for a discrete time Markov chain satisfy the Chapman-Kolmogorov equations, the continuous time transition probability function also satisfies these equations. Therefore, for any states i and j and nonnegative numbers t and s (0 < s < t),

p_{ij}(t) = \sum_{k} p_{ik}(s) \, p_{kj}(t - s).

The steady-state equation for state j has an intuitive interpretation.
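As a concrete check of the Chapman-Kolmogorov property stated above, the following is a minimal numerical sketch, not taken from the text. It assumes NumPy and SciPy, a hypothetical two-state chain with an illustrative transition rate matrix Q, and the standard fact that the transition probability function can be computed as P(t) = exp(Qt); it then verifies that P(t) equals P(s)P(t - s) for a chosen 0 < s < t.

    # Sketch only: hypothetical two-state continuous time Markov chain.
    import numpy as np
    from scipy.linalg import expm

    # Illustrative transition rate matrix (each row sums to zero):
    # state 0 is left at rate 3, state 1 is left at rate 2.
    Q = np.array([[-3.0,  3.0],
                  [ 2.0, -2.0]])

    def P(t):
        """Transition probability matrix P(t) = exp(Q t)."""
        return expm(Q * t)

    t, s = 5.0, 2.0           # any values with 0 < s < t
    lhs = P(t)                # p_ij(t)
    rhs = P(s) @ P(t - s)     # sum over k of p_ik(s) p_kj(t - s)

    print(np.allclose(lhs, rhs))   # True: the Chapman-Kolmogorov equations hold

The same check works for any choice of s between 0 and t, since splitting the interval at s and summing over the intermediate state k is exactly what the Chapman-Kolmogorov equations assert.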