References & Further Reading

References

  1. S. M. Ross, Introduction to Probability Models, Academic Press, 11th ed., 2014

    Chapters 4--5 cover DTMCs and CTMCs with many worked examples and applications. Accessible and application-oriented.

  2. J. R. Norris, Markov Chains, Cambridge University Press, 1998

    A concise, rigorous treatment of both discrete-time and continuous-time Markov chains. Our development of recurrence, transience, and convergence follows Norris closely.

  3. D. A. Levin, Y. Peres, and E. L. Wilmer, Markov Chains and Mixing Times, American Mathematical Society, 2nd ed., 2017

    The definitive reference on mixing times, coupling arguments, and spectral methods. Chapters 1--5 cover the foundations; later chapters treat advanced topics like cutoff phenomena.

  4. P. Brémaud, Markov Chains: Gibbs Fields, Monte Carlo Simulation, and Queues, Springer, 1999

    Comprehensive treatment connecting Markov chains to MCMC, statistical physics, and queueing theory.

  5. G. R. Grimmett and D. R. Stirzaker, Probability and Random Processes, Oxford University Press, 3rd ed., 2001

    Chapters 6--7 cover Markov chains at an intermediate level with clear proofs and many exercises.

  6. A. Papoulis and S. U. Pillai, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 4th ed., 2002

    Standard engineering reference. Chapter 11 covers Markov sequences and applications.

  7. R. G. Gallager, Stochastic Processes: Theory for Applications, Cambridge University Press, 2013

    Chapters 4--5 give an information-theoretic perspective on Markov chains and renewal theory, with telecommunications applications throughout.

  8. S. Meyn and R. L. Tweedie, Markov Chains and Stochastic Stability, Cambridge University Press, 2nd ed., 2009

    The authoritative reference on stability, ergodicity, and convergence rates for general state-space Markov chains. Advanced.

  9. N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller, Equation of State Calculations by Fast Computing Machines, Journal of Chemical Physics, 1953

    The original Metropolis algorithm paper. One of the most cited papers in computational science.
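The accept/reject step at the heart of the algorithm fits in a few lines. Below is a generic random-walk Metropolis sketch; the standard-normal target, step size, and seed are illustrative choices, not the setup of the 1953 paper:

```python
import math
import random

def metropolis(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional target.

    Proposes x' = x + Uniform(-step, step) and accepts with probability
    min(1, p(x') / p(x)), evaluated in log space for numerical stability.
    """
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step, step)
        log_alpha = log_target(proposal) - log_target(x)
        if log_alpha >= 0 or rng.random() < math.exp(log_alpha):
            x = proposal  # accept; otherwise keep the current state
        samples.append(x)
    return samples

# Unnormalized log-density of a standard normal target.
log_normal = lambda x: -0.5 * x * x
chain = metropolis(log_normal, x0=0.0, n_steps=50_000)
mean = sum(chain) / len(chain)
var = sum((x - mean) ** 2 for x in chain) / len(chain)
```

Note that only the ratio of target densities appears, so the normalizing constant is never needed; this is what makes the method practical for high-dimensional distributions known only up to a constant.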

  10. E. N. Gilbert, Capacity of a Burst-Noise Channel, Bell System Technical Journal, 1960

    Introduces the two-state Markov channel model for bursty errors, widely used in communications system analysis.

  11. E. O. Elliott, Estimates of Error Rates for Codes on Burst-Noise Channels, Bell System Technical Journal, 1963

    Extends Gilbert's model by allowing errors in the good state. The combined Gilbert-Elliott model remains a standard channel abstraction.
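As a sketch of how the combined model behaves, the two-state channel is easy to simulate. The transition and error probabilities below are illustrative assumptions, not values from either paper:

```python
import random

def gilbert_elliott(n_bits, p_gb=0.01, p_bg=0.1, e_good=0.0, e_bad=0.5, seed=0):
    """Simulate error indicators from a Gilbert-Elliott channel.

    A two-state Markov chain alternates between a 'good' and a 'bad'
    state (good -> bad w.p. p_gb, bad -> good w.p. p_bg); each bit is
    in error w.p. e_good or e_bad depending on the current state.
    Gilbert's original model has e_good = 0; Elliott's extension allows
    e_good > 0. Returns a list of 0/1 error indicators.
    """
    rng = random.Random(seed)
    bad = False
    errors = []
    for _ in range(n_bits):
        # Transition first, then draw an error in the new state.
        bad = (rng.random() < p_gb) if not bad else (rng.random() >= p_bg)
        rate = e_bad if bad else e_good
        errors.append(1 if rng.random() < rate else 0)
    return errors

errs = gilbert_elliott(100_000)
# Stationary P(bad) = p_gb / (p_gb + p_bg) = 1/11 here, so the long-run
# bit error rate is roughly (1/11) * 0.5 ~ 0.045 for these parameters.
ber = sum(errs) / len(errs)
```

Although the average error rate is modest, the errors cluster into bursts of mean length 1/p_bg, which is exactly the behavior that memoryless (i.i.d.) error models fail to capture.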

  12. J. G. Proakis and M. Salehi, Digital Communications, McGraw-Hill, 5th ed., 2008

    The standard digital communications textbook. Chapter 14 covers channel models including the Gilbert-Elliott model and ARQ protocols.

  13. Y. Sun, E. Uysal-Biyikoglu, and G. Caire, Age of Information Under Random Updates, 2021

    CommIT group contribution: Markov chain analysis of status update systems, optimizing age of information under energy constraints.

Further Reading

Suggestions for deeper exploration of topics introduced in this chapter.

  • Mixing times and cutoff phenomena

    D. A. Levin, Y. Peres, and E. L. Wilmer, Markov Chains and Mixing Times, Chapters 7--18

    The mixing time determines how quickly MCMC converges. Understanding spectral gaps, conductance, and coupling provides tools for bounding mixing time.
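For small chains the worst-case total variation distance d(t) can be computed directly by powering the transition matrix. The two-state chain below is an assumed example; its second eigenvalue is 0.7, so d(t) decays geometrically at that rate:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def worst_case_tv(P, pi, t):
    """d(t) = max over start states x of || P^t(x, .) - pi ||_TV."""
    n = len(P)
    Pt = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(t):
        Pt = mat_mul(Pt, P)
    return max(0.5 * sum(abs(row[j] - pi[j]) for j in range(n)) for row in Pt)

# Two-state chain with eigenvalues 1 and 0.7; pi = (2/3, 1/3) solves pi P = pi.
P = [[0.9, 0.1], [0.2, 0.8]]
pi = [2 / 3, 1 / 3]
# For this chain d(t) = (2/3) * 0.7**t exactly, so the mixing time
# t_mix(1/4) is the smallest t with (2/3) * 0.7**t <= 1/4.
d10 = worst_case_tv(P, pi, 10)
```

Matrix powering works only when the state space is small enough to enumerate; the spectral, conductance, and coupling bounds treated by Levin, Peres, and Wilmer exist precisely because interesting chains (e.g. MCMC over configurations) are far too large for this direct computation.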

  • Markov decision processes

    M. L. Puterman, Markov Decision Processes, Wiley, 2005

    When a controller can influence transitions, the DTMC becomes a Markov decision process (MDP). MDPs are the foundation of reinforcement learning and optimal resource allocation in networks.

  • Hidden Markov models

    L. R. Rabiner, A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition, Proc. IEEE, 1989

    The Gilbert-Elliott channel is an HMM. The Baum-Welch (EM) algorithm and Viterbi decoding are key tools for channel estimation and sequence detection.
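As a minimal sketch of Viterbi decoding on such an HMM (all probabilities below are illustrative, loosely mirroring a Gilbert-Elliott channel with state 0 "good" and state 1 "bad"), the decoder recovers the most likely hidden state path from an observed error pattern:

```python
def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a small HMM.

    pi[s]: initial probability, A[s][t]: transition, B[s][o]: emission.
    (A log-space version is preferable for long sequences; plain
    probabilities keep this sketch short.)
    """
    n = len(pi)
    delta = [pi[s] * B[s][obs[0]] for s in range(n)]  # best-path probabilities
    psi = []                                          # backpointers
    for o in obs[1:]:
        back = [max(range(n), key=lambda sp, s=s: delta[sp] * A[sp][s])
                for s in range(n)]
        delta = [delta[back[s]] * A[back[s]][s] * B[s][o] for s in range(n)]
        psi.append(back)
    # Backtrack from the best final state.
    path = [max(range(n), key=lambda s: delta[s])]
    for back in reversed(psi):
        path.append(back[path[-1]])
    return path[::-1]

# Gilbert-Elliott-style HMM: state 0 = good, state 1 = bad.
pi = [0.9, 0.1]
A = [[0.95, 0.05], [0.10, 0.90]]   # state transitions
B = [[0.99, 0.01], [0.50, 0.50]]   # B[state][obs], obs 1 = bit error
path = viterbi([0, 0, 1, 1, 1, 0, 0, 0, 0], pi, A, B)
```

With these parameters the run of observed errors in the middle is attributed to a visit to the bad state, which is the channel-estimation use Rabiner's tutorial formalizes.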

  • Random walks on groups and expander graphs

    S. Hoory, N. Linial, and A. Wigderson, Expander Graphs and Their Applications, Bull. AMS, 2006

    Expander graphs have rapid mixing, connecting Markov chain theory to coding theory (LDPC codes), derandomization, and network design.

  • Continuous-time Markov chains and Poisson processes

    FSP Chapter 18

    The natural extension of DTMCs to continuous time, essential for queueing theory and network analysis.