References & Further Reading

References

  1. C. E. Shannon, A mathematical theory of communication, Bell System Technical Journal, 1948

    The foundational paper of information theory. Establishes the channel coding theorem, source coding theorem, and separation principle.

  2. C. E. Shannon, The zero error capacity of a noisy channel, IRE Transactions on Information Theory, 1956

    Introduces the zero-error capacity concept and proves that feedback does not increase DMC capacity.

  3. T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley-Interscience, 2nd ed., 2006

    Chapters 7-8 cover channel capacity, the channel coding theorem, and differential entropy. The standard textbook treatment.

  4. G. Caire, Information Theory and Coding, Chapter 4: Channel Capacity, 2024

    Lecture notes from Prof. Caire's course. The proof structure in this chapter follows these notes.

  5. A. El Gamal and Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011

    Chapters 3-5 cover point-to-point channel coding with a network information theory perspective.

  6. R. Blahut, Computation of channel capacity and rate-distortion functions, IEEE Transactions on Information Theory, 1972

    Introduces the alternating optimization algorithm for computing channel capacity. The algorithm predates and anticipates the EM algorithm.

  7. S. Arimoto, An algorithm for computing the capacity of arbitrary discrete memoryless channels, IEEE Transactions on Information Theory, 1972

    Independent derivation of the capacity computation algorithm, published the same year as Blahut's paper.

  8. J. P. M. Schalkwijk and T. Kailath, A coding scheme for additive noise channels with feedback, Part I: No bandwidth constraint, IEEE Transactions on Information Theory, 1966

    The celebrated feedback scheme for the Gaussian channel achieving doubly exponential error decay.

  9. J. Wolfowitz, Coding Theorems of Information Theory, Springer-Verlag, 1961

    Rigorous treatment of channel coding theorems including the strong converse for DMCs.

  10. I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Cambridge University Press, 2nd ed., 2011

    The most rigorous treatment of DMC coding theorems using the method of types. Chapters 2-3 cover capacity and error exponents.
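The alternating optimization introduced independently by Blahut and Arimoto (entries 6-7) is simple enough to state in a few lines. The following is a minimal NumPy sketch, not code from either paper; the function name, iteration count, and convergence tolerance are my own choices. It iterates between the induced posterior q(x|y) and the input-distribution update p(x) ∝ exp(Σ_y W(y|x) log q(x|y)):

```python
import numpy as np

def blahut_arimoto(W, iters=500, tol=1e-12):
    """Estimate the capacity (in bits) of a DMC with transition matrix
    W[x, y] = P(y | x), via Blahut-Arimoto alternating optimization.
    Assumes every output column of W has positive probability mass."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)              # start from the uniform input law
    for _ in range(iters):
        # q[x, y] = posterior P(x | y) induced by the current p
        q = p[:, None] * W
        q /= q.sum(axis=0, keepdims=True)
        # p(x) proportional to exp( sum_y W(y|x) log q(x|y) )
        with np.errstate(divide="ignore", invalid="ignore"):
            logq = np.where(W > 0, np.log(q), 0.0)  # 0*log0 := 0 convention
        r = np.exp((W * logq).sum(axis=1))
        p_new = r / r.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # mutual information I(p; W) in bits at the final input distribution
    py = p @ W
    ratio = np.where(W > 0, W / py[None, :], 1.0)
    C = (p[:, None] * W * np.log2(ratio)).sum()
    return C, p

# Sanity check on a BSC with crossover 0.1: C = 1 - h(0.1), optimal p uniform
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p = blahut_arimoto(W)
```

Each iterate can only increase the mutual information, which is why the scheme is often cited (as entry 6's annotation notes) as an ancestor of the EM algorithm.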

Further Reading

The following works are recommended for deeper exploration of channel coding theory and its extensions.

  • Error exponents and reliability function

    R. G. Gallager, 'Information Theory and Reliable Communication,' Wiley, 1968, Chapters 5-6.

    Develops the random coding exponent, sphere-packing exponent, and the complete reliability function for DMCs.

  • Polar codes and channel polarization

    E. Arıkan, 'Channel Polarization: A Method for Constructing Capacity-Achieving Codes for Symmetric Binary-Input Discrete Memoryless Channels,' IEEE Trans. IT, 2009.

    The first explicit, low-complexity code construction proven to achieve DMC capacity.

  • Burnashev's exponent with feedback

    M. V. Burnashev, 'Data transmission over a discrete channel with feedback: Random transmission time,' Problemy Peredachi Informatsii, 1976.

    Establishes the optimal error exponent with feedback, showing the dramatic improvement over the no-feedback case.