References & Further Reading
References
- T. M. Cover and J. A. Thomas, *Elements of Information Theory*, 2nd ed., Wiley, 2006
Chapter 10 covers the rate-distortion function and theorem. Chapter 13 discusses universal lossy coding and successive refinement.
- C. E. Shannon, 'Coding Theorems for a Discrete Source with a Fidelity Criterion,' IRE National Convention Record, 1959
Shannon's original paper defining the rate-distortion function and proving the coding theorem. Introduces the separation theorem for joint source-channel coding.
- T. Berger, *Rate Distortion Theory: A Mathematical Basis for Data Compression*, Prentice-Hall, 1971
The first comprehensive textbook on rate-distortion theory. Develops the theory in full generality including continuous sources and abstract distortion measures.
- A. D. Wyner and J. Ziv, 'The Rate-Distortion Function for Source Coding with Side Information at the Decoder,' IEEE Trans. Information Theory, 1976
Introduces the Wyner-Ziv problem and proves the surprising result that for jointly Gaussian sources with squared-error distortion, there is no rate loss when the side information is available only at the decoder.
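As a quick numerical illustration of that no-rate-loss result (a sketch; the additive model $X = Y + N$ and the specific numbers below are assumptions, not from the paper):

```python
import math

# Quadratic-Gaussian Wyner-Ziv sketch: X = Y + N, with N ~ N(0, sigma_n2)
# independent of Y, and squared-error distortion target D.
sigma_n2 = 4.0   # Var(X | Y), i.e. the noise variance (illustrative)
D = 1.0          # target distortion (illustrative)

# Conditional rate-distortion function, side information at BOTH ends:
# R_{X|Y}(D) = (1/2) log2(sigma_n2 / D) bits per sample.
r_conditional = 0.5 * math.log2(sigma_n2 / D)

# Wyner-Ziv's Gaussian result: side information at the decoder alone
# achieves the same rate -- zero rate loss for this source/distortion pair.
r_wyner_ziv = r_conditional

print(r_conditional)  # 1.0 bit per sample
```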
- W. H. R. Equitz and T. M. Cover, 'Successive Refinement of Information,' IEEE Trans. Information Theory, 1991
Characterizes when a source is successively refinable. Shows that Gaussian sources under squared-error distortion have this property at all distortion levels.
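The Gaussian case can be checked numerically from $R(D) = \tfrac{1}{2}\log_2(\sigma^2/D)$: describing the source in a coarse stage followed by a refinement stage costs no extra rate. A minimal sketch (the distortion targets are illustrative):

```python
import math

def gaussian_rd(variance, distortion):
    """Rate-distortion function of a Gaussian source under squared error:
    R(D) = (1/2) log2(sigma^2 / D) bits per sample, and 0 for D >= sigma^2."""
    return max(0.0, 0.5 * math.log2(variance / distortion))

sigma2 = 1.0
d1, d2 = 0.25, 0.01                  # coarse, then fine distortion targets

r1 = gaussian_rd(sigma2, d1)         # rate of the coarse first stage
r_total = gaussian_rd(sigma2, d2)    # rate needed to reach the fine target
r_refine = r_total - r1              # incremental rate of the refinement stage

# Successive refinability: the refinement stage needs exactly
# (1/2) log2(d1/d2) extra bits -- no rate is lost by encoding in stages.
assert abs(r_refine - 0.5 * math.log2(d1 / d2)) < 1e-12

print(r1, r_refine, r_total)  # r1 = 1.0 bit; r1 + r_refine = r_total
```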
- H. Gish and J. N. Pierce, 'Asymptotically Efficient Quantizing,' IEEE Trans. Information Theory, 1968
Proves the 1.53 dB gap result: at high rates, uniform quantization followed by entropy coding achieves distortion within a factor of $\pi e/6$ of the rate-distortion bound for any smooth source density.
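The 1.53 dB figure follows directly from the factor $\pi e/6$, whether the gap is expressed multiplicatively in distortion or additively in rate. A quick check:

```python
import math

# High-rate gap of entropy-coded uniform (scalar) quantization from R(D):
# distortion is larger by a factor of pi*e/6; equivalently, the rate is
# larger by (1/2) log2(pi*e/6) bits per sample.
factor = math.pi * math.e / 6          # ~ 1.423

gap_db = 10 * math.log10(factor)       # multiplicative distortion gap in dB
gap_bits = 0.5 * math.log2(factor)     # additive rate gap in bits/sample

print(f"{gap_db:.2f} dB")              # 1.53 dB
print(f"{gap_bits:.3f} bits/sample")   # 0.255 bits/sample
```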
- R. M. Gray and D. L. Neuhoff, 'Quantization,' IEEE Trans. Information Theory, 1998
A comprehensive survey of quantization theory connecting rate-distortion theory to practical quantizer design. Covers Lloyd-Max, lattice, and vector quantization.
Further Reading
For readers who want to go deeper into rate-distortion theory and its applications.
Vector quantization and lattice codes
T. M. Cover and J. A. Thomas, *Elements of Information Theory*, 2nd ed., Wiley, 2006, Ch. 13.7
Vector quantization can close the 1.53 dB gap left by entropy-coded scalar quantization. Lattice quantizers provide structured constructions that approach R(D).
Rate-distortion theory for non-i.i.d. sources
T. Berger, *Rate Distortion Theory*, Prentice-Hall, 1971, Ch. 6–7
Extends rate-distortion theory to sources with memory, including Markov sources and stationary ergodic processes. The entropy rate replaces the single-letter entropy.
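For a concrete instance of the entropy-rate point, consider a symmetric binary Markov source that flips state with probability q at each step: its entropy rate is the binary entropy h(q), which can sit well below the 1-bit marginal entropy. A sketch with an illustrative q:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Symmetric binary Markov chain: flips state with probability q each step
# (q = 0.1 is illustrative). By symmetry the stationary distribution is uniform.
q = 0.1
pi0 = 0.5

entropy_rate = h2(q)     # H(X_n | X_{n-1}) = h2(q) bits/symbol for the chain
single_letter = h2(pi0)  # marginal entropy H(X_n) = 1 bit/symbol

print(entropy_rate, single_letter)  # ~0.469 vs 1.0 bits/symbol
```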
Perceptual distortion measures
Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli, 'Image Quality Assessment: From Error Visibility to Structural Similarity,' IEEE Trans. Image Processing, vol. 13, no. 4, pp. 600–612, Apr. 2004
Squared error is mathematically convenient but perceptually inadequate. SSIM and its descendants (MS-SSIM, LPIPS) are widely used alternatives. Rate-distortion theory extends to such measures, but the resulting R(D) curves are harder to compute.
Distributed lossy compression
A. El Gamal and Y.-H. Kim, *Network Information Theory*, Cambridge University Press, 2012, Ch. 12
Extends Wyner-Ziv to multi-terminal settings with multiple sources and decoders. Covered in Chapters 7–8 of this book.