Chapter Summary

Key Points

  1. The Interference Channel. The capacity of the two-user interference channel remains the most famous open problem in information theory. The Han-Kobayashi scheme (rate splitting into common and private parts) is the best known inner bound. Etkin, Tse, and Wang showed that a simple HK variant achieves within 1 bit of capacity for all Gaussian IC parameters: the problem is "almost solved" for engineering purposes, but the exact capacity in the moderate interference regime remains open. (A sketch of the resulting "W curve" follows this list.)

  2. The General Relay Network. The cut-set bound provides an elegant outer bound for relay networks, but it is not always tight: it allows full cooperation among the nodes on each side of the cut and ignores causality constraints. Compress-forward achieves within 1/2 bit of the cut-set bound for the Gaussian relay channel. Noisy network coding generalizes this to multi-relay networks, but the gap grows with network size. The exact capacity of even the simplest relay channel (one source, one relay, one destination) remains open in general. (A numerical sketch comparing the two bounds follows this list.)

  3. Non-Shannon Inequalities. For four or more random variables, there exist valid information inequalities that cannot be derived from the basic Shannon-type inequalities, i.e., from the non-negativity of conditional mutual information. The Zhang-Yeung inequality (1998) was the first such discovery. Non-Shannon inequalities have implications for network coding capacity and distributed source coding, making the characterization of multi-terminal problems fundamentally harder than expected. (The inequality is reproduced after this list.)

  4. Information Theory for 6G. ISAC (integrated sensing and communication) introduces the capacity-distortion tradeoff: the price of sensing in terms of communication rate (a schematic statement follows this list). Near-field information theory with extremely large arrays opens new spatial multiplexing possibilities. RIS-aided communication and AI-native protocols are active research frontiers where information theory provides the fundamental limits.

  5. Reading and Writing IT Papers. A capacity proof has two halves: achievability (typically via random coding) and a converse (typically via Fano's inequality). The most common errors are conflating inner bounds with capacity, ignoring CSI assumptions, mistaking DoF for finite-SNR performance, and applying asymptotic results at finite blocklength. Critical reading starts with the assumptions, not the conclusions. (The converse skeleton is sketched after this list.)
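
For Key Point 1, a minimal sketch of the Etkin-Tse-Wang symmetric generalized-degrees-of-freedom "W curve" for the two-user Gaussian IC, with alpha = log INR / log SNR; the function name and the closed form used here are our transcription of the standard result, not code from the ETW paper.

```python
def gdof_symmetric(alpha: float) -> float:
    """Symmetric generalized degrees of freedom per user for the two-user
    Gaussian interference channel (the Etkin-Tse-Wang "W curve"),
    with alpha = log(INR) / log(SNR)."""
    return min(
        1.0,                              # interference-free ceiling
        max(alpha, 1.0 - alpha),          # weak / strong interference branches
        max(alpha / 2, 1.0 - alpha / 2),  # Han-Kobayashi rate-splitting branch
    )

# The "W" shape: dips to 1/2 at alpha = 1/2 and alpha = 1, back to 1 for alpha >= 2.
for a in (0.0, 0.5, 2 / 3, 1.0, 1.5, 2.0, 3.0):
    print(f"alpha = {a:4.2f} -> d_sym = {gdof_symmetric(a):.3f}")
```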
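
For Key Point 2, a minimal numerical sketch assuming the standard full-duplex Gaussian relay model: the cut-set upper bound, maximized over the source-relay input correlation, against one textbook form of the compress-forward rate. The SNR values and function names are illustrative, not from any specific reference implementation.

```python
import numpy as np

def C(x):
    """Gaussian capacity function, bits per channel use."""
    return 0.5 * np.log2(1.0 + x)

def cutset_bound(snr_sd, snr_sr, snr_rd):
    """Cut-set upper bound for the full-duplex Gaussian relay channel,
    maximized over the source-relay input correlation rho."""
    rho = np.linspace(0.0, 1.0, 1001)
    bc_cut = C((1.0 - rho**2) * (snr_sd + snr_sr))                       # cut around the source
    mac_cut = C(snr_sd + snr_rd + 2.0 * rho * np.sqrt(snr_sd * snr_rd))  # cut around the destination
    return float(np.max(np.minimum(bc_cut, mac_cut)))

def compress_forward(snr_sd, snr_sr, snr_rd):
    """One textbook form of the compress-forward rate: the relay Wyner-Ziv
    compresses its observation with noise variance sigma2 chosen so the
    description fits through the relay-destination link."""
    sigma2 = (1.0 + snr_sd + snr_sr) / snr_rd
    return C(snr_sd + snr_sr / (1.0 + sigma2))

snr = 100.0  # 20 dB on every link, purely illustrative
ub = cutset_bound(snr, snr, snr)
lb = compress_forward(snr, snr, snr)
print(f"cut-set: {ub:.3f}  compress-forward: {lb:.3f}  gap: {ub - lb:.3f} bits (< 0.5)")
```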
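
For Key Point 3, the Zhang-Yeung inequality in its commonly cited form, valid for any four random variables A, B, C, D (our transcription):

```latex
% Zhang-Yeung (1998): for any four random variables A, B, C, D,
2\,I(C;D) \;\le\; I(A;B) + I(A;C,D) + 3\,I(C;D \mid A) + I(C;D \mid B).
```

No finite combination of the basic inequalities I(X;Y|Z) >= 0 implies it, which is exactly what makes it non-Shannon-type.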
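
For Key Point 4, one schematic formalization of the capacity-distortion tradeoff; the precise model (who estimates the state S, and from which observations) varies across the ISAC literature, so this is a sketch rather than a definitive statement:

```latex
% Schematic capacity-distortion function: the largest reliable rate
% subject to a sensing-distortion constraint D on the state estimate.
C(D) \;=\; \max_{\,p(x)\;:\;\mathbb{E}[d(S,\hat{S})] \le D\,} I(X;Y).
```

C(D) is non-decreasing in D, and relaxing the constraint (D to infinity) recovers the unconstrained capacity; the loss relative to that capacity is the "price of sensing" mentioned above.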
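
For Key Point 5, the standard converse skeleton for a discrete memoryless channel without feedback:

```latex
% Converse via Fano's inequality: W uniform on {1, ..., 2^{nR}},
% decoded as \hat{W} = g(Y^n) with error probability P_e^{(n)}.
\begin{align*}
nR = H(W) &= I(W;Y^n) + H(W \mid Y^n) \\
          &\le I(X^n;Y^n) + 1 + nR\,P_e^{(n)} && \text{(data processing; Fano)} \\
          &\le \sum_{i=1}^{n} I(X_i;Y_i) + n\epsilon_n, && \epsilon_n \to 0 \text{ as } P_e^{(n)} \to 0.
\end{align*}
```

Dividing by n and letting P_e^{(n)} vanish gives R <= max_{p(x)} I(X;Y); achievability supplies the matching random-coding half.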

Looking Ahead

This chapter concludes the ITA book. The reader who has worked through all 30 chapters has a comprehensive foundation in information theory β€” from Shannon's entropy to the frontiers of semantic communication and 6G. The open problems described here are invitations: the field is far from complete, and the most exciting results may be yet to come. We encourage the reader to pick a problem, sharpen the tools learned in this book, and contribute to the next chapter of information theory.