Network Information Theory

Book Description

This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, the book guides readers through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only a basic knowledge of probability, and the unified proofs of the coding theorems rest on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia.
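To give a flavor of the point-to-point theory the book builds from (Section 1.3 and Chapter 3), Shannon's channel coding theorem characterizes the capacity of a discrete memoryless channel p(y|x) as the maximum mutual information over input distributions. In standard notation (a sketch of the classical result, not a quotation of the book's own statement):

    C = \max_{p(x)} I(X; Y)

For the Gaussian channel with average signal-to-noise ratio S (Section 3.4), this specializes to

    C = \tfrac{1}{2} \log(1 + S)   bits per transmission.

The network information theory developed in Parts II through IV extends this single-sender, single-receiver result to multiple access, broadcast, interference, relay, and general multihop settings.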

Table of Contents

  1. Cover
  2. Title
  3. Copyright
  4. Dedication
  5. Contents
  6. Preface
  7. Acknowledgments
  8. Notation
  9. 1 Introduction
    1. 1.1 Network Information Flow Problem
    2. 1.2 Max-Flow Min-Cut Theorem
    3. 1.3 Point-to-Point Information Theory
    4. 1.4 Network Information Theory
  10. Part I Preliminaries
    1. 2 Information Measures and Typicality
      1. 2.1 Entropy
      2. 2.2 Differential Entropy
      3. 2.3 Mutual Information
      4. 2.4 Typical Sequences
      5. 2.5 Jointly Typical Sequences
      6. Summary
      7. Bibliographic Notes
      8. Problems
      9. Appendix 2A Proof of the Conditional Typicality Lemma
    2. 3 Point-to-Point Information Theory
      1. 3.1 Channel Coding
      2. 3.2 Packing Lemma
      3. 3.3 Channel Coding with Input Cost
      4. 3.4 Gaussian Channel
      5. 3.5 Lossless Source Coding
      6. 3.6 Lossy Source Coding
      7. 3.7 Covering Lemma
      8. 3.8 Quadratic Gaussian Source Coding
      9. 3.9 Joint Source–Channel Coding
      10. Summary
      11. Bibliographic Notes
      12. Problems
      13. Appendix 3A Proof of Lemma 3.2
  11. Part II Single-Hop Networks
    1. 4 Multiple Access Channels
      1. 4.1 Discrete Memoryless Multiple Access Channel
      2. 4.2 Simple Bounds on the Capacity Region
      3. 4.3 Multiletter Characterization of the Capacity Region
      4. 4.4 Time Sharing
      5. 4.5 Single-Letter Characterization of the Capacity Region
      6. 4.6 Gaussian Multiple Access Channel
      7. 4.7 Extensions to More than Two Senders
      8. Summary
      9. Bibliographic Notes
      10. Problems
      11. Appendix 4A Cardinality Bound on Q
    2. 5 Degraded Broadcast Channels
      1. 5.1 Discrete Memoryless Broadcast Channel
      2. 5.2 Simple Bounds on the Capacity Region
      3. 5.3 Superposition Coding Inner Bound
      4. 5.4 Degraded DM-BC
      5. 5.5 Gaussian Broadcast Channel
      6. 5.6 Less Noisy and More Capable Broadcast Channels
      7. 5.7 Extensions
      8. Summary
      9. Bibliographic Notes
      10. Problems
    3. 6 Interference Channels
      1. 6.1 Discrete Memoryless Interference Channel
      2. 6.2 Simple Coding Schemes
      3. 6.3 Strong Interference
      4. 6.4 Gaussian Interference Channel
      5. 6.5 Han–Kobayashi Inner Bound
      6. 6.6 Injective Deterministic IC
      7. 6.7 Capacity Region of the Gaussian IC within Half a Bit
      8. 6.8 Deterministic Approximation of the Gaussian IC
      9. 6.9 Extensions to More than Two User Pairs
      10. Summary
      11. Bibliographic Notes
      12. Problems
      13. Appendix 6A Proof of Lemma 6.2
      14. Appendix 6B Proof of Proposition 6.1
    4. 7 Channels with State
      1. 7.1 Discrete Memoryless Channel with State
      2. 7.2 Compound Channel
      3. 7.3 Arbitrarily Varying Channel
      4. 7.4 Channels with Random State
      5. 7.5 Causal State Information Available at the Encoder
      6. 7.6 Noncausal State Information Available at the Encoder
      7. 7.7 Writing on Dirty Paper
      8. 7.8 Coded State Information
      9. Summary
      10. Bibliographic Notes
      11. Problems
    5. 8 General Broadcast Channels
      1. 8.1 DM-BC with Degraded Message Sets
      2. 8.2 Three-Receiver Multilevel DM-BC
      3. 8.3 Marton’s Inner Bound
      4. 8.4 Marton’s Inner Bound with Common Message
      5. 8.5 Outer Bounds
      6. 8.6 Inner Bounds for More than Two Receivers
      7. Summary
      8. Bibliographic Notes
      9. Problems
      10. Appendix 8A Proof of the Mutual Covering Lemma
      11. Appendix 8B Proof of the Nair–El Gamal Outer Bound
    6. 9 Gaussian Vector Channels
      1. 9.1 Gaussian Vector Point-to-Point Channel
      2. 9.2 Gaussian Vector Multiple Access Channel
      3. 9.3 Gaussian Vector Broadcast Channel
      4. 9.4 Gaussian Product Broadcast Channel
      5. 9.5 Vector Writing on Dirty Paper
      6. 9.6 Gaussian Vector BC with Private Messages
      7. Summary
      8. Bibliographic Notes
      9. Problems
      10. Appendix 9A Proof of the BC–MAC Duality Lemma
      11. Appendix 9B Uniqueness of the Supporting Line
    7. 10 Distributed Lossless Compression
      1. 10.1 Distributed Lossless Source Coding for a 2-DMS
      2. 10.2 Inner and Outer Bounds on the Optimal Rate Region
      3. 10.3 Slepian–Wolf Theorem
      4. 10.4 Lossless Source Coding with a Helper
      5. 10.5 Extensions to More than Two Sources
      6. Summary
      7. Bibliographic Notes
      8. Problems
    8. 11 Lossy Compression with Side Information
      1. 11.1 Simple Special Cases
      2. 11.2 Causal Side Information Available at the Decoder
      3. 11.3 Noncausal Side Information Available at the Decoder
      4. 11.4 Source Coding When Side Information May Be Absent
      5. Summary
      6. Bibliographic Notes
      7. Problems
      8. Appendix 11A Proof of Lemma 11.1
    9. 12 Distributed Lossy Compression
      1. 12.1 Berger–Tung Inner Bound
      2. 12.2 Berger–Tung Outer Bound
      3. 12.3 Quadratic Gaussian Distributed Source Coding
      4. 12.4 Quadratic Gaussian CEO Problem
      5. 12.5 Suboptimality of Berger–Tung Coding
      6. Summary
      7. Bibliographic Notes
      8. Problems
      9. Appendix 12A Proof of the Markov Lemma
      10. Appendix 12B Proof of Lemma 12.3
      11. Appendix 12C Proof of Lemma 12.4
      12. Appendix 12D Proof of Lemma 12.6
    10. 13 Multiple Description Coding
      1. 13.1 Multiple Description Coding for a DMS
      2. 13.2 Simple Special Cases
      3. 13.3 El Gamal–Cover Inner Bound
      4. 13.4 Quadratic Gaussian Multiple Description Coding
      5. 13.5 Successive Refinement
      6. 13.6 Zhang–Berger Inner Bound
      7. Summary
      8. Bibliographic Notes
      9. Problems
    11. 14 Joint Source–Channel Coding
      1. 14.1 Lossless Communication of a 2-DMS over a DM-MAC
      2. 14.2 Lossless Communication of a 2-DMS over a DM-BC
      3. 14.3 A General Single-Hop Network
      4. Summary
      5. Bibliographic Notes
      6. Problems
      7. Appendix 14A Proof of Lemma 14.1
  12. Part III Multihop Networks
    1. 15 Graphical Networks
      1. 15.1 Graphical Multicast Network
      2. 15.2 Capacity of Graphical Unicast Network
      3. 15.3 Capacity of Graphical Multicast Network
      4. 15.4 Graphical Multimessage Network
      5. Summary
      6. Bibliographic Notes
      7. Problems
      8. Appendix 15A Proof of Lemma 15.1
    2. 16 Relay Channels
      1. 16.1 Discrete Memoryless Relay Channel
      2. 16.2 Cutset Upper Bound on the Capacity
      3. 16.3 Direct-Transmission Lower Bound
      4. 16.4 Decode–Forward Lower Bound
      5. 16.5 Gaussian Relay Channel
      6. 16.6 Partial Decode–Forward Lower Bound
      7. 16.7 Compress–Forward Lower Bound
      8. 16.8 RFD Gaussian Relay Channel
      9. 16.9 Lookahead Relay Channels
      10. Summary
      11. Bibliographic Notes
      12. Problems
      13. Appendix 16A Cutset Bound for the Gaussian RC
      14. Appendix 16B Partial Decode–Forward for the Gaussian RC
      15. Appendix 16C Equivalent Compress–Forward Lower Bound
    3. 17 Interactive Channel Coding
      1. 17.1 Point-to-Point Communication with Feedback
      2. 17.2 Multiple Access Channel with Feedback
      3. 17.3 Broadcast Channel with Feedback
      4. 17.4 Relay Channel with Feedback
      5. 17.5 Two-Way Channel
      6. 17.6 Directed Information
      7. Summary
      8. Bibliographic Notes
      9. Problems
      10. Appendix 17A Proof of Lemma 17.1
    4. 18 Discrete Memoryless Networks
      1. 18.1 Discrete Memoryless Multicast Network
      2. 18.2 Network Decode–Forward
      3. 18.3 Noisy Network Coding
      4. 18.4 Discrete Memoryless Multimessage Network
      5. Summary
      6. Bibliographic Notes
      7. Problems
    5. 19 Gaussian Networks
      1. 19.1 Gaussian Multimessage Network
      2. 19.2 Capacity Scaling Laws
      3. 19.3 Gupta–Kumar Random Network
      4. Summary
      5. Bibliographic Notes
      6. Problems
      7. Appendix 19A Proof of Lemma 19.1
      8. Appendix 19B Proof of Lemma 19.2
    6. 20 Compression over Graphical Networks
      1. 20.1 Distributed Lossless Source–Network Coding
      2. 20.2 Multiple Description Network Coding
      3. 20.3 Interactive Source Coding
      4. Summary
      5. Bibliographic Notes
      6. Problems
      7. Appendix 20A Proof of Lemma 20.1
  13. Part IV Extensions
    1. 21 Communication for Computing
      1. 21.1 Coding for Computing with Side Information
      2. 21.2 Distributed Coding for Computing
      3. 21.3 Interactive Coding for Computing
      4. 21.4 Cascade Coding for Computing
      5. 21.5 Distributed Lossy Averaging
      6. 21.6 Computing over a MAC
      7. Summary
      8. Bibliographic Notes
      9. Problems
    2. 22 Information Theoretic Secrecy
      1. 22.1 Wiretap Channel
      2. 22.2 Confidential Communication via Shared Key
      3. 22.3 Secret Key Agreement: Source Model
      4. 22.4 Secret Key Agreement: Channel Model
      5. Summary
      6. Bibliographic Notes
      7. Problems
      8. Appendix 22A Proof of Lemma 22.1
      9. Appendix 22B Proof of Lemma 22.2
      10. Appendix 22C Proof of Lemma 22.3
    3. 23 Wireless Fading Channels
      1. 23.1 Gaussian Fading Channel
      2. 23.2 Coding under Fast Fading
      3. 23.3 Coding under Slow Fading
      4. 23.4 Gaussian Vector Fading Channel
      5. 23.5 Gaussian Fading MAC
      6. 23.6 Gaussian Fading BC
      7. 23.7 Gaussian Fading IC
      8. Summary
      9. Bibliographic Notes
      10. Problems
    4. 24 Networking and Information Theory
      1. 24.1 Random Data Arrivals
      2. 24.2 Random Access Channel
      3. 24.3 Asynchronous MAC
      4. Summary
      5. Bibliographic Notes
      6. Problems
      7. Appendix 24A Proof of Lemma 24.1
      8. Appendix 24B Proof of Lemma 24.2
  14. Appendices
    1. A Convex Sets and Functions
    2. B Probability and Estimation
    3. C Cardinality Bounding Techniques
    4. D Fourier–Motzkin Elimination
    5. E Convex Optimization
    6. Bibliography
  15. Common Symbols
  16. Author Index
  17. Subject Index