Lattice Coding for Signals and Networks

Book Description

Unifying information theory and digital communication through the language of lattice codes, this book provides a detailed overview for students, researchers and industry practitioners. It covers classical work by leading researchers in the field of lattice codes and complementary work on dithered quantization and infinite constellations, then introduces more recent results on 'algebraic binning' for side-information problems and on linear/lattice codes for networks. It shows how high-dimensional lattice codes can close the gap to the optimal information-theoretic solution, including the characterisation of error exponents. Because the solutions presented are built on lattice codes, they are close to practical implementation, covering advanced setups and techniques such as shaping, entropy coding, side information and multi-terminal systems. Moreover, some of the network setups shown demonstrate how lattice codes are potentially more efficient than traditional random-coding solutions, for instance when generalising the framework to Gaussian networks.
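To make the book's central object concrete, here is a minimal sketch (not from the book) of the dithered quantization mentioned above, using the simplest lattice, the scaled integer lattice ΔZ^n. The function names and parameters are illustrative. It demonstrates the property treated in Chapter 4 ("Crypto lemma"): with subtractive dither drawn uniformly over the Voronoi cell, the quantization error is uniform over that cell and statistically independent of the input.

```python
import numpy as np

# A minimal sketch of subtractive dithered quantization on the scaled
# integer lattice Delta * Z^n, whose Voronoi cell is the cube
# [-Delta/2, Delta/2)^n.  Names are illustrative, not from the book.

def lattice_quantize(x, delta):
    """Nearest-neighbor quantizer Q_Lambda for Lambda = delta * Z^n."""
    return delta * np.round(x / delta)

def dithered_quantize(x, delta, rng):
    """Subtractive dither: y = Q_Lambda(x + u) - u, u ~ Unif(Voronoi cell)."""
    u = rng.uniform(-delta / 2, delta / 2, size=x.shape)
    return lattice_quantize(x + u, delta) - u

# Empirical check: the error y - x equals Q_Lambda(x+u) - (x+u), so it
# should be uniform over the cell and independent of the input x.
rng = np.random.default_rng(0)
delta = 0.5
x = rng.normal(size=100_000)          # arbitrary (non-uniform) source
err = dithered_quantize(x, delta, rng) - x

print("error mean         :", err.mean())                # ~ 0
print("error second moment:", (err ** 2).mean())         # ~ delta**2 / 12
print("corr(error, input) :", np.corrcoef(err, x)[0, 1]) # ~ 0
```

The same identity underlies the shaping and side-information schemes of Chapters 9 and 10, where ΔZ^n is replaced by a high-dimensional lattice with good packing and covering properties.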

Table of Contents

  1. Cover
  2. Half title
  3. Title
  4. Copyright
  5. Dedication
  6. Table of Contents
  7. Preface
  8. Acknowledgements
  9. List of notation
  10. 1 Introduction
    1. 1.1 Source and channel coding
    2. 1.2 The information theoretic view
    3. 1.3 Structured codes
    4. 1.4 Preview
  11. 2 Lattices
    1. 2.1 Representation
    2. 2.2 Partition
    3. 2.3 Equivalent cells and coset leaders
    4. 2.4 Transformation and tiling
    5. 2.5 Algebraic constructions
    6. Summary
    7. Problems
    8. Interesting facts about lattices
  12. 3 Figures of merit
    1. 3.1 Sphere packing and covering
    2. 3.2 Quantization: normalized second moment
    3. 3.3 Modulation: volume to noise ratio
    4. Summary
    5. Problems
    6. Historical notes
  13. 4 Dithering and estimation
    1. 4.1 Crypto lemma
    2. 4.2 Generalized dither
    3. 4.3 White dither spectrum
    4. 4.4 Wiener estimation
    5. 4.5 Filtered dithered quantization
    6. Summary
    7. Problems
    8. Historical notes
  14. 5 Entropy-coded quantization
    1. 5.1 The Shannon entropy
    2. 5.2 Quantizer entropy
    3. 5.3 Joint and sequential entropy coding*
    4. 5.4 Entropy-distortion trade-off
    5. 5.5 Redundancy over Shannon
    6. 5.6 Optimum test-channel simulation
    7. 5.7 Comparison with Lloyd’s conditions
    8. 5.8 Is random dither really necessary?
    9. 5.9 Universal quantization*
    10. Summary
    11. Problems
    12. Historical notes
  15. 6 Infinite constellation for modulation
    1. 6.1 Rate per unit volume
    2. 6.2 ML decoding and error probability
    3. 6.3 Gap to capacity
    4. 6.4 Non-AWGN and mismatch
    5. 6.5 Non-equiprobable signaling
    6. 6.6 Maximum a posteriori decoding*
    7. Summary
    8. Problems
    9. Historical notes
  16. 7 Asymptotic goodness
    1. 7.1 Sphere bounds
    2. 7.2 Sphere-Gaussian equivalence
    3. 7.3 Good covering and quantization
    4. 7.4 Does packing imply modulation?
    5. 7.5 The Minkowski–Hlawka theorem
    6. 7.6 Good packing
    7. 7.7 Good modulation
    8. 7.8 Non-AWGN
    9. 7.9 Simultaneous goodness
    10. Summary
    11. Problems
    12. Historical notes
  17. 8 Nested lattices
    1. 8.1 Definition and properties
    2. 8.2 Cosets and Voronoi codebooks
    3. 8.3 Nested linear, lattice and trellis codes
    4. 8.4 Dithered codebook
    5. 8.5 Good nested lattices
    6. Summary
    7. Problems
    8. Historical notes
  18. 9 Lattice shaping
    1. 9.1 Voronoi modulation
    2. 9.2 Syndrome dilution scheme
    3. 9.3 The high SNR case
    4. 9.4 Shannon meets Wiener (at medium SNR)
    5. 9.5 The mod Λ channel
    6. 9.6 Achieving C_AWGN for all SNR
    7. 9.7 Geometric interpretation
    8. 9.8 Noise-matched decoding
    9. 9.9 Is the dither really necessary?
    10. 9.10 Voronoi quantization
    11. Summary
    12. Problems
    13. Historical notes
  19. 10 Side-information problems
    1. 10.1 Syndrome coding
    2. 10.2 Gaussian multi-terminal problems
    3. 10.3 Rate distortion with side information
    4. 10.4 Lattice Wyner–Ziv coding
    5. 10.5 Channels with side information
    6. 10.6 Lattice dirty-paper coding
    7. Summary
    8. Problems
    9. Historical notes
  20. 11 Modulo-lattice modulation
    1. 11.1 Separation versus JSCC
    2. 11.2 Figures of merit for JSCC
    3. 11.3 Joint Wyner–Ziv/dirty-paper coding
    4. 11.4 Bandwidth conversion
    5. Summary
    6. Problems
    7. Historical notes
  21. 12 Gaussian networks
    1. 12.1 The two-help-one problem
    2. 12.2 Dirty multiple-access channel
    3. 12.3 Lattice network coding
    4. 12.4 Interference alignment
    5. 12.5 Summary and outlook
    6. Summary
    7. Problems
    8. Historical notes
  22. 13 Error exponents
    1. 13.1 Sphere packing exponent
    2. 13.2 Measures of lattice to noise density
    3. 13.3 Threshold-decoding exponent
    4. 13.4 Nearest-neighbor decoding exponent
    5. 13.5 Distance spectrum and pairwise errors
    6. 13.6 Minimum-distance exponent
    7. 13.7 The expurgated MHS ensemble
    8. 13.8 Error exponents of Voronoi codes
    9. Summary
    10. Problems
    11. Historical notes
  23. Appendix
    1. A.1 Entropy and mutual information
    2. A.2 Success-threshold exponent
    3. A.3 Coset density and entropy
    4. A.4 Convolution of log-concave functions
    5. A.5 Mixture versus Gaussian noise
    6. A.6 Lattice-distributed noise
  24. References
  25. Index