1. Introduction to Information Theory

1.1. Introduction

Information theory was developed by Claude Shannon in the 1940s [SHA 48, SHA 59b]. It is a mathematical theory that introduced source coding and channel coding, two fundamental concepts of communication systems. Depending on the properties of the source, this theory allows us to determine the theoretical limits of lossless source coding (exact reconstruction of the source message) and of lossy source coding (reconstruction of the message under a fidelity criterion). It also gives us the rates that channel coding makes achievable over a given transmission channel.
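To give a concrete idea of the lossless limit (the following numerical example is ours, not drawn from Shannon's papers), consider a binary source emitting the symbol 1 with probability 0.1 and the symbol 0 with probability 0.9. Its entropy, defined formally in section 1.3,

H(X) = -0.1 \log_2 0.1 - 0.9 \log_2 0.9 \approx 0.469 \ \text{bits/symbol},

is the smallest average rate at which any lossless source code can represent this source.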

In this chapter, after reviewing the basics of discrete and continuous probabilities in section 1.2, we will start by introducing the fundamental notions of information theory, such as entropy and average mutual information, in section 1.3; a small numerical preview is sketched below. We will then focus on the fundamental theorems of information theory for communication systems. We will state the lossy and lossless source coding theorems in sections 1.4 and 1.5, respectively. Then we will determine the theoretical limits of error-free communication over a noisy channel: in section 1.6, we will introduce different channel models, and finally, in section 1.7, we will compute the capacity of these channels and state the channel coding theorem.
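As a preview of these quantities, here is a minimal sketch (our own, not from the text) that computes the entropy and the average mutual information of a binary symmetric channel with crossover probability 0.1 driven by a uniform binary input; the variable names and the choice of channel are assumptions made for illustration, and the formal definitions follow in section 1.3.

```python
import math

def entropy(p):
    """Entropy H(X) in bits of a discrete distribution p = (p1, ..., pn)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Illustrative binary symmetric channel (BSC) with crossover probability eps,
# driven by a uniform binary input X.
eps = 0.1
p_y = [0.5, 0.5]                       # by symmetry, the output Y is also uniform
h_y_given_x = entropy([eps, 1 - eps])  # H(Y|X) = H(eps) for a BSC

# Average mutual information I(X;Y) = H(Y) - H(Y|X)
i_xy = entropy(p_y) - h_y_given_x
print(f"H(Y) = {entropy(p_y):.3f} bits, I(X;Y) = {i_xy:.3f} bits")
```

Running this gives I(X;Y) ≈ 0.531 bits; for the binary symmetric channel with a uniform input, this value, 1 - H(0.1), is also the channel capacity, a result of the kind derived in section 1.7.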

1.2. Review of probabilities

Probability theory is the branch of mathematics that describes and models random processes. In this section, we present a summary of this theory; an elementary example follows. We recommend ...
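As an elementary illustration of the discrete notions reviewed here (a sketch of our own; the fair-die example is an assumption, not taken from the text):

```python
from fractions import Fraction

# Probability mass function (PMF) of a fair six-sided die: P(X = k) = 1/6
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

assert sum(pmf.values()) == 1  # a valid PMF sums to 1

# Expectation E[X] = sum of k * P(X = k), and variance Var(X) = E[X^2] - E[X]^2
mean = sum(k * p for k, p in pmf.items())               # 7/2
var = sum(k**2 * p for k, p in pmf.items()) - mean**2   # 35/12
print(mean, var)
```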
