
Theorem 1.4.17 For an MBC,

$$I\big(X^{(N)}:Y^{(N)}\big)\;\le\;\sum_{i=1}^{N} I(X_i:Y_i), \tag{1.4.24}$$

with equality if the input symbols $X_1,\ldots,X_N$ are independent.

Proof Since $P\big(Y^{(N)}=y^{(N)}\,\big|\,X^{(N)}=x^{(N)}\big)=\prod_{i=1}^{N}P(Y_i=y_i\,|\,X_i=x_i)$, the conditional entropy $h\big(Y^{(N)}\big|X^{(N)}\big)$ equals the sum $\sum_{i=1}^{N}h(Y_i\,|\,X_i)$. Then the mutual information

$$I\big(X^{(N)}:Y^{(N)}\big)=h\big(Y^{(N)}\big)-h\big(Y^{(N)}\big|X^{(N)}\big)\;\le\;\sum_{i=1}^{N}h(Y_i)-\sum_{i=1}^{N}h(Y_i\,|\,X_i)=\sum_{i=1}^{N}I(X_i:Y_i).$$

Here the inequality uses $h\big(Y^{(N)}\big)\le\sum_{i=1}^{N}h(Y_i)$, and the equality holds iff $Y_1,\ldots,Y_N$ are independent. But $Y_1,\ldots,Y_N$ are independent if $X_1,\ldots,X_N$ are. $\Box$
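
A quick numerical illustration of (1.4.24): the Python sketch below builds the joint law of $(X^{(2)},Y^{(2)})$ for a memoryless binary symmetric channel with block length $N=2$ and compares $I(X^{(2)}:Y^{(2)})$ with $I(X_1:Y_1)+I(X_2:Y_2)$. The crossover probability p = 0.1, the choice of a symmetric channel as the MBC, the two input laws and the helper names (W, mutual_information, check) are illustrative assumptions, not taken from the text.

from itertools import product
from math import log2

p = 0.1   # assumed crossover probability (illustrative, not from the text)
# Channel matrix W[(x, y)] = P(Y = y | X = x) for a binary symmetric channel.
W = {(x, y): (1 - p) if x == y else p for x in (0, 1) for y in (0, 1)}

def mutual_information(pxy):
    # I(X:Y) in bits for a joint distribution given as {(x, y): prob}.
    px, py = {}, {}
    for (x, y), q in pxy.items():
        px[x] = px.get(x, 0.0) + q
        py[y] = py.get(y, 0.0) + q
    return sum(q * log2(q / (px[x] * py[y]))
               for (x, y), q in pxy.items() if q > 0)

def check(p_inputs):
    # Joint law of ((X1, X2), (Y1, Y2)) under the memoryless channel W.
    p_block = {}
    for (x1, x2), q in p_inputs.items():
        for y1, y2 in product((0, 1), repeat=2):
            p_block[((x1, x2), (y1, y2))] = q * W[(x1, y1)] * W[(x2, y2)]
    # Marginal laws of (X1, Y1) and of (X2, Y2).
    p1, p2 = {}, {}
    for ((x1, x2), (y1, y2)), q in p_block.items():
        p1[(x1, y1)] = p1.get((x1, y1), 0.0) + q
        p2[(x2, y2)] = p2.get((x2, y2), 0.0) + q
    lhs = mutual_information(p_block)
    rhs = mutual_information(p1) + mutual_information(p2)
    print(f"I(X^(2):Y^(2)) = {lhs:.4f} <= {rhs:.4f} = I(X1:Y1) + I(X2:Y2)")

# Independent, uniform inputs: equality in (1.4.24).
check({(x1, x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)})
# Fully correlated inputs (X2 = X1): strict inequality.
check({(0, 0): 0.5, (1, 1): 0.5})

With independent uniform inputs the two sides agree; with $X_2=X_1$ the outputs $Y_1,Y_2$ are dependent and the inequality is strict, matching the equality condition in the proof.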

Remark 1.4.18  Compare with inequalities (1.4.24) and (1.2.27) ...
