Concentration of Measure for the Analysis of Randomized Algorithms by Alessandro Panconesi, Devdatt P. Dubhashi

14 Log-Sobolev Inequalities and Concentration

14.1 Introduction

In this chapter, we give an introduction to log-Sobolev inequalities and their use in deriving concentration of measure results. This is a third important methodology for proving concentration of measure (the other two being martingales and the transportation cost method), and it appears to be the most powerful of the three.

Given a probability space (Ω, P) and a non-negative function f : Ω → R, define the entropy of f by

\[
  \operatorname{Ent}_P(f) := \mathbf{E}_P[f \log f] - \mathbf{E}_P[f]\,\log \mathbf{E}_P[f].
\]

(14.1)

By Jensen's inequality applied to the convex function ψ(x) := x log x, we have E_P[ψ(f)] ≥ ψ(E_P[f]), that is, E_P[f log f] ≥ E_P[f] log E_P[f]; hence Ent_P(f) ≥ 0 for every non-negative f.
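As a quick worked illustration (added here for concreteness, not taken from the book's text; recall the convention 0 log 0 = 0), take f to be the normalized indicator of an event A with P(A) = p > 0, that is, f = 1_A / p. Then

\[
  \mathbf{E}_P[f] = \frac{P(A)}{p} = 1,
  \qquad
  \mathbf{E}_P[f \log f] = P(A) \cdot \frac{1}{p}\log\frac{1}{p} = \log\frac{1}{p},
\]

so that

\[
  \operatorname{Ent}_P(f) = \log\frac{1}{p} - 1 \cdot \log 1 = \log\frac{1}{p} \;\ge\; 0,
\]

with equality exactly when p = 1. The entropy of this f grows without bound as the event A becomes rarer.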

A logarithmic Sobolev inequality, or just log-Sobolev inequality, bounds Ent_P(f) from above by an energy-type quantity that measures the local variation of f.
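To fix ideas, here is the archetypal continuous example of such a bound, stated purely as an illustration of its general shape. For the standard Gaussian measure γ on R^n, Gross's log-Sobolev inequality states that for every smooth f : R^n → R,

\[
  \operatorname{Ent}_\gamma\bigl(f^2\bigr) \;\le\; 2\,\mathbf{E}_\gamma\bigl[\,|\nabla f|^2\,\bigr],
\]

so the entropy of f² is controlled by an "energy" term built from the gradient of f. Bounds of this kind are precisely what can be converted, via exponential moment arguments, into concentration inequalities.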
