Probability and Computing

Book Description

Randomization and probabilistic techniques play an important role in modern computer science, with applications ranging from combinatorial optimization and machine learning to communication networks and secure protocols. This textbook is designed to accompany a one- or two-semester course for advanced undergraduates or beginning graduate students in computer science and applied mathematics. It gives an excellent introduction to the probabilistic techniques and paradigms used in the development of probabilistic algorithms and analyses. It assumes only an elementary background in discrete mathematics and gives a rigorous yet accessible treatment of the material, with numerous examples and applications. The first half of the book covers core material, including random sampling, expectations, Markov's inequality, Chebyshev's inequality, Chernoff bounds, balls-and-bins models, the probabilistic method, and Markov chains. In the second half, the authors delve into more advanced topics such as continuous probability, applications of limited independence, entropy, Markov chain Monte Carlo methods, coupling, martingales, and balanced allocations. With its comprehensive selection of topics, along with many examples and exercises, this book is an indispensable teaching tool.
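As a taste of the material, the randomized verification of matrix multiplication covered in Section 1.3 can be sketched in a few lines of Python. This is an illustrative sketch of the standard technique (often attributed to Freivalds), not code from the book; the function name and matrix representation are assumptions made here for the example:

```python
import random

def verify_product(A, B, C, trials=20):
    """Randomized check that A * B == C for n x n matrices.

    Each trial multiplies both sides by a random 0/1 vector r and
    compares A(Br) with Cr, using only O(n^2) work per trial. If
    A * B != C, a single trial misses the discrepancy with
    probability at most 1/2, so `trials` independent trials miss
    it with probability at most 2**-trials.
    """
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        # Three matrix-vector products instead of one matrix product.
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False  # certainly A * B != C
    return True  # A * B == C with high probability
```

The point, developed rigorously in Chapter 1, is that the check runs in O(n^2) time per trial, while computing A * B directly and comparing would cost a full matrix multiplication.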

Table of Contents

  1. Cover
  2. Title
  3. Copyright
  4. Contents
  5. Preface
  6. 1 Events and Probability
    1. 1.1 Application: Verifying Polynomial Identities
    2. 1.2 Axioms of Probability
    3. 1.3 Application: Verifying Matrix Multiplication
    4. 1.4 Application: A Randomized Min-Cut Algorithm
    5. 1.5 Exercises
  7. 2 Discrete Random Variables and Expectation
    1. 2.1 Random Variables and Expectation
      1. 2.1.1 Linearity of Expectations
      2. 2.1.2 Jensen’s Inequality
    2. 2.2 The Bernoulli and Binomial Random Variables
    3. 2.3 Conditional Expectation
    4. 2.4 The Geometric Distribution
      1. 2.4.1 Example: Coupon Collector’s Problem
    5. 2.5 Application: The Expected Run-Time of Quicksort
    6. 2.6 Exercises
  8. 3 Moments and Deviations
    1. 3.1 Markov’s Inequality
    2. 3.2 Variance and Moments of a Random Variable
      1. 3.2.1 Example: Variance of a Binomial Random Variable
    3. 3.3 Chebyshev’s Inequality
      1. 3.3.1 Example: Coupon Collector’s Problem
    4. 3.4 Application: A Randomized Algorithm for Computing the Median
      1. 3.4.1 The Algorithm
      2. 3.4.2 Analysis of the Algorithm
    5. 3.5 Exercises
  9. 4 Chernoff Bounds
    1. 4.1 Moment Generating Functions
    2. 4.2 Deriving and Applying Chernoff Bounds
      1. 4.2.1 Chernoff Bounds for the Sum of Poisson Trials
      2. 4.2.2 Example: Coin Flips
      3. 4.2.3 Application: Estimating a Parameter
    3. 4.3 Better Bounds for Some Special Cases
    4. 4.4 Application: Set Balancing
    5. 4.5* Application: Packet Routing in Sparse Networks
      1. 4.5.1 Permutation Routing on the Hypercube
      2. 4.5.2 Permutation Routing on the Butterfly
    6. 4.6 Exercises
  10. 5 Balls, Bins, and Random Graphs
    1. 5.1 Example: The Birthday Paradox
    2. 5.2 Balls into Bins
      1. 5.2.1 The Balls-and-Bins Model
      2. 5.2.2 Application: Bucket Sort
    3. 5.3 The Poisson Distribution
      1. 5.3.1 Limit of the Binomial Distribution
    4. 5.4 The Poisson Approximation
      1. 5.4.1* Example: Coupon Collector’s Problem, Revisited
    5. 5.5 Application: Hashing
      1. 5.5.1 Chain Hashing
      2. 5.5.2 Hashing: Bit Strings
      3. 5.5.3 Bloom Filters
      4. 5.5.4 Breaking Symmetry
    6. 5.6 Random Graphs
      1. 5.6.1 Random Graph Models
      2. 5.6.2 Application: Hamiltonian Cycles in Random Graphs
    7. 5.7 Exercises
    8. 5.8 An Exploratory Assignment
  11. 6 The Probabilistic Method
    1. 6.1 The Basic Counting Argument
    2. 6.2 The Expectation Argument
      1. 6.2.1 Application: Finding a Large Cut
      2. 6.2.2 Application: Maximum Satisfiability
    3. 6.3 Derandomization Using Conditional Expectations
    4. 6.4 Sample and Modify
      1. 6.4.1 Application: Independent Sets
      2. 6.4.2 Application: Graphs with Large Girth
    5. 6.5 The Second Moment Method
      1. 6.5.1 Application: Threshold Behavior in Random Graphs
    6. 6.6 The Conditional Expectation Inequality
    7. 6.7 The Lovász Local Lemma
      1. 6.7.1 Application: Edge-Disjoint Paths
      2. 6.7.2 Application: Satisfiability
    8. 6.8* Explicit Constructions Using the Local Lemma
      1. 6.8.1 Application: A Satisfiability Algorithm
    9. 6.9 Lovász Local Lemma: The General Case
    10. 6.10 Exercises
  12. 7 Markov Chains and Random Walks
    1. 7.1 Markov Chains: Definitions and Representations
      1. 7.1.1 Application: A Randomized Algorithm for 2-Satisfiability
      2. 7.1.2 Application: A Randomized Algorithm for 3-Satisfiability
    2. 7.2 Classification of States
      1. 7.2.1 Example: The Gambler’s Ruin
    3. 7.3 Stationary Distributions
      1. 7.3.1 Example: A Simple Queue
    4. 7.4 Random Walks on Undirected Graphs
      1. 7.4.1 Application: An s-t Connectivity Algorithm
    5. 7.5 Parrondo’s Paradox
    6. 7.6 Exercises
  13. 8 Continuous Distributions and the Poisson Process
    1. 8.1 Continuous Random Variables
      1. 8.1.1 Probability Distributions in ℝ
      2. 8.1.2 Joint Distributions and Conditional Probability
    2. 8.2 The Uniform Distribution
      1. 8.2.1 Additional Properties of the Uniform Distribution
    3. 8.3 The Exponential Distribution
      1. 8.3.1 Additional Properties of the Exponential Distribution
      2. 8.3.2* Example: Balls and Bins with Feedback
    4. 8.4 The Poisson Process
      1. 8.4.1 Interarrival Distribution
      2. 8.4.2 Combining and Splitting Poisson Processes
      3. 8.4.3 Conditional Arrival Time Distribution
    5. 8.5 Continuous Time Markov Processes
    6. 8.6 Example: Markovian Queues
      1. 8.6.1 M/M/1 Queue in Equilibrium
      2. 8.6.2 M/M/1/K Queue in Equilibrium
      3. 8.6.3 The Number of Customers in an M/M/∞ Queue
    7. 8.7 Exercises
  14. 9 Entropy, Randomness, and Information
    1. 9.1 The Entropy Function
    2. 9.2 Entropy and Binomial Coefficients
    3. 9.3 Entropy: A Measure of Randomness
    4. 9.4 Compression
    5. 9.5* Coding: Shannon’s Theorem
    6. 9.6 Exercises
  15. 10 The Monte Carlo Method
    1. 10.1 The Monte Carlo Method
    2. 10.2 Application: The DNF Counting Problem
      1. 10.2.1 The Naïve Approach
      2. 10.2.2 A Fully Polynomial Randomized Scheme for DNF Counting
    3. 10.3 From Approximate Sampling to Approximate Counting
    4. 10.4 The Markov Chain Monte Carlo Method
      1. 10.4.1 The Metropolis Algorithm
    5. 10.5 Exercises
    6. 10.6 An Exploratory Assignment on Minimum Spanning Trees
  16. 11* Coupling of Markov Chains
    1. 11.1 Variation Distance and Mixing Time
    2. 11.2 Coupling
      1. 11.2.1 Example: Shuffling Cards
      2. 11.2.2 Example: Random Walks on the Hypercube
      3. 11.2.3 Example: Independent Sets of Fixed Size
    3. 11.3 Application: Variation Distance Is Nonincreasing
    4. 11.4 Geometric Convergence
    5. 11.5 Application: Approximately Sampling Proper Colorings
    6. 11.6 Path Coupling
    7. 11.7 Exercises
  17. 12 Martingales
    1. 12.1 Martingales
    2. 12.2 Stopping Times
      1. 12.2.1 Example: A Ballot Theorem
    3. 12.3 Wald’s Equation
    4. 12.4 Tail Inequalities for Martingales
    5. 12.5 Applications of the Azuma-Hoeffding Inequality
      1. 12.5.1 General Formalization
      2. 12.5.2 Application: Pattern Matching
      3. 12.5.3 Application: Balls and Bins
      4. 12.5.4 Application: Chromatic Number
    6. 12.6 Exercises
  18. 13 Pairwise Independence and Universal Hash Functions
    1. 13.1 Pairwise Independence
      1. 13.1.1 Example: A Construction of Pairwise Independent Bits
      2. 13.1.2 Application: Derandomizing an Algorithm for Large Cuts
      3. 13.1.3 Example: Constructing Pairwise Independent Values Modulo a Prime
    2. 13.2 Chebyshev’s Inequality for Pairwise Independent Variables
      1. 13.2.1 Application: Sampling Using Fewer Random Bits
    3. 13.3 Families of Universal Hash Functions
      1. 13.3.1 Example: A 2-Universal Family of Hash Functions
      2. 13.3.2 Example: A Strongly 2-Universal Family of Hash Functions
      3. 13.3.3 Application: Perfect Hashing
    4. 13.4 Application: Finding Heavy Hitters in Data Streams
    5. 13.5 Exercises
  19. 14* Balanced Allocations
    1. 14.1 The Power of Two Choices
      1. 14.1.1 The Upper Bound
    2. 14.2 Two Choices: The Lower Bound
    3. 14.3 Applications of the Power of Two Choices
      1. 14.3.1 Hashing
      2. 14.3.2 Dynamic Resource Allocation
    4. 14.4 Exercises
  20. Further Reading
  21. Index