Introduction to Probability

Book description

Developed from celebrated Harvard statistics lectures, Introduction to Probability provides essential language and tools for understanding statistics, randomness, and uncertainty. The book explores a wide variety of applications and examples, ranging from coincidences and paradoxes to Google PageRank and Markov chain Monte Carlo (MCMC). Additional application areas explored include genetics, medicine, computer science, and information theory. The print book version includes a code that provides free access to an eBook version.

The authors present the material in an accessible style and motivate concepts using real-world examples. Throughout, they use stories to uncover connections between the fundamental distributions in statistics, and use conditioning to reduce complicated problems to manageable pieces.

The book includes many intuitive explanations, diagrams, and practice problems. Each chapter ends with a section showing how to perform relevant simulations and calculations in R, a free statistical software environment.
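The end-of-chapter R sections show how to check answers by simulation. As a flavor of that workflow (this snippet is illustrative, not from the book, and uses Python rather than R), here is a Monte Carlo estimate of the classic birthday-problem probability alongside the exact product formula:

```python
import random

def birthday_match_prob(n_people, trials=10_000, seed=0):
    """Estimate P(at least two of n_people share a birthday) by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        days = [rng.randrange(365) for _ in range(n_people)]
        # A match occurred if fewer distinct birthdays than people.
        if len(set(days)) < n_people:
            hits += 1
    return hits / trials

def birthday_match_exact(n_people):
    """Exact probability: 1 - (365/365)(364/365)...((365-n+1)/365)."""
    p_no_match = 1.0
    for k in range(n_people):
        p_no_match *= (365 - k) / 365
    return 1 - p_no_match
```

For 23 people the exact probability is about 0.507, and the simulated estimate approaches it as `trials` grows.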

Table of contents

  1. Preliminaries
  2. Preface
  3. Chapter 1 Probability and counting
    1. 1.1 Why study probability?
    2. 1.2 Sample spaces and Pebble World
    3. 1.3 Naive definition of probability
    4. 1.4 How to count
      1. 1.4.1 Multiplication rule
      2. 1.4.2 Adjusting for overcounting
    5. 1.5 Story proofs
    6. 1.6 Non-naive definition of probability
    7. 1.7 Recap
    8. 1.8 R
    9. 1.9 Exercises
  4. Chapter 2 Conditional probability
    1. 2.1 The importance of thinking conditionally
    2. 2.2 Definition and intuition
    3. 2.3 Bayes’ rule and the law of total probability
    4. 2.4 Conditional probabilities are probabilities
    5. 2.5 Independence of events
    6. 2.6 Coherency of Bayes’ rule
    7. 2.7 Conditioning as a problem-solving tool
      1. 2.7.1 Strategy: condition on what you wish you knew
      2. 2.7.2 Strategy: condition on the first step
    8. 2.8 Pitfalls and paradoxes
    9. 2.9 Recap
    10. 2.10 R
    11. 2.11 Exercises
  5. Chapter 3 Random variables and their distributions
    1. 3.1 Random variables
    2. 3.2 Distributions and probability mass functions
    3. 3.3 Bernoulli and Binomial
    4. 3.4 Hypergeometric
    5. 3.5 Discrete Uniform
    6. 3.6 Cumulative distribution functions
    7. 3.7 Functions of random variables
    8. 3.8 Independence of r.v.s
    9. 3.9 Connections between Binomial and Hypergeometric
    10. 3.10 Recap
    11. 3.11 R
    12. 3.12 Exercises
  6. Chapter 4 Expectation
    1. 4.1 Definition of expectation
    2. 4.2 Linearity of expectation
    3. 4.3 Geometric and Negative Binomial
    4. 4.4 Indicator r.v.s and the fundamental bridge
    5. 4.5 Law of the unconscious statistician (LOTUS)
    6. 4.6 Variance
    7. 4.7 Poisson
    8. 4.8 Connections between Poisson and Binomial
    9. 4.9 *Using probability and expectation to prove existence
      1. 4.9.1 *Communicating over a noisy channel
    10. 4.10 Recap
    11. 4.11 R
    12. 4.12 Exercises
  7. Chapter 5 Continuous random variables
    1. 5.1 Probability density functions
    2. 5.2 Uniform
    3. 5.3 Universality of the Uniform
    4. 5.4 Normal
    5. 5.5 Exponential
    6. 5.6 Poisson processes
    7. 5.7 Symmetry of i.i.d. continuous r.v.s
    8. 5.8 Recap
    9. 5.9 R
    10. 5.10 Exercises
  8. Chapter 6 Moments
    1. 6.1 Summaries of a distribution
    2. 6.2 Interpreting moments
    3. 6.3 Sample moments
    4. 6.4 Moment generating functions
    5. 6.5 Generating moments with MGFs
    6. 6.6 Sums of independent r.v.s via MGFs
    7. 6.7 *Probability generating functions
    8. 6.8 Recap
    9. 6.9 R
    10. 6.10 Exercises
  9. Chapter 7 Joint distributions
    1. 7.1 Joint, marginal, and conditional
      1. 7.1.1 Discrete
      2. 7.1.2 Continuous
      3. 7.1.3 Hybrid
    2. 7.2 2D LOTUS
    3. 7.3 Covariance and correlation
    4. 7.4 Multinomial
    5. 7.5 Multivariate Normal
    6. 7.6 Recap
    7. 7.7 R
    8. 7.8 Exercises
  10. Chapter 8 Transformations
    1. 8.1 Change of variables
    2. 8.2 Convolutions
    3. 8.3 Beta
    4. 8.4 Gamma
    5. 8.5 Beta-Gamma connections
    6. 8.6 Order statistics
    7. 8.7 Recap
    8. 8.8 R
    9. 8.9 Exercises
  11. Chapter 9 Conditional expectation
    1. 9.1 Conditional expectation given an event
    2. 9.2 Conditional expectation given an r.v.
    3. 9.3 Properties of conditional expectation
    4. 9.4 *Geometric interpretation of conditional expectation
    5. 9.5 Conditional variance
    6. 9.6 Adam and Eve examples
    7. 9.7 Recap
    8. 9.8 R
    9. 9.9 Exercises
  12. Chapter 10 Inequalities and limit theorems
    1. 10.1 Inequalities
      1. 10.1.1 Cauchy-Schwarz: a marginal bound on a joint expectation
      2. 10.1.2 Jensen: an inequality for convexity
      3. 10.1.3 Markov, Chebyshev, Chernoff: bounds on tail probabilities
    2. 10.2 Law of large numbers
    3. 10.3 Central limit theorem
    4. 10.4 Chi-Square and Student-t
    5. 10.5 Recap
    6. 10.6 R
    7. 10.7 Exercises
  13. Chapter 11 Markov chains
    1. 11.1 Markov property and transition matrix
    2. 11.2 Classification of states
    3. 11.3 Stationary distribution
      1. 11.3.1 Existence and uniqueness
      2. 11.3.2 Convergence
      3. 11.3.3 Google PageRank
    4. 11.4 Reversibility
    5. 11.5 Recap
    6. 11.6 R
    7. 11.7 Exercises
  14. Chapter 12 Markov chain Monte Carlo
    1. 12.1 Metropolis-Hastings
    2. 12.2 Gibbs sampling
    3. 12.3 Recap
    4. 12.4 R
    5. 12.5 Exercises
  15. Chapter 13 Poisson processes
    1. 13.1 Poisson processes in one dimension
    2. 13.2 Conditioning, superposition, thinning
      1. 13.2.1 Conditioning
      2. 13.2.2 Superposition
      3. 13.2.3 Thinning
    3. 13.3 Poisson processes in multiple dimensions
    4. 13.4 Recap
    5. 13.5 R
    6. 13.6 Exercises
  16. A Math
    1. A.1 Sets
      1. A.1.1 The empty set
      2. A.1.2 Subsets
      3. A.1.3 Unions, intersections, and complements
      4. A.1.4 Partitions
      5. A.1.5 Cardinality
      6. A.1.6 Cartesian product
    2. A.2 Functions
      1. A.2.1 One-to-one functions
      2. A.2.2 Increasing and decreasing functions
      3. A.2.3 Even and odd functions
      4. A.2.4 Convex and concave functions
      5. A.2.5 Exponential and logarithmic functions
      6. A.2.6 Floor function and ceiling function
      7. A.2.7 Factorial function and gamma function
    3. A.3 Matrices
      1. A.3.1 Matrix addition and multiplication
      2. A.3.2 Eigenvalues and eigenvectors
    4. A.4 Difference equations
    5. A.5 Differential equations
    6. A.6 Partial derivatives
    7. A.7 Multiple integrals
      1. A.7.1 Change of order of integration
      2. A.7.2 Change of variables
    8. A.8 Sums
      1. A.8.1 Geometric series
      2. A.8.2 Taylor series for e^x
      3. A.8.3 Harmonic series and other sums with a fixed exponent
      4. A.8.4 Binomial theorem
    9. A.9 Pattern recognition
    10. A.10 Common sense and checking answers
  17. B R
    1. B.1 Vectors
    2. B.2 Matrices
    3. B.3 Math
    4. B.4 Sampling and simulation
    5. B.5 Plotting
    6. B.6 Programming
    7. B.7 Summary statistics
    8. B.8 Distributions
  18. C Table of distributions
  19. Bibliography

Product information

  • Title: Introduction to Probability
  • Author(s): Joseph K. Blitzstein, Jessica Hwang
  • Release date: September 2015
  • Publisher(s): CRC Press
  • ISBN: 9781498759762