# 9 Sums of Random Variables

Random variables of the form

*W*_{n} = *X*_{1} + *X*_{2} + ⋯ + *X*_{n}

appear repeatedly in probability theory and applications. We could in principle derive the probability model of *W*_{n} from the PMF or PDF of *X*_{1}, …, *X*_{n}. However, in many practical applications, the nature of the analysis or the properties of the random variables allow us to apply techniques that are simpler than analyzing a general *n*-dimensional probability model. In Section 9.1 we consider applications in which our interest is confined to expected values related to *W*_{n}, rather than a complete model of *W*_{n}. Subsequent sections emphasize techniques that apply when *X*_{1}, …, *X*_{n} are mutually independent. A useful way to analyze the sum of independent random variables is to transform the PDF or PMF of each random variable to a *moment generating function*.
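As a preview of the moment generating function technique, the following sketch checks numerically that for *independent* random variables the MGF of a sum equals the product of the individual MGFs. The particular distributions (exponential and uniform) and the evaluation point *s* = 0.3 are chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 400_000
s = 0.3  # point at which to evaluate the MGFs (illustrative choice)

# Independent X ~ Exponential(1) and Y ~ Uniform(0, 1).
x = rng.exponential(scale=1.0, size=trials)
y = rng.random(trials)

# Estimate each MGF as a sample mean of e^{sX}.
mgf_x = np.mean(np.exp(s * x))
mgf_y = np.mean(np.exp(s * y))
mgf_sum = np.mean(np.exp(s * (x + y)))

# For independent X and Y, E[e^{s(X+Y)}] = E[e^{sX}] E[e^{sY}],
# so these two numbers should agree up to sampling noise.
print(mgf_sum, mgf_x * mgf_y)
```

For the exponential term the exact value is *E*[e^{sX}] = 1/(1 − *s*), which the estimate `mgf_x` should approach as `trials` grows.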

The central limit theorem reveals a fascinating property of the sum of independent random variables. It states that the CDF of the sum converges to a Gaussian CDF as the number of terms grows without limit. This theorem allows us to use the properties of Gaussian random variables to obtain accurate estimates of probabilities associated with sums of other random variables. In many cases exact calculation of these probabilities is extremely difficult.
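The convergence described by the central limit theorem can be observed directly by simulation. The sketch below (an illustrative setup, not a method from the text) sums *n* = 30 independent Uniform(0, 1) random variables, standardizes the sum to zero mean and unit variance, and compares its empirical CDF with the Gaussian CDF Φ(*x*):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n = 30            # number of terms in each sum (illustrative choice)
trials = 200_000  # number of independent sums to simulate

# Each X_i ~ Uniform(0, 1): E[X_i] = 1/2, Var[X_i] = 1/12.
sums = rng.random((trials, n)).sum(axis=1)

# Standardize W_n = X_1 + ... + X_n to zero mean and unit variance.
z = (sums - n * 0.5) / np.sqrt(n / 12)

def gaussian_cdf(x):
    """Standard Gaussian CDF via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

# The empirical CDF of the standardized sum should track Phi(x) closely.
for x in (-1.0, 0.0, 1.0):
    empirical = np.mean(z <= x)
    print(f"P(Z <= {x:+.1f}): empirical {empirical:.4f}, Gaussian {gaussian_cdf(x):.4f}")
```

Even for a modest *n*, the two CDFs agree to a few decimal places, which is why Gaussian approximations to sums are so useful in practice.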

## 9.1 Expected Values of Sums

The expected value of a sum of *any* random variables is the sum of the expected values. The variance of the sum, by contrast, depends not only on the individual variances but also on the covariance of each pair of random variables.
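A quick simulation illustrates the first claim: linearity of expectation holds even when the terms are dependent, while the variance of the sum is not simply the sum of the variances. The distributions below are illustrative assumptions, not examples from the text:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 500_000

# X and Y are deliberately dependent: Y = X + independent noise.
x = rng.normal(loc=2.0, scale=1.0, size=trials)
y = x + rng.normal(loc=3.0, scale=0.5, size=trials)

w = x + y

# E[W] = E[X] + E[Y] holds regardless of the dependence: 2 + 5 = 7.
print(np.mean(w))

# Var[W] = Var[X] + Var[Y] + 2 Cov[X, Y], not just Var[X] + Var[Y].
# Here Var[X] = 1, Var[Y] = 1.25, Cov[X, Y] = 1, so Var[W] = 4.25.
print(np.var(w))
```

Dropping the covariance term would predict Var[*W*] = 2.25, so ignoring dependence can badly misstate the spread of a sum.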