6.6 INDEPENDENT AND IDENTICALLY DISTRIBUTED

Since a random process is viewed as a collection of random variables indexed by time, we can extend the idea of independent random variables to a process. Consider the following example of an independent random sequence.
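Stated formally (a standard definition; the subscripted pdf notation below is assumed for illustration and may differ slightly from the chapter's), a random sequence X[k] is independent and identically distributed (i.i.d.) when every finite collection of its samples has a joint pdf that factors into identical marginals:

\[
f_{X[k_1],\ldots,X[k_n]}(x_1,\ldots,x_n) = \prod_{i=1}^{n} f_X(x_i)
\]

for every n and every choice of time indices k_1, ..., k_n.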

Example 6.6. At each instant of time k, an outcome x[k] of the random sequence X[k] is drawn from a Bernoulli random variable with fixed parameter p. This is equivalent to a series of coin tosses with P(H) = p. At each time instant, the coin is tossed, the outcome is observed, and the result is “appended” to prior outcomes to construct a realization (a discrete-time function). The pdf of the random variable remains unchanged over k, which means the random sequence is first-order stationary. Because the tosses are independent and identically distributed, every joint distribution factors into a product of identical marginals and is therefore unaffected by a time shift, so the random sequence is also strictly stationary. Assume the coin has been tossed N times, so that X[k] has 2^N possible realizations. An example realization is shown in Figure 6.7. The set of 2^N realizations is referred to as the ensemble of the random sequence: each realization is an outcome of the underlying probability space. For this discrete and finite example, we can easily express the sample space ...
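To make the ensemble concrete, the following minimal sketch (not part of the text) draws one realization of the Bernoulli random sequence and enumerates the 2^N members of the ensemble; the values N = 4 and p = 0.5 and the use of NumPy are assumptions for illustration, not values taken from Figure 6.7.

import itertools
import numpy as np

rng = np.random.default_rng(0)

N = 4    # number of tosses (assumed for illustration)
p = 0.5  # P(H) for each independent Bernoulli toss (assumed)

# One realization x[k]: an independent draw at each time instant k
x = rng.binomial(n=1, p=p, size=N)
print("one realization x[k]:", x)

# The ensemble: all 2**N possible length-N realizations
ensemble = list(itertools.product([0, 1], repeat=N))
print("ensemble size:", len(ensemble))  # 2**N = 16 when N = 4

# Independence: the probability of any one realization is a product of marginals
prob = np.prod([p if s == 1 else 1 - p for s in x])
print("P(x) =", prob)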
