Chapter 3. Markov Chains

To describe the evolution of a system, we must prescribe how the future depends on the present or the past. Two major examples of such descriptions are differential equations and recurrent sequences. When randomness is added, these lead to stochastic differential equations (which are beyond the scope of this book) and stochastic recurrent sequences (SRS), which we studied in Chapter 2. Among SRS, Markov chains are the most salient category. Behind a seemingly simple description lies a mathematical tool that is quite effective in applications and rich in properties.

Recall that it is recommended to read section A.1.1 before this chapter.

3.1. Definition and examples

Consider a sequence of random variables \(X = (X_n,\ n \ge 0)\) with values in a finite or countable set \(E\), and the filtration \(\mathcal{F}_n = \sigma\{X_j,\ 0 \le j \le n\}\) generated by this sequence.

Trajectories of \(X\) are elements of \(E^{\mathbb{N}}\), that is to say, sequences of elements of \(E\). The shift (see section A) is then defined by

\[
\theta \colon E^{\mathbb{N}} \longrightarrow E^{\mathbb{N}}, \qquad \theta\bigl((x_n,\ n \ge 0)\bigr) = (x_{n+1},\ n \ge 0).
\]

This shift is the non-bijective restriction to \(E^{\mathbb{N}}\) of the bijective flow defined on \(E^{\mathbb{Z}}\) in section 2.1. As with the flow, we need to define the \(n\)th iterate of \(\theta\), denoted by \(\theta_n\) and defined by

\[
\theta_n = \underbrace{\theta \circ \cdots \circ \theta}_{n\ \text{times}}, \qquad \theta_n\bigl((x_k,\ k \ge 0)\bigr) = (x_{k+n},\ k \ge 0).
\]

From now on, we identify \(\theta\) and \(\theta_1\).
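As a purely illustrative sketch (not part of the original text), the following Python snippet mimics the shift acting on a finite prefix of a trajectory; the names `shift` and `shift_n` are ours, and a finite list only approximates an element of \(E^{\mathbb{N}}\).

```python
# Illustrative sketch: the shift theta drops the first coordinate of a
# trajectory, and theta_n iterates it n times. A finite list of length L
# only approximates an element of E^N, so shifting it yields a shorter prefix.

def shift(x):
    """theta: (x_0, x_1, x_2, ...) -> (x_1, x_2, x_3, ...)."""
    return x[1:]

def shift_n(x, n):
    """theta_n = theta composed with itself n times: drops the first n terms."""
    for _ in range(n):
        x = shift(x)
    return x

prefix = [0, 1, 1, 2, 3, 5, 8]                               # finite prefix of a trajectory
assert shift_n(prefix, 3) == prefix[3:]                      # theta_3 drops the first three terms
assert shift_n(shift_n(prefix, 2), 1) == shift_n(prefix, 3)  # theta_1 o theta_2 = theta_3
```

The assertions check the composition rule \(\theta_n \circ \theta_m = \theta_{n+m}\) on this prefix, which is the property the identification \(\theta = \theta_1\) relies on.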

DEFINITION 3.1.– The sequence \(X\) is a Markov chain when, for any \(n \ge m\), the \(\sigma\)-field ...
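To make the notion concrete, here is a minimal simulation sketch in Python, assuming a finite state space \(E = \{0, 1, 2\}\) and a transition matrix \(P\) chosen purely for illustration; neither appears in the original text. The next state is drawn using only the current state, which is the informal content of the Markov property.

```python
import random

# Illustrative assumptions: a Markov chain on E = {0, 1, 2} with an arbitrary
# transition matrix P, where P[i][j] is the probability of moving from state i
# to state j (each row sums to 1).
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def simulate(x0, n_steps, seed=0):
    """Simulate (X_0, ..., X_{n_steps}) started from x0: the distribution of
    the next state depends only on the current one, through the rows of P."""
    rng = random.Random(seed)
    traj = [x0]
    for _ in range(n_steps):
        current = traj[-1]
        traj.append(rng.choices(range(len(P)), weights=P[current])[0])
    return traj

print(simulate(x0=0, n_steps=10))  # one trajectory prefix, e.g. [0, 2, 2, 0, ...]
```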
