CHAPTER 11

Markov Processes

Often do the Spirits
Of great events stride on before the events,
And in today already walks tomorrow . . .

—Samuel Taylor Coleridge

I. Markov Chains

A. Definitions

Markov chains have been and continue to be one of the most important and popular tools of mathematical model builders. This chapter presents some of the fundamental ideas of Markov chains and indicates some of their uses. More extended applications are presented in Chapters 12, 13, and 17. The necessary mathematical prerequisites for reading this chapter are the concepts of probability presented in Sections II (A–D) and IV of Chapter 10 and the elementary properties of matrix algebra discussed in Appendix II.

The fundamental principle underlying Markov processes is the independence of the future from the past if the present is known. Imagine an experiment that is repeated once each day for many days. If the probabilities of the outcomes of tomorrow's experiment depend only on the outcome of today's experiment and do not depend on the results of any previous experiments, then you are dealing with a Markov process.

In slightly different language, a finite Markov chain is a stochastic process with a finite number of states in which the probability of being in a particular state at the (n + 1)st step depends only on the state occupied at the nth step; this dependence is the same at all steps. More formally, there is the following definition:
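The defining property above can be made concrete with a small numerical sketch. The following example is not from the text: it uses a hypothetical two-state "weather" chain with an assumed transition matrix, and advances a probability distribution over the states step by step. The only thing the computation needs at each step is the current distribution and the (fixed) transition matrix; nothing about earlier steps is consulted, which is exactly the Markov property.

```python
# A two-state illustration (hypothetical example, not from the text).
# State 0 = "sunny", state 1 = "rainy"; P[i][j] is the probability of
# moving from state i to state j on the next step. Each row sums to 1.
P = [
    [0.9, 0.1],  # from sunny: stay sunny 0.9, turn rainy 0.1
    [0.5, 0.5],  # from rainy: turn sunny 0.5, stay rainy 0.5
]

def step(dist, P):
    """One step of the chain: multiply the row distribution dist by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Start certain that today is sunny, then look many steps ahead.
dist = [1.0, 0.0]
for _ in range(20):
    dist = step(dist, P)

print([round(p, 4) for p in dist])  # distribution after 20 steps
```

For this particular matrix the distribution settles toward a fixed ("stationary") vector, here (5/6, 1/6), regardless of the starting state; chains with this long-run behavior are taken up later in the chapter.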

DEFINITION An experiment with a finite number of possible ...
