The Markov chain

What do all the Markov-type models have in common? All of them are essentially based on the same stochastic process, better known as the Markov chain. In this section, you will find a brief review of the fundamental ideas and properties that underlie the Markov chain.

Even though this section relies on some statistical terms and matrix notation, it is designed to be accessible regardless.

Stochastic processes are frequently assumed to be independent and identically distributed (i.i.d.). Borrowing the dice analogy, this assumption means that no matter what the past rolls were, they won't affect the likelihood (probability) of the next roll. The Markov chain doesn't assume i.i.d.; instead, it defines a dependence structure in which the probability of the next outcome is conditioned only on the current state, not on the rest of the history.
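As an informal sketch of that dependence structure, the following base R snippet simulates a small two-state Markov chain. The state names and transition probabilities are illustrative assumptions chosen for this example, not values taken from the text.

```r
# A minimal sketch: simulating a two-state Markov chain in base R.
# States and probabilities below are assumed for illustration only.

set.seed(42)

states <- c("sunny", "rainy")

# Transition matrix: row i gives P(next state | current state = states[i]);
# each row sums to 1.
P <- matrix(c(0.8, 0.2,
              0.4, 0.6),
            nrow = 2, byrow = TRUE,
            dimnames = list(states, states))

simulate_chain <- function(P, n_steps, start) {
  chain <- character(n_steps)
  chain[1] <- start
  for (t in 2:n_steps) {
    # The next state depends only on chain[t - 1] -- the Markov property.
    chain[t] <- sample(colnames(P), size = 1, prob = P[chain[t - 1], ])
  }
  chain
}

simulate_chain(P, n_steps = 10, start = "sunny")
```

Note that the only input to each draw is the current state's row of the transition matrix; the earlier history of the chain plays no role, which is exactly the contrast with the i.i.d. assumption described above.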
