4 Discrete-Time Markov Chains

4.1 Introduction

The discrete-time process {X_k, k = 0, 1, 2, ...} is called a Markov chain if, for all states i_0, i_1, ..., i_{k-2}, i, j, the following is true:

P[X_k = j | X_{k-1} = i, X_{k-2} = i_{k-2}, ..., X_0 = i_0] = P[X_k = j | X_{k-1} = i] = p_{ijk}   (4.1)

The quantity p_{ijk} is called the state-transition probability: the conditional probability that the process will be in state j at time k immediately after the next transition, given that it is in state i at time k - 1.
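As an illustration of this definition (not part of the original text), the sketch below simulates a simple time-homogeneous chain, i.e. one where the transition probability p_{ij} does not depend on the time index k. The three-state transition matrix `P` is a hypothetical example, and the function names are my own; the only property the code relies on is the Markov property of Equation (4.1): the next state is sampled from a distribution that depends only on the current state.

```python
import random

# Hypothetical 3-state transition matrix: P[i][j] = p_ij, the
# probability of moving from state i to state j in one step.
# Each row is a probability distribution and must sum to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(state, rng):
    """Sample the next state given only the current state
    (the Markov property: earlier history is irrelevant)."""
    u = rng.random()
    cumulative = 0.0
    for j, p in enumerate(P[state]):
        cumulative += p
        if u < cumulative:
            return j
    return len(P) - 1  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Generate a sample path X_0, X_1, ..., X_{n_steps}."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1], rng))
    return states

path = simulate(start=0, n_steps=10)
```

A path produced this way visits only the states 0, 1, 2, and the relative frequency of transitions i -> j converges to p_ij over long runs.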
