Markov Processes for Stochastic Modeling, 2nd Edition

Book description

Markov processes are stochastic processes with limited memory: their dependence on the past is only through the most recent state. They are used to model the behavior of many systems, including communications systems, transportation networks, image segmentation and analysis, biological systems and DNA sequence analysis, random atomic motion and diffusion in physics, social mobility, population studies, epidemiology, animal and insect migration, queueing systems, resource management, dams, financial engineering, actuarial science, and decision systems.
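The Markov property described above can be illustrated with a minimal sketch: a hypothetical two-state chain (the states, transition matrix, and seed below are illustrative, not from the book) in which each step is sampled using only the current state, never the earlier history.

```python
import random

# Hypothetical two-state chain: state 0 = "sunny", 1 = "rainy".
# Row i of P gives the distribution of the next state given current state i;
# this single row is all the "memory" a Markov chain carries.
P = [
    [0.9, 0.1],  # from sunny: stay sunny w.p. 0.9, turn rainy w.p. 0.1
    [0.5, 0.5],  # from rainy: clear up w.p. 0.5, stay rainy w.p. 0.5
]

def step(state, rng):
    """Sample the next state from row `state` of P."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, state=0, seed=42):
    """Generate a sample path; each step looks only at the current state."""
    rng = random.Random(seed)
    path = [state]
    for _ in range(n_steps):
        state = step(state, rng)
        path.append(state)
    return path

path = simulate(10_000)
frac_rainy = sum(path) / len(path)
```

For this transition matrix the stationary probability of state 1 solves pi_1 = 0.1 / (0.1 + 0.5) = 1/6, so a long simulated path should spend roughly 17% of its time in the rainy state.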

Covering a wide range of areas in which Markov processes are applied, this second edition has been revised to highlight the most important aspects as well as the most recent trends and applications of Markov processes. The author spent over 16 years in industry before returning to academia, and he has applied many of the principles covered in this book in multiple research projects. It is therefore an applications-oriented book that also includes enough theory to give the reader a solid grounding in the subject.



  • Presents both the theory and applications of the different aspects of Markov processes
  • Includes numerous solved examples as well as detailed diagrams that make it easier to understand the principle being presented
  • Discusses different applications of hidden Markov models, such as DNA sequence analysis and speech analysis

Table of contents

  1. Cover image
  2. Title page
  3. Table of Contents
  4. Copyright
  5. Acknowledgments
  6. Preface to the Second Edition
  7. Preface to the First Edition
  8. 1. Basic Concepts in Probability
    1. 1.1 Introduction
    2. 1.2 Random Variables
    3. 1.3 Transform Methods
    4. 1.4 Bivariate Random Variables
    5. 1.5 Many Random Variables
    6. 1.6 Fubini’s Theorem
    7. 1.7 Sums of Independent Random Variables
    8. 1.8 Some Probability Distributions
    9. 1.9 Limit Theorems
    10. 1.10 Problems
  9. 2. Basic Concepts in Stochastic Processes
    1. 2.1 Introduction
    2. 2.2 Classification of Stochastic Processes
    3. 2.3 Characterizing a Stochastic Process
    4. 2.4 Mean and Autocorrelation Function of a Stochastic Process
    5. 2.5 Stationary Stochastic Processes
    6. 2.6 Ergodic Stochastic Processes
    7. 2.7 Some Models of Stochastic Processes
    8. 2.8 Problems
  10. 3. Introduction to Markov Processes
    1. 3.1 Introduction
    2. 3.2 Structure of Markov Processes
    3. 3.3 Strong Markov Property
    4. 3.4 Applications of Discrete-Time Markov Processes
    5. 3.5 Applications of Continuous-Time Markov Processes
    6. 3.6 Applications of Continuous-State Markov Processes
    7. 3.7 Summary
  11. 4. Discrete-Time Markov Chains
    1. 4.1 Introduction
    2. 4.2 State-Transition Probability Matrix
    3. 4.3 State-Transition Diagrams
    4. 4.4 Classification of States
    5. 4.5 Limiting-State Probabilities
    6. 4.6 Sojourn Time
    7. 4.7 Transient Analysis of Discrete-Time Markov Chains
    8. 4.8 First Passage and Recurrence Times
    9. 4.9 Occupancy Times
    10. 4.10 Absorbing Markov Chains and the Fundamental Matrix
    11. 4.11 Reversible Markov Chains
    12. 4.12 Problems
  12. 5. Continuous-Time Markov Chains
    1. 5.1 Introduction
    2. 5.2 Transient Analysis
    3. 5.3 Birth and Death Processes
    4. 5.4 First Passage Time
    5. 5.5 The Uniformization Method
    6. 5.6 Reversible CTMCs
    7. 5.7 Problems
  13. 6. Markov Renewal Processes
    1. 6.1 Introduction
    2. 6.2 Renewal Processes
    3. 6.3 Renewal-Reward Process
    4. 6.4 Regenerative Processes
    5. 6.5 Markov Renewal Process
    6. 6.6 Semi-Markov Processes
    7. 6.7 Markov Regenerative Process
    8. 6.8 Markov Jump Processes
    9. 6.9 Problems
  14. 7. Markovian Queueing Systems
    1. 7.1 Introduction
    2. 7.2 Description of a Queueing System
    3. 7.3 The Kendall Notation
    4. 7.4 The Little’s Formula
    5. 7.5 The PASTA Property
    6. 7.6 The M/M/1 Queueing System
    7. 7.7 Examples of Other M/M Queueing Systems
    8. 7.8 M/G/1 Queue
    9. 7.9 G/M/1 Queue
    10. 7.10 M/G/1 Queues with Priority
    11. 7.11 Markovian Networks of Queues
    12. 7.12 Applications of Markovian Queues
    13. 7.13 Problems
  15. 8. Random Walk
    1. 8.1 Introduction
    2. 8.2 Occupancy Probability
    3. 8.3 Random Walk as a Markov Chain
    4. 8.4 Symmetric Random Walk as a Martingale
    5. 8.5 Random Walk with Barriers
    6. 8.6 Gambler’s Ruin
    7. 8.7 Random Walk with Stay
    8. 8.8 First Return to the Origin
    9. 8.9 First Passage Times for Symmetric Random Walk
    10. 8.10 The Ballot Problem and the Reflection Principle
    11. 8.11 Returns to the Origin and the Arc-Sine Law
    12. 8.12 Maximum of a Random Walk
    13. 8.13 Random Walk on a Graph
    14. 8.14 Correlated Random Walk
    15. 8.15 Continuous-Time Random Walk
    16. 8.16 Self-Avoiding Random Walk
    17. 8.17 Nonreversing Random Walk
    18. 8.18 Applications of Random Walk
    19. 8.19 Summary
    20. 8.20 Problems
  16. 9. Brownian Motion
    1. 9.1 Introduction
    2. 9.2 Mathematical Description
    3. 9.3 Brownian Motion with Drift
    4. 9.4 Brownian Motion as a Markov Process
    5. 9.5 Brownian Motion as a Martingale
    6. 9.6 First Passage Time of a Brownian Motion
    7. 9.7 Maximum of a Brownian Motion
    8. 9.8 First Passage Time in an Interval
    9. 9.9 The Brownian Bridge
    10. 9.10 Geometric Brownian Motion
    11. 9.11 Introduction to Stochastic Calculus
    12. 9.12 Solution of Stochastic Differential Equations
    13. 9.13 Solution of the Geometric Brownian Motion
    14. 9.14 The Ornstein–Uhlenbeck Process
    15. 9.15 Mean-Reverting OU Process
    16. 9.16 Fractional Brownian Motion
    17. 9.17 Fractional Gaussian Noise
    18. 9.18 Multifractional Brownian Motion
    19. 9.19 Problems
  17. 10. Diffusion Processes
    1. 10.1 Introduction
    2. 10.2 Mathematical Preliminaries
    3. 10.3 Models of Diffusion
    4. 10.4 Examples of Diffusion Processes
    5. 10.5 Correlated Random Walk and the Telegraph Equation
    6. 10.6 Introduction to Fractional Calculus
    7. 10.7 Anomalous (or Fractional) Diffusion
    8. 10.8 Problems
  18. 11. Levy Processes
    1. 11.1 Introduction
    2. 11.2 Generalized Central Limit Theorem
    3. 11.3 Stable Distribution
    4. 11.4 Levy Distribution
    5. 11.5 Levy Processes
    6. 11.6 Infinite Divisibility
    7. 11.7 Jump-Diffusion Processes
  19. 12. Markovian Arrival Processes
    1. 12.1 Introduction
    2. 12.2 Overview of Matrix-Analytic Methods
    3. 12.3 Markovian Arrival Process
    4. 12.4 Batch Markovian Arrival Process
    5. 12.5 Markov-Modulated Poisson Process
    6. 12.6 Markov-Modulated Bernoulli Process
    7. 12.7 Sample Applications of MAP and Its Derivatives
    8. 12.8 Problems
  20. 13. Controlled Markov Processes
    1. 13.1 Introduction
    2. 13.2 Markov Decision Processes
    3. 13.3 Semi-MDPs
    4. 13.4 Partially Observable MDPs
    5. 13.5 Problems
  21. 14. Hidden Markov Models
    1. 14.1 Introduction
    2. 14.2 HMM Basics
    3. 14.3 HMM Assumptions
    4. 14.4 Three Fundamental Problems
    5. 14.5 Solution Methods
    6. 14.6 Types of HMMs
    7. 14.7 HMMs with Silent States
    8. 14.8 Extensions of HMMs
    9. 14.9 Other Extensions of HMM
    10. 14.10 Problems
  22. 15. Markov Point Processes
    1. 15.1 Introduction
    2. 15.2 Temporal Point Processes
    3. 15.3 Specific Temporal Point Processes
    4. 15.4 Spatial Point Processes
    5. 15.5 Specific Spatial Point Processes
    6. 15.6 Spatial–Temporal Point Processes
    7. 15.7 Operations on Point Processes
    8. 15.8 Marked Point Processes
    9. 15.9 Introduction to Markov Random Fields
    10. 15.10 Markov Point Processes
    11. 15.11 Markov Marked Point Processes
    12. 15.12 Applications of Markov Point Processes
    13. 15.13 Problems
  23. References

Product information

  • Title: Markov Processes for Stochastic Modeling, 2nd Edition
  • Author(s): Oliver Ibe
  • Release date: May 2013
  • Publisher(s): Elsevier
  • ISBN: 9780124078390