Chapter 16

Probabilistic Graphical Models

Part II

Abstract

This chapter is the second of two dealing with probabilistic graphical models. Junction trees are first reviewed, and a message-passing algorithm for such structures is developed. The focus then turns to approximate inference techniques for graphical models based on variational methods, covering both local and global approximations. Dynamic graphical models are discussed with an emphasis on hidden Markov models (HMMs). Inference and training of HMMs are viewed as special cases of the message-passing rationale and the EM algorithm, and the Baum-Welch and Viterbi algorithms are derived. Finally, some extensions, including factorial HMMs and time-varying dynamic Bayesian networks, are presented. A discussion ...
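
As a rough illustration of the kind of recursion the chapter derives for HMM decoding, the sketch below implements log-domain Viterbi decoding for a discrete-observation HMM in NumPy. It is a minimal sketch only; the function name, argument names, and matrix conventions are assumptions for this example and are not taken from the chapter's derivation.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete HMM (log-domain sketch).

    obs : sequence of observation indices, length N
    pi  : (K,) initial state probabilities
    A   : (K, K) transition matrix, A[i, j] = P(next state j | state i)
    B   : (K, M) emission matrix,   B[i, m] = P(observation m | state i)
    """
    K, N = len(pi), len(obs)
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)

    delta = np.empty((N, K))       # best log-probability of any path ending in state k at step n
    psi = np.zeros((N, K), int)    # back-pointers used to recover the optimal path

    # Initialization: prior times emission of the first observation.
    delta[0] = log_pi + log_B[:, obs[0]]

    # Recursion: maximize over the previous state instead of summing (as in the forward pass).
    for n in range(1, N):
        scores = delta[n - 1][:, None] + log_A     # (K, K): entry [i, j] = path ending in i, then i -> j
        psi[n] = scores.argmax(axis=0)
        delta[n] = scores.max(axis=0) + log_B[:, obs[n]]

    # Backtracking from the best final state.
    path = np.empty(N, int)
    path[-1] = delta[-1].argmax()
    for n in range(N - 2, -1, -1):
        path[n] = psi[n + 1, path[n + 1]]
    return path, delta[-1].max()
```

A quick usage example with a hypothetical two-state model: `viterbi([0, 1, 1], pi=np.array([0.6, 0.4]), A=np.array([[0.7, 0.3], [0.4, 0.6]]), B=np.array([[0.9, 0.1], [0.2, 0.8]]))` returns the most probable state sequence and its log-probability. Working in the log domain avoids the numerical underflow that plagues long observation sequences.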
