Mastering Probabilistic Graphical Models Using Python

Book Description

Master probabilistic graphical models by learning through real-world problems and illustrative code examples in Python

About This Book

  • Gain in-depth knowledge of Probabilistic Graphical Models
  • Model time-series problems using Dynamic Bayesian Networks
  • Apply PGMs to real-world problems with this practical guide
Who This Book Is For

If you are a researcher, a machine learning enthusiast, or a data science practitioner with a basic idea of Bayesian Learning or Probabilistic Graphical Models, this book will help you understand the details of Graphical Models and use them in your data science problems. It will also help you select the appropriate model and the appropriate inference algorithm for your problem.

What You Will Learn

  • Get to know the basics of Probability theory and Graph Theory
  • Work with Markov Networks
  • Implement Bayesian Networks
  • Apply exact inference techniques in Graphical Models, such as the Variable Elimination algorithm
  • Understand approximate Inference Techniques in Graphical Models such as Message Passing Algorithms
  • Use sampling algorithms in Graphical Models
  • Grasp details of Naive Bayes with real-world examples
  • Deploy PGMs using various libraries in Python
  • Gain working details of Hidden Markov Models with real-world examples

In Detail

Probabilistic Graphical Models are a machine learning technique that uses concepts from graph theory to compactly represent high-dimensional probability distributions and reason about them efficiently. In real-world problems, it is often difficult to select both the appropriate graphical model and the appropriate inference algorithm, choices that can make a huge difference in computation time and accuracy. It is therefore crucial to know the working details of these algorithms.

This book starts with the basics of probability theory and graph theory, then goes on to discuss various models and inference algorithms. All the different types of models are covered, along with code examples to create and modify them and to run different inference algorithms on them. A complete chapter is devoted to the most widely used models, the Naive Bayes model and Hidden Markov Models (HMMs), which are discussed thoroughly using real-world examples.
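As a small taste of what "compactly represent" means here, the following plain-Python sketch builds a hypothetical three-variable chain network A → B → C over binary variables (all probability values are made up for illustration). The network stores only the small conditional probability distributions (CPDs), yet the chain rule P(A, B, C) = P(A)·P(B | A)·P(C | B) recovers the full eight-entry joint distribution:

```python
from itertools import product

# Hypothetical chain A -> B -> C over binary variables.
# CPD values below are assumed numbers, for illustration only.
p_a = {0: 0.6, 1: 0.4}                      # P(A)
p_b_given_a = {0: {0: 0.7, 1: 0.3},         # P(B | A)
               1: {0: 0.2, 1: 0.8}}
p_c_given_b = {0: {0: 0.9, 1: 0.1},         # P(C | B)
               1: {0: 0.5, 1: 0.5}}

# The network stores 2 + 4 + 4 = 10 numbers (5 free parameters),
# yet it defines the full 8-entry joint via the chain rule:
#   P(A, B, C) = P(A) * P(B | A) * P(C | B)
joint = {(a, b, c): p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]
         for a, b, c in product([0, 1], repeat=3)}

# A valid distribution must sum to 1.
print(round(sum(joint.values()), 10))

# Marginal P(C = 1), obtained by summing out A and B.
p_c1 = sum(p for (a, b, c), p in joint.items() if c == 1)
print(round(p_c1, 10))
```

With more variables the gap widens quickly: a full joint over n binary variables needs 2^n entries, while a chain-structured network needs only a handful of small CPDs. The book's examples use the pgmpy library rather than hand-built dictionaries, but the underlying factorization idea is the same.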

Style and approach

An easy-to-follow guide to help you understand Probabilistic Graphical Models through simple explanations and numerous code examples, with an emphasis on the more widely used models.

Downloading the example code for this book: you can download the example code files for all Packt books you have purchased from your account at http://www.PacktPub.com. If you purchased this book elsewhere, you can visit http://www.PacktPub.com/support and register to have the files e-mailed directly to you.

Table of Contents

    1. Mastering Probabilistic Graphical Models Using Python
      1. Table of Contents
      2. Mastering Probabilistic Graphical Models Using Python
      3. Credits
      4. About the Authors
      5. About the Reviewers
      6. www.PacktPub.com
        1. Support files, eBooks, discount offers, and more
          1. Why subscribe?
          2. Free access for Packt account holders
      7. Preface
        1. What this book covers
        2. What you need for this book
        3. Who this book is for
        4. Conventions
        5. Reader feedback
        6. Customer support
          1. Downloading the example code
          2. Downloading the color images of this book
          3. Errata
          4. Piracy
          5. Questions
      8. 1. Bayesian Network Fundamentals
        1. Probability theory
          1. Random variable
          2. Independence and conditional independence
        2. Installing tools
          1. IPython
          2. pgmpy
        3. Representing independencies using pgmpy
        4. Representing joint probability distributions using pgmpy
        5. Conditional probability distribution
          1. Representing CPDs using pgmpy
        6. Graph theory
          1. Nodes and edges
          2. Walk, paths, and trails
        7. Bayesian models
          1. Representation
          2. Factorization of a distribution over a network
          3. Implementing Bayesian networks using pgmpy
            1. Bayesian model representation
          4. Reasoning pattern in Bayesian networks
          5. D-separation
            1. Direct connection
            2. Indirect connection
        8. Relating graphs and distributions
          1. IMAP
          2. IMAP to factorization
        9. CPD representations
          1. Deterministic CPDs
          2. Context-specific CPDs
            1. Tree CPD
            2. Rule CPD
        10. Summary
      9. 2. Markov Network Fundamentals
        1. Introducing the Markov network
          1. Parameterizing a Markov network – factor
            1. Factor operations
          2. Gibbs distributions and Markov networks
        2. The factor graph
        3. Independencies in Markov networks
        4. Constructing graphs from distributions
        5. Bayesian and Markov networks
          1. Converting Bayesian models into Markov models
          2. Converting Markov models into Bayesian models
          3. Chordal graphs
        6. Summary
      10. 3. Inference – Asking Questions to Models
        1. Inference
          1. Complexity of inference
        2. Variable elimination
          1. Analysis of variable elimination
          2. Finding elimination ordering
            1. Using the chordal graph property of induced graphs
            2. Minimum fill/size/weight/search
        3. Belief propagation
          1. Clique tree
          2. Constructing a clique tree
          3. Message passing
          4. Clique tree calibration
          5. Message passing with division
            1. Factor division
            2. Querying variables that are not in the same cluster
              1. MAP inference
        4. MAP using variable elimination
        5. Factor maximization
        6. MAP using belief propagation
        7. Finding the most probable assignment
        8. Predictions from the model using pgmpy
        9. A comparison of variable elimination and belief propagation
        10. Summary
      11. 4. Approximate Inference
        1. The optimization problem
        2. The energy function
        3. Exact inference as an optimization
        4. The propagation-based approximation algorithm
          1. Cluster graph belief propagation
          2. Constructing cluster graphs
            1. Pairwise Markov networks
            2. Bethe cluster graph
        5. Propagation with approximate messages
          1. Message creation
          2. Inference with approximate messages
            1. Sum-product expectation propagation
            2. Belief update propagation
              1. MAP inference
        6. Sampling-based approximate methods
        7. Forward sampling
        8. Conditional probability distribution
        9. Likelihood weighting and importance sampling
        10. Importance sampling
        11. Importance sampling in Bayesian networks
          1. Computing marginal probabilities
          2. Ratio likelihood weighting
          3. Normalized likelihood weighting
        12. Markov chain Monte Carlo methods
        13. Gibbs sampling
          1. Markov chains
        14. The multiple transitioning model
        15. Using a Markov chain
        16. Collapsed particles
        17. Collapsed importance sampling
        18. Summary
      12. 5. Model Learning – Parameter Estimation in Bayesian Networks
        1. General ideas in learning
          1. The goals of learning
          2. Density estimation
          3. Predicting the specific probability values
          4. Knowledge discovery
        2. Learning as an optimization
          1. Empirical risk and overfitting
        3. Discriminative versus generative training
          1. Learning task
            1. Model constraints
            2. Data observability
        4. Parameter learning
          1. Maximum likelihood estimation
          2. Maximum likelihood principle
          3. The maximum likelihood estimate for Bayesian networks
        5. Bayesian parameter estimation
          1. Priors
          2. Bayesian parameter estimation for Bayesian networks
        6. Structure learning in Bayesian networks
          1. Methods for the learning structure
          2. Constraint-based structure learning
            1. Structure score learning
            2. The likelihood score
            3. The Bayesian score
        7. The Bayesian score for Bayesian networks
        8. Summary
      13. 6. Model Learning – Parameter Estimation in Markov Networks
        1. Maximum likelihood parameter estimation
          1. Likelihood function
            1. Log-linear model
            2. Gradient ascent
          2. Learning with approximate inference
            1. Belief propagation and pseudo-moment matching
          3. Structure learning
            1. Constraint-based structure learning
            2. Score-based structure learning
            3. The likelihood score
            4. Bayesian score
        2. Summary
      14. 7. Specialized Models
        1. The Naive Bayes model
          1. Why does it even work?
          2. Types of Naive Bayes models
            1. Multivariate Bernoulli Naive Bayes model
            2. Multinomial Naive Bayes model
            3. Choosing the right model
        2. Dynamic Bayesian networks
          1. Assumptions
            1. Discrete timeline assumption
            2. The Markov assumption
            3. Model representation
        3. The Hidden Markov model
          1. Generating an observation sequence
          2. Computing the probability of an observation
            1. The forward-backward algorithm
            2. Computing the state sequence
        4. Applications
          1. The acoustic model
          2. The language model
        5. Summary
      15. Index