Building Probabilistic Graphical Models with Python

Book Description

Solve machine learning problems using probabilistic graphical models implemented in Python with real-world applications

In Detail

With their increasing prominence in machine learning and data science applications, probabilistic graphical models (PGMs) have become a valuable tool for discovering and analyzing structure in complex problems. The variety of tools and algorithms under the PGM framework extends to many domains, such as natural language processing, speech processing, image processing, and disease diagnosis.

You've probably heard of graphical models before and are keen to explore this area of machine learning. This book gives you enough background to get started with graphical models, while keeping the math to a minimum.

What You Will Learn

  • Create Bayesian networks and make inferences
  • Learn the structure of causal Bayesian networks from data
  • Gain insight into the algorithms that run inference
  • Explore parameter estimation in Bayes nets with PyMC sampling
  • Understand the complexity of running inference algorithms in Bayes networks
  • Discover why graphical models can trump powerful classifiers in certain problems
Downloading the example code

You can download the example code files for all Packt books you have purchased from your account at http://www.PacktPub.com. If you purchased this book elsewhere, you can visit http://www.PacktPub.com/support and register to have the files e-mailed directly to you.
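As a small taste of the probabilistic reasoning covered in Chapter 1, here is a minimal sketch of Bayes' rule applied to a disease-diagnosis question. The numbers (prevalence, sensitivity, false-positive rate) are illustrative assumptions, not figures from the book.

```python
def bayes_rule(prior, likelihood, evidence):
    """P(H | E) = P(E | H) * P(H) / P(E)"""
    return likelihood * prior / evidence

# Hypothetical setup: a disease with 1% prevalence, a test that is
# 90% sensitive (P(positive | disease)) and has a 5% false-positive
# rate (P(positive | no disease)).
p_disease = 0.01
p_pos_given_disease = 0.90
p_pos_given_healthy = 0.05

# Total probability of a positive test, marginalizing over disease status
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

posterior = bayes_rule(p_disease, p_pos_given_disease, p_pos)
print(round(posterior, 3))  # P(disease | positive test) -> 0.154
```

Even with a positive test, the posterior probability of disease is only about 15%, because the low prior dominates; reasoning of this kind, scaled up to networks of many variables, is what the rest of the book develops.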

    Table of Contents

    1. Building Probabilistic Graphical Models with Python
      1. Table of Contents
      2. Building Probabilistic Graphical Models with Python
      3. Credits
      4. About the Author
      5. About the Reviewers
      6. www.PacktPub.com
        1. Support files, eBooks, discount offers and more
          1. Why Subscribe?
          2. Free Access for Packt account holders
      7. Preface
        1. What this book covers
        2. What you need for this book
        3. Who this book is for
        4. Conventions
        5. Reader feedback
        6. Customer support
          1. Downloading the example code
          2. Errata
          3. Piracy
          4. Questions
      8. 1. Probability
        1. The theory of probability
        2. Goals of probabilistic inference
        3. Conditional probability
        4. The chain rule
        5. The Bayes rule
        6. Interpretations of probability
        7. Random variables
        8. Marginal distribution
        9. Joint distribution
        10. Independence
        11. Conditional independence
        12. Types of queries
          1. Probability queries
          2. MAP queries
        13. Summary
      9. 2. Directed Graphical Models
        1. Graph terminology
          1. Python digression
        2. Independence and independent parameters
        3. The Bayes network
          1. The chain rule
        4. Reasoning patterns
          1. Causal reasoning
          2. Evidential reasoning
          3. Inter-causal reasoning
        5. D-separation
          1. The D-separation example
          2. Blocking and unblocking a V-structure
        6. Factorization and I-maps
        7. The Naive Bayes model
          1. The Naive Bayes example
        8. Summary
      10. 3. Undirected Graphical Models
        1. Pairwise Markov networks
        2. The Gibbs distribution
        3. An induced Markov network
        4. Factorization
        5. Flow of influence
        6. Active trail and separation
        7. Structured prediction
          1. Problem of correlated features
          2. The CRF representation
          3. The CRF example
        8. The factorization-independence tango
        9. Summary
      11. 4. Structure Learning
        1. The structure learning landscape
        2. Constraint-based structure learning
          1. Part I
          2. Part II
          3. Part III
          4. Summary of constraint-based approaches
        3. Score-based learning
          1. The likelihood score
          2. The Bayesian information criterion score
          3. The Bayesian score
          4. Summary of score-based learning
        4. Summary
      12. 5. Parameter Learning
        1. The likelihood function
        2. Parameter learning example using MLE
        3. MLE for Bayesian networks
        4. Bayesian parameter learning example using MLE
        5. Data fragmentation
        6. Effects of data fragmentation on parameter estimation
        7. Bayesian parameter estimation
          1. An example of Bayesian methods for parameter learning
        8. Bayesian estimation for the Bayesian network
        9. Example of Bayesian estimation
        10. Summary
      13. 6. Exact Inference Using Graphical Models
        1. Complexity of inference
          1. Real-world issues
        2. Using the Variable Elimination algorithm
          1. Marginalizing factors that are not relevant
          2. Factor reduction to filter evidence
            1. Shortcomings of the brute-force approach
            2. Using the Variable Elimination approach
          3. Complexity of Variable Elimination
            1. Why does elimination ordering matter?
          4. Graph perspective
            1. Learning the induced width from the graph structure
              1. Why does induced width matter?
              2. Finding VE orderings
        3. The tree algorithm
          1. The four stages of the junction tree algorithm
          2. Using the junction tree algorithm for inference
            1. Stage 1.1 – moralization
            2. Stage 1.2 – triangulation
            3. Stage 1.3 – building the join tree
            4. Stage 2 – initializing potentials
            5. Stage 3 – message passing
        4. Summary
      14. 7. Approximate Inference Methods
        1. The optimization perspective
          1. Belief propagation in general graphs
          2. Creating a cluster graph to run LBP
          3. Message passing in LBP
        2. Steps in the LBP algorithm
          1. Improving the convergence of LBP
          2. Applying LBP to segment an image
            1. Understanding energy-based models
            2. Visualizing unary and pairwise factors on a 3 x 3 grid
            3. Creating a model for image segmentation
          3. Applications of LBP
        3. Sampling-based methods
          1. Forward sampling
          2. The accept-reject sampling method
          3. The Markov Chain Monte Carlo sampling process
            1. The Markov property
            2. The Markov chain
            3. Reaching a steady state
            4. Sampling using a Markov chain
          4. Gibbs sampling
            1. Steps in the Gibbs sampling procedure
            2. An example of Gibbs sampling
        4. Summary
      15. A. References
        1. Chapter 1
        2. Chapter 2
        3. Chapter 3
        4. Chapter 4
        5. Chapter 5
        6. Chapter 6
        7. Chapter 7
        8. Other references
      16. Index