Modeling and Reasoning with Bayesian Networks

Book Description

This book offers a thorough introduction to the formal foundations and practical applications of Bayesian networks. It provides an extensive discussion of techniques for building Bayesian networks that model real-world situations, including techniques for synthesizing models from design, learning models from data, and debugging models using sensitivity analysis. It also treats exact and approximate inference algorithms at both theoretical and practical levels. The treatment of exact algorithms covers the main inference paradigms based on elimination and conditioning and includes advanced methods for compiling Bayesian networks, time-space tradeoffs, and exploiting the local structure of massively connected networks. The treatment of approximate algorithms covers the main inference paradigms based on sampling and optimization and includes influential algorithms such as importance sampling, MCMC, and belief propagation. The author assumes very little background on the covered subjects, supplying in-depth discussions for theoretically inclined readers and enough practical detail to provide an algorithmic cookbook for the system developer.
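
To give a flavor of the subject matter, the following is a minimal sketch (not taken from the book; the network and its CPT values are hypothetical) of a two-node Bayesian network queried by brute-force enumeration over the joint distribution. The book develops far more efficient exact and approximate algorithms for such queries.

    # Minimal illustrative sketch: a two-node Bayesian network
    # Rain -> WetGrass with hypothetical CPT values, queried by
    # enumerating the joint distribution.

    # Prior P(Rain) and conditional P(WetGrass | Rain)
    p_rain = {True: 0.2, False: 0.8}
    p_wet_given_rain = {True:  {True: 0.9, False: 0.1},
                        False: {True: 0.2, False: 0.8}}

    def joint(rain, wet):
        """P(Rain = rain, WetGrass = wet) via the chain rule."""
        return p_rain[rain] * p_wet_given_rain[rain][wet]

    # Prior marginal P(WetGrass = true): sum out Rain.
    p_wet = sum(joint(r, True) for r in (True, False))

    # Posterior P(Rain = true | WetGrass = true) by conditioning.
    p_rain_given_wet = joint(True, True) / p_wet

    print(f"P(WetGrass=true)             = {p_wet:.3f}")            # 0.340
    print(f"P(Rain=true | WetGrass=true) = {p_rain_given_wet:.3f}")  # 0.529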

Table of Contents

  1. Cover
  2. Title Page
  3. Copyright Page
  4. Dedication
  5. Contents
  6. Preface
  7. 1: Introduction
    1. 1.1: Automated Reasoning
    2. 1.2: Degrees of Belief
    3. 1.3: Probabilistic Reasoning
    4. 1.4: Bayesian Networks
    5. 1.5: What is not Covered in this Book
  8. 2: Propositional Logic
    1. 2.1: Introduction
    2. 2.2: Syntax of Propositional Sentences
    3. 2.3: Semantics of Propositional Sentences
    4. 2.4: The Monotonicity of Logical Reasoning
    5. 2.5: Multivalued Variables
    6. 2.6: Variable Instantiations and Related Notations
    7. 2.7: Logical Forms
      1. Bibliographic Remarks
    8. 2.8: Exercises
  9. 3: Probability Calculus
    1. 3.1: Introduction
    2. 3.2: Degrees of Belief
    3. 3.3: Updating Beliefs
    4. 3.4: Independence
    5. 3.5: Further Properties of Beliefs
    6. 3.6: Soft Evidence
    7. 3.7: Continuous Variables as Soft Evidence
      1. Bibliographic Remarks
    8. 3.8: Exercises
  10. 4: Bayesian Networks
    1. 4.1: Introduction
    2. 4.2: Capturing Independence Graphically
    3. 4.3: Parameterizing the Independence Structure
    4. 4.4: Properties of Probabilistic Independence
    5. 4.5: A Graphical Test of Independence
    6. 4.6: More on DAGs and Independence
      1. Bibliographic Remarks
    7. 4.7: Exercises
    8. 4.8: Proofs
  11. 5: Building Bayesian Networks
    1. 5.1: Introduction
    2. 5.2: Reasoning with Bayesian Networks
    3. 5.3: Modeling with Bayesian Networks
    4. 5.4: Dealing with Large CPTs
    5. 5.5: The Significance of Network Parameters
      1. Bibliographic Remarks
    6. 5.6: Exercises
  12. 6: Inference by Variable Elimination
    1. 6.1: Introduction
    2. 6.2: The Process of Elimination
    3. 6.3: Factors
    4. 6.4: Elimination as a Basis for Inference
    5. 6.5: Computing Prior Marginals
    6. 6.6: Choosing an Elimination Order
    7. 6.7: Computing Posterior Marginals
    8. 6.8: Network Structure and Complexity
    9. 6.9: Query Structure and Complexity
    10. 6.10: Bucket Elimination
      1. Bibliographic Remarks
    11. 6.11: Exercises
    12. 6.12: Proofs
  13. 7: Inference by Factor Elimination
    1. 7.1: Introduction
    2. 7.2: Factor Elimination
    3. 7.3: Elimination Trees
    4. 7.4: Separators and Clusters
    5. 7.5: A Message-Passing Formulation
    6. 7.6: The Jointree Connection
    7. 7.7: The Jointree Algorithm: A Classical View
      1. Bibliographic Remarks
    8. 7.8: Exercises
    9. 7.9: Proofs
  14. 8: Inference by Conditioning
    1. 8.1: Introduction
    2. 8.2: Cutset Conditioning
    3. 8.3: Recursive Conditioning
    4. 8.4: Any-Space Inference
    5. 8.5: Decomposition Graphs
    6. 8.6: The Cache Allocation Problem
      1. Bibliographic Remarks
    7. 8.7: Exercises
    8. 8.8: Proofs
  15. 9: Models for Graph Decomposition
    1. 9.1: Introduction
    2. 9.2: Moral Graphs
    3. 9.3: Elimination Orders
    4. 9.4: Jointrees
    5. 9.5: Dtrees
    6. 9.6: Triangulated Graphs
      1. Bibliographic Remarks
    7. 9.7: Exercises
    8. 9.8: Lemmas
    9. 9.9: Proofs
  16. 10: Most Likely Instantiations
    1. 10.1: Introduction
    2. 10.2: Computing MPE Instantiations
    3. 10.3: Computing MAP Instantiations
      1. Bibliographic Remarks
    4. 10.4: Exercises
    5. 10.5: Proofs
  17. 11: The Complexity of Probabilistic Inference
    1. 11.1: Introduction
    2. 11.2: Complexity Classes
    3. 11.3: Showing Hardness
    4. 11.4: Showing Membership
    5. 11.5: Complexity of MAP on Polytrees
    6. 11.6: Reducing Probability of Evidence to Weighted Model Counting
    7. 11.7: Reducing MPE to W-MAXSAT
      1. Bibliographic Remarks
    8. 11.8: Exercises
    9. 11.9: Proofs
  18. 12: Compiling Bayesian Networks
    1. 12.1: Introduction
    2. 12.2: Circuit Semantics
    3. 12.3: Circuit Propagation
    4. 12.4: Circuit Compilation
      1. Bibliographic Remarks
    5. 12.5: Exercises
    6. 12.6: Proofs
  19. 13: Inference with Local Structure
    1. 13.1: Introduction
    2. 13.2: The Impact of Local Structure on Inference Complexity
    3. 13.3: CNF Encodings with Local Structure
    4. 13.4: Conditioning with Local Structure
    5. 13.5: Elimination with Local Structure
      1. Bibliographic Remarks
    6. 13.6: Exercises
  20. 14: Approximate Inference by Belief Propagation
    1. 14.1: Introduction
    2. 14.2: The Belief Propagation Algorithm
    3. 14.3: Iterative Belief Propagation
    4. 14.4: The Semantics of IBP
    5. 14.5: Generalized Belief Propagation
    6. 14.6: Joingraphs
    7. 14.7: Iterative Joingraph Propagation
    8. 14.8: Edge-Deletion Semantics of Belief Propagation
      1. Bibliographic Remarks
    9. 14.9: Exercises
    10. 14.10: Proofs
  21. 15: Approximate Inference by Stochastic Sampling
    1. 15.1: Introduction
    2. 15.2: Simulating a Bayesian Network
    3. 15.3: Expectations
    4. 15.4: Direct Sampling
    5. 15.5: Estimating a Conditional Probability
    6. 15.6: Importance Sampling
    7. 15.7: Markov Chain Simulation
      1. Bibliographic Remarks
    8. 15.8: Exercises
    9. 15.9: Proofs
  22. 16: Sensitivity Analysis
    1. 16.1: Introduction
    2. 16.2: Query Robustness
    3. 16.3: Query Control
      1. Bibliographic Remarks
    4. 16.4: Exercises
    5. 16.5: Proofs
  23. 17: Learning: The Maximum Likelihood Approach
    1. 17.1: Introduction
    2. 17.2: Estimating Parameters from Complete Data
    3. 17.3: Estimating Parameters from Incomplete Data
    4. 17.4: Learning Network Structure
    5. 17.5: Searching for Network Structure
      1. Bibliographic Remarks
    6. 17.6: Exercises
    7. 17.7: Proofs
  24. 18: Learning: The Bayesian Approach
    1. 18.1: Introduction
    2. 18.2: Meta-Networks
    3. 18.3: Learning with Discrete Parameter Sets
    4. 18.4: Learning with Continuous Parameter Sets
    5. 18.5: Learning Network Structure
      1. Bibliographic Remarks
    6. 18.6: Exercises
    7. 18.7: Proofs
  25. A Notation
  26. B Concepts from Information Theory
  27. C Fixed Point Iterative Methods
  28. D Constrained Optimization
  29. Bibliography
  30. Index