## Book Description

Familiarize yourself with probabilistic graphical models through real-world problems and illustrative code examples in R

• Use a probabilistic graphical model (PGM) as an expert system to make predictions

• Comprehend how your computer can learn Bayesian modeling to solve real-world problems

• Know how to prepare data and feed the models by using the appropriate algorithms from the appropriate R package

Who This Book Is For

This book is for anyone who has to deal with lots of data and draw conclusions from it, especially when the data is noisy or uncertain. Data scientists, machine learning enthusiasts, engineers, and those who are curious about the latest advances in machine learning will find PGMs interesting.

What You Will Learn

• Understand the concepts of PGM and which type of PGM to use for which problem

• Tune the model’s parameters and explore new models automatically

• Understand the basic principles of Bayesian models, from simple to advanced

• Transform the old linear regression model into a powerful probabilistic model

• Use standard industry models but with the power of PGM

• Understand the advanced models used throughout today's industry

• See how to compute posterior distribution with exact and approximate inference algorithms

In Detail

Probabilistic graphical models (PGMs, also known simply as graphical models) are a marriage between probability theory and graph theory. Generally, PGMs use a graph-based representation. Two branches of graphical representations of distributions are commonly used, namely Bayesian networks and Markov networks. R has many packages to implement graphical models.
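As a taste of what such a package looks like in practice, here is a minimal sketch of a hand-specified Bayesian network using the `bnlearn` package (one of several R packages for graphical models; the rain/sprinkler variables and probability values are purely illustrative):

```r
# Assumes install.packages("bnlearn") has been run.
library(bnlearn)

# Define the graph structure: Rain and Sprinkler are parents of WetGrass.
dag <- model2network("[Rain][Sprinkler][WetGrass|Rain:Sprinkler]")

# Conditional probability tables, specified by hand.
cpt_rain <- matrix(c(0.2, 0.8), ncol = 2,
                   dimnames = list(NULL, c("yes", "no")))
cpt_sprinkler <- matrix(c(0.3, 0.7), ncol = 2,
                        dimnames = list(NULL, c("on", "off")))
cpt_wet <- c(0.99, 0.01, 0.90, 0.10, 0.80, 0.20, 0.0, 1.0)
dim(cpt_wet) <- c(2, 2, 2)
dimnames(cpt_wet) <- list(WetGrass = c("yes", "no"),
                          Rain = c("yes", "no"),
                          Sprinkler = c("on", "off"))

# Attach the tables to the graph, giving a fully specified PGM.
fit <- custom.fit(dag, dist = list(Rain = cpt_rain,
                                   Sprinkler = cpt_sprinkler,
                                   WetGrass = cpt_wet))
fit
```

The graph encodes which variables directly influence which, and the tables quantify those influences; together they define the joint distribution, which is the core idea the book develops.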

We’ll start by showing you how to transform a classical statistical model into a modern PGM and then look at how to do exact inference in graphical models. From there, we’ll introduce you to many modern R packages that will help you to perform inference on the models. We will then run a Bayesian linear regression and you’ll see the advantage of going probabilistic when you want to do prediction.
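The "going probabilistic" idea can be sketched in base R with a conjugate Gaussian prior on the regression coefficients (the data, noise variance, and prior variance below are illustrative assumptions, not taken from the book):

```r
set.seed(42)

# Simulated data: y = 1.5 * x + 0.5 + noise
n <- 50
x <- runif(n, -2, 2)
y <- 1.5 * x + 0.5 + rnorm(n, sd = 0.3)

X <- cbind(1, x)        # design matrix with an intercept column
sigma2 <- 0.3^2         # noise variance, assumed known here
tau2 <- 10              # variance of the zero-mean Gaussian prior

# Conjugate posterior over coefficients: Normal(mu_post, Sigma_post)
Sigma_post <- solve(t(X) %*% X / sigma2 + diag(2) / tau2)
mu_post <- Sigma_post %*% t(X) %*% y / sigma2

# Prediction at a new point carries parameter uncertainty forward:
x_new <- c(1, 1.0)
pred_mean <- sum(x_new * mu_post)
pred_var <- sigma2 + t(x_new) %*% Sigma_post %*% x_new
```

Unlike a point estimate from `lm()`, the posterior gives a full predictive distribution, so `pred_var` tells you how much to trust each prediction.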

Next, you’ll master using R packages and implementing their techniques. Finally, you’ll be presented with machine learning applications that have a direct impact in many fields. Here, we’ll cover clustering and the discovery of hidden information in big data, as well as two important methods, PCA and ICA, to reduce the size of big problems.

Style and approach

This book gives you a detailed and step-by-step explanation of each mathematical concept, which will help you build and analyze your own machine learning models and apply them to real-world problems. The mathematics is kept simple and each formula is explained thoroughly.

Downloading the example code for this book: you can download the example code files for all Packt books you have purchased from your account at http://www.PacktPub.com. If you purchased this book elsewhere, you can visit http://www.PacktPub.com/support and register to have the code files emailed directly to you.

## Table of Contents

1. Learning Probabilistic Graphical Models in R
3. Credits
6. www.PacktPub.com
    1. eBooks, discount offers, and more
7. Preface
    1. What this book covers
    2. What you need for this book
    3. Who this book is for
    4. Conventions
    6. Customer support
8. 1. Probabilistic Reasoning
    1. Machine learning
    2. Representing uncertainty with probabilities
        1. Beliefs and uncertainty as probabilities
        2. Conditional probability
        3. Probability calculus and random variables
        4. Joint probability distributions
        5. Bayes' rule
    3. Probabilistic graphical models
    4. Summary
9. 2. Exact Inference
    1. Building graphical models
        1. Types of random variable
        2. Building graphs
    2. Variable elimination
    4. The junction tree algorithm
    5. Examples of probabilistic graphical models
    6. Summary
10. 3. Learning Parameters
    1. Introduction
    2. Learning by inference
    3. Maximum likelihood
    4. Learning with hidden variables – the EM algorithm
    5. Principles of the EM algorithm
    6. Summary
11. 4. Bayesian Modeling – Basic Models
    1. The Naive Bayes model
    2. Beta-Binomial
    3. The Gaussian mixture model
    4. Summary
12. 5. Approximate Inference
    1. Sampling from a distribution
    2. Basic sampling algorithms
    3. Rejection sampling
    4. Importance sampling
    5. Markov Chain Monte-Carlo
    6. MCMC for probabilistic graphical models in R
    7. Summary
13. 6. Bayesian Modeling – Linear Models
    1. Linear regression
    2. Bayesian linear models
    3. Summary
14. 7. Probabilistic Mixture Models
    1. Mixture models
    2. EM for mixture models
    3. Mixture of Bernoulli
    4. Mixture of experts
    5. Latent Dirichlet Allocation
    6. Summary
15. A. Appendix
    1. References
16. Index