Summary

We began this chapter with the intuition that a good model is one that explains the data well and is also simple. Using this intuition, we discussed the problems of overfitting and underfitting that pervade statistical and machine-learning practice. We then formalized our intuitions by introducing the concepts of deviance and information criteria. We started with the rather unsophisticated AIC and its more Bayesian cousin, DIC. Then we learned about an improved version of both, WAIC. We also briefly discussed cross-validation and a way to approximate its results using the LOO method. We briefly revisited priors and hierarchical models in the light of the new ideas exposed in this chapter. Finally, we ...
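To make the WAIC idea concrete, the following is a minimal sketch (not the book's code) of how WAIC can be computed from a matrix of pointwise log-likelihoods, one row per posterior sample and one column per observation; the synthetic `log_lik` matrix here is a hypothetical stand-in for values you would obtain from a fitted model:

```python
import numpy as np


def waic(log_lik):
    """WAIC from an (S, N) matrix of pointwise log-likelihoods.

    S = number of posterior samples, N = number of observations.
    Returns the WAIC value (deviance scale) and the effective
    number of parameters p_waic.
    """
    # Log pointwise predictive density: average the likelihood
    # over posterior samples for each observation, then sum the logs.
    lppd = np.sum(np.log(np.mean(np.exp(log_lik), axis=0)))
    # Penalty term: variance of the log-likelihood across samples,
    # summed over observations.
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2 * (lppd - p_waic), p_waic


# Hypothetical log-likelihoods: 1000 posterior samples, 50 observations
rng = np.random.default_rng(0)
log_lik = rng.normal(loc=-1.0, scale=0.1, size=(1000, 50))
waic_value, penalty = waic(log_lik)
```

In practice you would not compute this by hand; libraries such as ArviZ expose `az.waic` and `az.loo`, which also report standard errors and diagnostics. Lower WAIC values indicate a model with better estimated out-of-sample predictive accuracy.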
