# Probabilistic Programming from Scratch 2: Bayes’ Theorem and Online Learning

## Mike Lee Williams explains the "Bayesian" in ABC, talks tough about prior and posterior distributions, and makes his ABC inference machine more Pythonic.

Previously, Fast Forward Labs' Mike Lee Williams introduced a probabilistic programming system built around the Approximate Bayesian Computation (ABC) algorithm. In this session, he explains what's "Bayesian" about ABC, formalizes some ideas around prior and posterior distributions, makes his ABC code more Pythonic, and extends it to the online setting, in which data arrives in multiple small batches.

What you will learn:

- Understand in detail how the ABC algorithm relates to Bayes' Theorem
- Hear some straight talk about the gnashing of teeth around priors
- Learn how reimplementing ABC with idiomatic Python and generators yields better code
- Watch how the reimplemented system handles data in the online setting
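The ideas in the list above can be sketched in a few lines of Python. The following is a minimal illustration, not Williams' actual code: ABC rejection sampling written as a generator (draw a candidate from the prior, simulate data under it, keep it only if the simulation reproduces the observation), with the online case handled by recycling one batch's posterior samples as the next batch's prior. The coin-flip model, function names, and batch sizes are all hypothetical choices for this sketch.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

def posterior_sampler(data, prior_sampler, simulate):
    """ABC rejection sampling as a generator: draw a candidate from the
    prior, simulate data under it, and yield the candidate only when the
    simulation exactly reproduces the observed data."""
    while True:
        theta = prior_sampler()
        if simulate(theta) == data:
            yield theta

def take(n, iterator):
    """Materialize n samples from a lazy, infinite generator."""
    return [next(iterator) for _ in range(n)]

# Hypothetical toy model: number of heads in 10 flips of a coin with bias p.
def simulate(p, n_flips=10):
    return sum(random.random() < p for _ in range(n_flips))

uniform_prior = lambda: random.random()  # flat prior over [0, 1]

# Batch 1: observe 4 heads in 10 flips.
batch1 = take(2000, posterior_sampler(4, uniform_prior, simulate))

# Online update: the first batch's posterior samples become the prior
# for the second batch, so evidence accumulates across batches.
batch2_prior = lambda: random.choice(batch1)
batch2 = take(2000, posterior_sampler(6, batch2_prior, simulate))
```

Because the sampler is a generator, it is lazy: nothing is computed until samples are requested with `take`, and chaining batches is just a matter of pointing the next prior at the previous batch's samples.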