Chapter Fourteen

Markov Chain Monte Carlo and Bayesian Statistics

We introduced Bayesian parameter estimation in Section 4.6 as a possible way to overcome some limitations of orthodox statistics. The essence of the approach can be summarized as follows:

posterior ∝ prior × likelihood,

where the prior collects the information, possibly of a subjective nature, that we have about a set of parameters before observing new data; the likelihood measures how likely what we observe is, on the basis of our current knowledge or belief; and the posterior merges these two pieces of information to provide us with an updated picture. This is an informal restatement of Eq. (4.12), which we rewrite below for convenience:

$$
p(\theta_1, \ldots, \theta_q \mid x_1, \ldots, x_n) \propto p(\theta_1, \ldots, \theta_q)\, L(x_1, \ldots, x_n \mid \theta_1, \ldots, \theta_q). \tag{14.1}
$$
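To make the update rule concrete before unpacking the notation, here is a minimal sketch in Python. It assumes a single parameter (the bias of a coin) restricted, purely for illustration, to three candidate values with a discrete prior; the candidate values, prior weights, and data below are all invented numbers, not taken from the text.

```python
import numpy as np

# Single parameter theta (coin bias) restricted, for illustration,
# to three candidate values with a discrete (PMF) prior.
thetas = np.array([0.25, 0.50, 0.75])   # candidate values (assumed)
prior = np.array([0.2, 0.6, 0.2])       # prior beliefs (assumed)

heads, n = 7, 10                        # illustrative data: 7 heads in 10 tosses
likelihood = thetas**heads * (1 - thetas)**(n - heads)

posterior = prior * likelihood          # prior × likelihood
posterior /= posterior.sum()            # normalize so beliefs sum to one
print(posterior)                        # updated beliefs about theta
```

Note that the proportionality constant never has to be known in advance: we multiply prior by likelihood pointwise and normalize at the end.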

In Eq. (14.1), the parameters $\theta_j$, $j = 1, \ldots, q$, are regarded as random variables within the Bayesian framework, $p(\theta_1, \ldots, \theta_q)$ is their prior, and $L(\cdot \mid \cdot)$ is the likelihood of the data. The prior can be a probability density function (PDF), a probability mass function (PMF), or a mixed object, depending on the nature of the parameters involved. For the sake of simplicity, in this chapter we will not deal systematically with both cases, and we will refer to either one according to convenience. We have a set of independent observations, i.e., realizations of a random variable taking values $x_i$, $i = 1, \ldots, n$. Note that here we are considering $n$ observations of a scalar random variable, whose distribution depends on the parameters $\theta_1, \ldots, \theta_q$.
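For a continuous parameter, the posterior in Eq. (14.1) can be tabulated on a grid. The following sketch assumes, purely for illustration, $n$ i.i.d. observations from a normal distribution with unknown mean $\theta$ (so $q = 1$) and known unit variance, together with a normal prior on $\theta$; the prior parameters and the synthetic data are assumptions of this example, not taken from the text.

```python
import numpy as np
from scipy import stats

# Grid approximation of Eq. (14.1): n i.i.d. scalar observations,
# assumed N(theta, 1) with a single unknown mean theta (q = 1),
# and an illustrative N(0, 2^2) prior density on theta.
rng = np.random.default_rng(42)
x = rng.normal(loc=1.0, scale=1.0, size=20)       # synthetic data, n = 20

grid = np.linspace(-4.0, 4.0, 2001)               # discretized parameter space
prior = stats.norm.pdf(grid, loc=0.0, scale=2.0)  # prior PDF on the grid

# Log-likelihood of the i.i.d. sample: sum over i of log f(x_i | theta).
log_lik = stats.norm.logpdf(x[:, None], loc=grid[None, :], scale=1.0).sum(axis=0)

# Prior × likelihood, stabilized in log space, then normalized to integrate to one.
unnorm = prior * np.exp(log_lik - log_lik.max())
dx = grid[1] - grid[0]
posterior = unnorm / (unnorm.sum() * dx)
print(grid[np.argmax(posterior)])                 # posterior mode, close to the sample mean
```

Direct tabulation of this kind quickly becomes infeasible as the number of parameters $q$ grows, which is precisely the difficulty that the Markov chain Monte Carlo methods of this chapter are meant to address.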
