We have introduced Bayesian parameter estimation in Section 4.6, as a possible way to overcome some limitations of orthodox statistics. The essence of the approach can be summarized as follows:

posterior ∝ prior × likelihood,

where the prior collects information, possibly of a subjective nature, that we have about a set of parameters before observing new data; the likelihood measures how likely what we observe is, on the basis of our current knowledge or belief; and the posterior merges these two pieces of information to provide us with an updated picture. This is an informal restatement of Eq. (4.12), which we rewrite below for convenience:

*p*(θ_{1}, …, θ_{q} | *x*_{1}, …, *x*_{n}) ∝ *p*(θ_{1}, …, θ_{q}) × ∏_{i=1}^{n} *p*(*x*_{i} | θ_{1}, …, θ_{q}).

The parameters θ_{j}, *j* = 1, …, *q*, are regarded as random variables within the Bayesian framework, and *p*(θ_{1}, …, θ_{q}) is their prior. The prior can be a probability density function (PDF), a probability mass function (PMF), or a mixed object, depending on the nature of the parameters involved. For the sake of simplicity, in this chapter we will not deal systematically with each of these cases, and we will refer to one or the other according to convenience. We have a set of independent observations, i.e., realizations of a random variable taking values *x*_{i}, *i* = 1, …, *n*. Note that here we are considering *n* observations of a scalar random variable, whose distribution depends on
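As a concrete numerical illustration of the update rule posterior ∝ prior × likelihood, the following sketch (not taken from the text; the data are hypothetical) approximates the posterior of a single Bernoulli parameter θ on a grid of candidate values, starting from a uniform prior:

```python
# Sketch: grid approximation of posterior ∝ prior × likelihood
# for a single Bernoulli parameter θ (q = 1). Data are hypothetical.

# Grid of candidate values for θ.
grid = [k / 100 for k in range(1, 100)]

# Uniform (flat) prior p(θ) on the grid.
prior = [1.0 for _ in grid]

# n = 10 independent Bernoulli observations (hypothetical).
x = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]

def likelihood(theta, data):
    """Product over i of p(x_i | θ), using independence of the x_i."""
    L = 1.0
    for xi in data:
        L *= theta if xi == 1 else (1.0 - theta)
    return L

# Unnormalized posterior on the grid, then normalize to sum to one.
unnorm = [p * likelihood(t, x) for p, t in zip(prior, grid)]
Z = sum(unnorm)
posterior = [u / Z for u in unnorm]

# Posterior mean of θ; with a flat prior and 7 successes out of 10,
# this approximates the mean of a Beta(8, 4) density, i.e., 2/3.
post_mean = sum(t * p for t, p in zip(grid, posterior))
print(post_mean)
```

With a flat prior the posterior is driven entirely by the likelihood; a more informative prior would simply reweight the grid before normalization.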
