2.1 Nature of Bayesian inference
2.1.1 Preliminary remarks
In this section, a general framework for Bayesian statistical inference will be provided. In broad outline, we take prior beliefs about various possible hypotheses and then modify these prior beliefs in the light of relevant data which we have collected in order to arrive at posterior beliefs. (The reader may prefer to return to this section after reading Section 2.2, which deals with one of the simplest special cases of Bayesian inference.)
2.1.2 Post is prior times likelihood
Almost all of the situations we will think of in this book fit into the following pattern. Suppose that you are interested in the values of k unknown quantities

    θ = (θ₁, θ₂, …, θₖ)

(where k can be one or more than one) and that you have some a priori beliefs about their values which you can express in terms of the pdf

    p(θ).
Now suppose that you then obtain some data relevant to their values. More precisely, suppose that you have n observations

    X = (X₁, X₂, …, Xₙ)

which have a probability distribution that depends on these k unknown quantities as parameters, so that the pdf (continuous or discrete) of the vector X depends on the vector θ in a known way. Usually the components of θ and X will be integers ...
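The pattern described above can be sketched numerically. The following is a minimal illustration, not anything from the text itself: it takes the simplest possible case of a single unknown quantity θ, the success probability of a Bernoulli model, places a prior over a grid of θ values, multiplies it pointwise by the likelihood of some observed data, and normalizes to obtain a posterior. All function and variable names here are hypothetical choices for the sketch.

```python
# A numerical sketch of "posterior is prior times likelihood" for one
# unknown quantity theta, the success probability of a Bernoulli model.
# Names and the grid approximation are illustrative assumptions.

def posterior_on_grid(grid, prior, data):
    """Multiply the prior by the likelihood pointwise, then normalize."""
    unnorm = []
    for theta, p in zip(grid, prior):
        like = 1.0
        for x in data:                       # Bernoulli likelihood of each obs
            like *= theta if x == 1 else (1.0 - theta)
        unnorm.append(p * like)
    total = sum(unnorm)                      # normalizing constant
    return [u / total for u in unnorm]

# Uniform prior on a grid of theta values; then observe 7 successes in 10.
grid = [i / 100 for i in range(1, 100)]
prior = [1.0 / len(grid)] * len(grid)
data = [1] * 7 + [0] * 3
post = posterior_on_grid(grid, prior, data)

# With a flat prior, the posterior peaks at the grid point nearest the
# observed success frequency, theta = 0.7.
mode = grid[post.index(max(post))]
```

With a uniform prior the posterior is proportional to the likelihood alone, so its mode sits at the observed frequency; a non-uniform prior would pull that mode toward the a priori more plausible values, which is exactly the modification of beliefs the section describes.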