Bayesian Estimation and Tracking: A Practical Guide by Anton J. Haug

3.1 Bayesian Estimation

Bayesian estimation has as its objective the estimation of successive values of a parameter vector x given an observation vector z. As noted above, it is customary to treat both x and z as random vectors. For the parameter vector, the stochastic assumption is inherent in the equations governing the dynamics of the parameter, where unmodeled effects are added as random noise. For the observation vector, a stochastic nature is justified by the fact that there is always some random measurement noise. The random vector x is assumed to have a known prior density function $p(\mathbf{x})$. This prior density summarizes all that is known about the parameter vector prior to the availability of any observational data. If the true value of x were known, then the probability density of z would be given by the conditional density, or likelihood function, $p(\mathbf{z} \mid \mathbf{x})$, and the complete statistical properties of z would be known.

Once an experiment has been conducted and a realization of the random variable z is available, one can use Bayes' law to obtain the posterior conditional density of x:

(3.1)   $p(\mathbf{x} \mid \mathbf{z}) = \dfrac{p(\mathbf{z} \mid \mathbf{x})\, p(\mathbf{x})}{p(\mathbf{z})}$

Thus, within the Bayesian framework, the posterior density contains everything there is to know about the parameter vector x once the observation z is in hand: any Bayesian estimate of x is computed from this posterior density.
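The Bayes' law update of (3.1) can be carried out numerically by evaluating the prior and likelihood on a discretized grid of parameter values. The following is a minimal sketch, not from the book: the Gaussian prior, Gaussian likelihood, and observed value are illustrative assumptions chosen so the exact answer is known in closed form.

```python
import numpy as np

# Hypothetical scalar example: estimate x from a single noisy observation z.
# Assumed prior: x ~ N(0, 1).  Assumed likelihood: z | x ~ N(x, 0.5^2).
x_grid = np.linspace(-4.0, 4.0, 2001)   # discretized parameter axis
dx = x_grid[1] - x_grid[0]

def gaussian(u, mean, sigma):
    """Normal density evaluated at u."""
    return np.exp(-0.5 * ((u - mean) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

prior = gaussian(x_grid, 0.0, 1.0)      # p(x)
z = 0.8                                  # realized observation
likelihood = gaussian(z, x_grid, 0.5)   # p(z | x), viewed as a function of x

# Bayes' law (3.1): posterior = likelihood * prior / evidence
unnormalized = likelihood * prior
evidence = np.sum(unnormalized) * dx    # p(z) by numerical integration
posterior = unnormalized / evidence

post_mean = np.sum(x_grid * posterior) * dx
```

For this conjugate Gaussian pair the exact posterior mean is z * 1 / (1 + 0.25) = 0.64, so the grid result can be checked directly; the same normalize-the-product structure underlies the recursive filters developed in later chapters.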
