Bayesian Signal Processing: Classical, Modern and Particle Filtering Methods by James V. Candy

2 BAYESIAN ESTIMATION

2.1 INTRODUCTION

In this chapter we motivate the idea of Bayesian estimation from a probabilistic perspective; that is, we perform the required estimation using the underlying probability densities or mass functions. We start with the “batch” approach and then evolve to Bayesian sequential techniques. We discuss the most popular formulations: maximum a posteriori (MAP), maximum likelihood (ML), minimum variance (MV) or, equivalently, minimum mean-squared error (MMSE), and least-squares (LS) methods. Bayesian sequential techniques are then developed. The main idea is to establish the proper perspective for the subsequent chapters and construct a solid foundation for the techniques to follow.

2.2 BATCH BAYESIAN ESTIMATION

Suppose we are trying to estimate a random parameter X from data Y = y. The associated conditional density Pr(X|Y = y) is called the posterior density because the estimate is conditioned “after (post) the measurements” have been acquired. Estimators based on this a posteriori density are usually called Bayesian because they are constructed from Bayes’ theorem, since Pr(X|Y) is difficult to obtain directly. That is, Bayes’ rule is defined by

Pr(X|Y) = Pr(Y|X) Pr(X) / Pr(Y)        (2.1)

where Pr(X) is called the prior density (before the measurement), Pr(Y|X) is called the likelihood (how likely the measured data are for a given value of the parameter), and Pr(Y) is called the evidence (which normalizes the posterior to assure its integral is unity). Bayesian ...
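
To make (2.1) concrete, the following is a minimal numerical sketch (in Python, assuming NumPy and SciPy are available) of a Bayesian update for a discrete parameter: a coin’s heads-probability X restricted to three candidate values, updated by hypothetical coin-flip data Y. The scenario and all variable names are illustrative assumptions, not taken from the text.

```python
import numpy as np
from scipy.stats import binom

# Prior Pr(X): a uniform prior over three candidate heads-probabilities.
x_values = np.array([0.3, 0.5, 0.7])
prior = np.full(3, 1 / 3)

# Data Y = y: 7 heads observed in 10 independent flips (hypothetical data).
n_flips, n_heads = 10, 7

# Likelihood Pr(Y|X): probability of the observed data for each candidate X.
likelihood = binom.pmf(n_heads, n_flips, x_values)

# Evidence Pr(Y): the normalizing constant that makes the posterior sum to unity.
evidence = np.sum(likelihood * prior)

# Posterior Pr(X|Y) from Bayes' rule (2.1).
posterior = likelihood * prior / evidence

print(posterior)                        # ~ [0.023, 0.298, 0.679]
print(x_values[np.argmax(posterior)])   # MAP estimate: 0.7
```

Note that the MAP estimate discussed in this chapter is obtained here simply by selecting the candidate value that maximizes the posterior; for a continuous parameter the sums become integrals.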
