Preface

In the real world, systems designed to extract signals from noisy measurements are plagued by errors arising from the constraints of the sensors employed, from random disturbances and noise, and, probably most commonly, from the lack of precise knowledge of the underlying physical phenomenology generating the process in the first place! Methods capable of extracting the desired signal from hostile environments require approaches that capture all of the a priori information available and incorporate it into a processing scheme. This approach is typically model-based [1], employing mathematical representations of the component processes involved. However, the actual implementation of the algorithm evolves from the realm of statistical signal processing, using a Bayesian approach based on Bayes’ rule. Statistical signal processing is focused on the development of processors capable of extracting the desired information from noisy, uncertain measurement data. This is a text that develops the “Bayesian approach” to statistical signal processing for a variety of useful model sets. It features the next generation of processors, recently enabled by the advent of high-speed/high-throughput computers. The emphasis is on nonlinear/non-Gaussian problems, but classical techniques are included as special cases to enable the reader familiar with such methods to draw a parallel between the approaches. The common ground is the model sets. Here the state-space approach is emphasized ...
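For orientation only, a minimal sketch of the sequential Bayes’ rule update that underlies such state-space processors is given below; the state $x_t$, the measurement record $Y_t = \{y_1,\ldots,y_t\}$, and the density notation are illustrative assumptions, not necessarily the notation adopted in the text.

\[
  \Pr(x_t \mid Y_t) \;=\; \frac{\Pr(y_t \mid x_t)\,\Pr(x_t \mid Y_{t-1})}{\Pr(y_t \mid Y_{t-1})},
  \qquad
  \Pr(x_t \mid Y_{t-1}) \;=\; \int \Pr(x_t \mid x_{t-1})\,\Pr(x_{t-1} \mid Y_{t-1})\,dx_{t-1}.
\]

Read this way, the classical Kalman filter and the modern particle filter are simply different ways of propagating these prediction and correction densities for particular model sets.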
