12 Kernel Feature Extraction in Signal Processing

Kernel‐based feature extraction and dimensionality reduction are becoming increasingly important in advanced signal processing, particularly in applications dealing with very high‐dimensional data. Current methods tackle key problems in signal processing, from signal subspace identification to nonlinear blind source separation, as well as nonlinear transformations that maximize particular criteria, such as variance (kernel principal component analysis (KPCA)), covariance (kernel partial least squares (KPLS)), MSE (kernel orthonormalized PLS (KOPLS)), correlation (kernel canonical correlation analysis (KCCA)), mutual information (kernel generalized variance), or SNR (kernel SNR), to name a few. Kernel multivariate analysis (KMVA) has close links to kernel Fisher discriminant (KFD) analysis, as well as interesting relations to information‐theoretic learning. The application of these methods is hampered in two extreme cases: when few examples are available, the extracted features are either overfitted or meaningless, while in large‐scale settings the computational cost is prohibitive. Semi‐supervised and sparse learning have entered the field to alleviate these problems. Another field of intense activity is domain adaptation and manifold alignment, for which kernel feature extraction is currently being used. All these topics are the subject of this chapter. We will review the main kernel feature extraction and dimensionality reduction methods, dealing with supervised, unsupervised ...
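As a concrete illustration of the variance-maximizing case mentioned above, the following is a minimal KPCA sketch using NumPy: an RBF kernel matrix is built, centered in the (implicit) feature space, and eigendecomposed; the leading eigenvectors give the nonlinear features. The function names and the `gamma` kernel width are illustrative choices, not notation from this chapter.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared Euclidean distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kpca(X, n_components=2, gamma=1.0):
    """Kernel PCA sketch: extract nonlinear features that maximize
    variance in the kernel-induced feature space."""
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix in feature space: Kc = (I - 1/n) K (I - 1/n)
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose the centered kernel; keep the largest eigenvalues
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so feature-space directions have unit norm
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    # Projections of the training samples onto the principal directions
    return Kc @ alphas

X = np.random.RandomState(0).randn(50, 3)
Z = kpca(X, n_components=2, gamma=0.5)
```

The extracted columns of `Z` are mutually orthogonal, and their norms decrease with the corresponding eigenvalues, mirroring the variance ordering of linear PCA but computed entirely through the kernel matrix.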
