Abstract
Kernel principal component analysis (KPCA) forms the basis for a class of methods commonly used for denoising a set of multivariate observations. Most KPCA algorithms involve two steps: projection and preimage approximation. We argue that this two-step procedure can be inefficient and result in poor denoising. We propose an alternative projection-free KPCA denoising approach that does not involve the usual projection and subsequent preimage approximation steps. In order to denoise an observation, our approach performs a single line search along the gradient descent direction of the squared projection error. The rationale is that this moves an observation towards the underlying manifold that represents the noiseless data in the most direct manner possible. We demonstrate that the approach is simple, computationally efficient, robust, and sometimes provides substantially better denoising than the standard KPCA algorithm.
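Below is a minimal illustrative sketch of the idea described in the abstract, not the authors' exact implementation: a noisy point is moved along the negative gradient of its squared feature-space projection error, with the step size chosen by a single one-dimensional line search. It assumes a Gaussian RBF kernel, omits kernel centering for brevity, and the helper names (`rbf_kernel`, `fit_kpca`, `projection_error`, `denoise`) and parameters are assumptions made for this example only.

```python
# Hedged sketch of projection-free KPCA denoising (illustrative, not the paper's code).
import numpy as np
from scipy.optimize import minimize_scalar


def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix between row-wise point sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))


def fit_kpca(X, q, sigma):
    """Top-q kernel principal directions (uncentered KPCA, kept simple for the sketch)."""
    K = rbf_kernel(X, X, sigma)
    eigval, eigvec = np.linalg.eigh(K)
    idx = np.argsort(eigval)[::-1][:q]
    # Scale coefficients so each feature-space eigenvector has unit norm: alpha_j' K alpha_j = 1.
    return eigvec[:, idx] / np.sqrt(eigval[idx])


def projection_error(x, X, alpha, sigma):
    """Squared distance from phi(x) to its projection onto the KPCA subspace."""
    kx = rbf_kernel(x[None, :], X, sigma).ravel()   # k(x, x_i) for all training points
    beta = alpha.T @ kx                             # coordinates on the principal directions
    return 1.0 - np.sum(beta**2)                    # k(x, x) = 1 for the RBF kernel


def error_gradient(x, X, alpha, sigma):
    """Gradient of the squared projection error with respect to the input point x."""
    kx = rbf_kernel(x[None, :], X, sigma).ravel()
    beta = alpha.T @ kx
    # d k(x, x_i) / dx = -(x - x_i) / sigma^2 * k(x, x_i)
    dk = -(x[None, :] - X) / sigma**2 * kx[:, None]
    return -2.0 * dk.T @ (alpha @ beta)


def denoise(x, X, alpha, sigma, max_step=5.0):
    """Single line search along the steepest-descent direction of the projection error."""
    g = error_gradient(x, X, alpha, sigma)
    d = -g / (np.linalg.norm(g) + 1e-12)
    res = minimize_scalar(lambda t: projection_error(x + t * d, X, alpha, sigma),
                          bounds=(0.0, max_step), method="bounded")
    return x + res.x * d
```

Under these assumptions, usage would look like `alpha = fit_kpca(X_train, q=4, sigma=1.0)` followed by `x_hat = denoise(x_noisy, X_train, alpha, sigma=1.0)`; the point is that no preimage approximation step is needed, since the denoised point is obtained directly in input space.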
| Original language | English (US) |
|---|---|
| Pages (from-to) | 163-176 |
| Number of pages | 14 |
| Journal | Neurocomputing |
| Volume | 357 |
| State | Published - Sep 10 2019 |
Funding
This work was supported by NSF Grant #CMMI-1265709. Anh Tuan Bui was also supported by the Vietnam Education Foundation. The authors thank the anonymous referees for their careful reviews and helpful comments.
Keywords
- Feature space
- Image processing
- Pattern recognition
- Preimage problem
ASJC Scopus subject areas
- Computer Science Applications
- Cognitive Neuroscience
- Artificial Intelligence