Three iteratively reweighted least squares algorithms for L1-norm principal component analysis

Young Woong Park*, Diego Klabjan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Principal component analysis (PCA) is often used to reduce the dimension of data by selecting a few orthonormal vectors that explain most of the variance structure of the data. L1 PCA uses the L1 norm to measure error, whereas conventional PCA uses the L2 norm. For the L1 PCA problem of minimizing the fitting error of the reconstructed data, we propose three algorithms based on iteratively reweighted least squares. We first develop an exact reweighted algorithm. Next, an approximate version is developed based on eigenpair approximation when the algorithm is near convergence. Finally, the approximate version is extended using stochastic singular value decomposition. We provide convergence analyses and compare the performance of the algorithms against benchmark algorithms in the literature. The computational experiments show that the proposed algorithms consistently perform best and that scalability improves with the eigenpair approximation and stochastic singular value decomposition.
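To illustrate the general iteratively reweighted least squares idea behind this family of methods, the sketch below alternates between a weighted L2 subproblem (solved via SVD of the row-weighted data) and a reweighting step that downweights observations with large L1 reconstruction error. This is a minimal illustrative sketch only, not the paper's exact algorithms; the function name `irls_l1_pca` and its parameters are hypothetical, and the stopping rule is a fixed iteration count for simplicity.

```python
import numpy as np

def irls_l1_pca(X, k, n_iter=50, eps=1e-8):
    """Generic IRLS-style sketch for L1-norm PCA (illustrative only).

    X      : (n, d) data matrix, assumed centered
    k      : number of principal components
    n_iter : maximum number of reweighting iterations
    eps    : small constant to avoid division by zero
    """
    n, d = X.shape
    w = np.ones(n)  # one weight per observation
    V = None
    for _ in range(n_iter):
        # Weighted L2 subproblem: the top-k right singular vectors of the
        # row-weighted data give the current subspace estimate.
        Xw = X * np.sqrt(w)[:, None]
        _, _, Vt = np.linalg.svd(Xw, full_matrices=False)
        V = Vt[:k].T                      # (d, k) orthonormal basis
        R = X - X @ V @ V.T               # reconstruction residuals
        r = np.abs(R).sum(axis=1)         # L1 residual per observation
        w = 1.0 / np.maximum(r, eps)      # reweight: large errors downweighted
    return V

# Usage example on random data (purely illustrative)
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
X -= X.mean(axis=0)
V = irls_l1_pca(X, k=2)
print(np.abs(X - X @ V @ V.T).sum())      # L1 reconstruction error
```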

Original language: English (US)
Pages (from-to): 541-565
Number of pages: 25
Journal: Knowledge and Information Systems
Volume: 54
Issue number: 3
DOIs
State: Published - Mar 1 2018

Keywords

  • Iteratively reweighted least squares
  • L1 PCA
  • Stochastic singular value decomposition (SVD)

ASJC Scopus subject areas

  • Software
  • Information Systems
  • Human-Computer Interaction
  • Hardware and Architecture
  • Artificial Intelligence

