Low-rank and sparse structure pursuit via alternating minimization

Quanquan Gu, Zhaoran Wang, Han Liu

Research output: Contribution to conference › Paper › peer-review

58 Scopus citations

Abstract

In this paper, we present a nonconvex alternating minimization algorithm for low-rank and sparse structure pursuit. Compared with convex relaxation based methods, the proposed algorithm is computationally more efficient for large-scale problems. In our study, we define a notion of bounded difference of gradients, based on which we rigorously prove that, with suitable initialization, the proposed nonconvex optimization algorithm enjoys linear convergence to the global optimum and exactly recovers the underlying low-rank and sparse matrices under standard conditions such as incoherence and sparsity. For a wide range of statistical models, such as multi-task learning and robust principal component analysis (RPCA), our algorithm provides a principled approach to learning the low-rank and sparse structures with provable guarantees. Thorough experiments on both synthetic and real datasets back up our theory.
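
To make the alternating scheme the abstract refers to concrete, below is a minimal sketch of alternating minimization for the RPCA-style decomposition M = L + S, where L is low-rank and S is sparse. It illustrates the general technique only: the function name alt_min_rpca, the fixed threshold thresh, and the zero initialization are assumptions made for this sketch and do not reproduce the paper's specific initialization or threshold schedule, on which its linear-convergence guarantee relies.

    import numpy as np

    def alt_min_rpca(M, rank, thresh, n_iters=50):
        """Sketch: alternate a sparse step and a low-rank step to split
        M into a rank-`rank` part L and a sparse part S."""
        L = np.zeros_like(M)  # simple initialization, assumed for this sketch
        for _ in range(n_iters):
            # Sparse step: hard-threshold the residual entrywise
            # (fixed threshold here; the paper's schedule differs).
            R = M - L
            S = R * (np.abs(R) > thresh)
            # Low-rank step: best rank-`rank` approximation of M - S
            # via a truncated SVD.
            U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
            L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        return L, S

Each iteration costs one truncated SVD and one entrywise threshold, which is what makes this nonconvex route cheaper at scale than solving the convex nuclear-norm relaxation.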

Original language: English (US)
Pages: 600-609
Number of pages: 10
State: Published - Jan 1, 2016
Event: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 - Cadiz, Spain
Duration: May 9, 2016 - May 11, 2016

Conference

Conference: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016
Country/Territory: Spain
City: Cadiz
Period: 5/9/16 - 5/11/16

Funding

We would like to thank the anonymous reviewers for their helpful comments. Gu is supported by his startup funding at the Department of Systems and Information Engineering, University of Virginia. Wang and Liu are partially supported by the grants NSF IIS1408910, NSF IIS1332109, NIH R01MH102339, NIH R01GM083084, and NIH R01HG06841.

ASJC Scopus subject areas

  • Artificial Intelligence
  • Statistics and Probability
