Provable sparse tensor decomposition

Will Wei Sun, Junwei Lu, Han Liu, Guang Cheng*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

77 Scopus citations

Abstract

We propose a novel sparse tensor decomposition method, namely the tensor truncated power method, that incorporates variable selection in the estimation of decomposition components. The sparsity is achieved via an efficient truncation step embedded in the tensor power iteration. Our method applies to a broad family of high dimensional latent variable models, including high dimensional Gaussian mixtures and mixtures of sparse regressions. A thorough theoretical investigation is further conducted. In particular, we show that the final decomposition estimator is guaranteed to achieve a local statistical rate, and we further strengthen it to the global statistical rate by introducing a proper initialization procedure. In high dimensional regimes, the statistical rate obtained significantly improves on those of existing non-sparse decomposition methods. The empirical advantages of the tensor truncated power method are confirmed in extensive simulation results and in two real applications: click-through rate prediction and high dimensional gene clustering.
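The core idea — a truncation step embedded in the tensor power iteration — can be illustrated with a minimal sketch. The code below is an assumption-laden toy (synthetic data, a symmetric third-order rank-1 tensor, and a hypothetical sparsity level `s`), not the paper's implementation: each iteration computes the tensor-vector product T(I, v, v), keeps only the s largest-magnitude entries, and renormalizes.

```python
import numpy as np

def truncate(v, s):
    # keep the s largest-magnitude entries, zero the rest, renormalize
    w = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    w[idx] = v[idx]
    return w / np.linalg.norm(w)

def tensor_truncated_power(T, s, iters=50, seed=0):
    # truncated power iteration on a symmetric third-order tensor T
    d = T.shape[0]
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        # tensor-vector product: T(I, v, v)_i = sum_{j,k} T[i,j,k] v_j v_k
        v = np.einsum('ijk,j,k->i', T, v, v)
        v = truncate(v, s)  # truncation step enforcing sparsity
    return v

# hypothetical synthetic example: a sparse rank-1 tensor T = u ⊗ u ⊗ u
d, s = 30, 3
u = np.zeros(d)
u[[2, 7, 11]] = 1.0 / np.sqrt(3)  # sparse unit vector with 3 nonzeros
T = np.einsum('i,j,k->ijk', u, u, u)

v = tensor_truncated_power(T, s)
print(abs(v @ u))  # close to 1: the sparse component is recovered
```

In this noiseless example the truncation exactly recovers the support of `u`; the paper's analysis concerns the noisy high dimensional regime, where the truncation is what yields the improved statistical rate.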

Original language: English (US)
Pages (from-to): 899-916
Number of pages: 18
Journal: Journal of the Royal Statistical Society. Series B: Statistical Methodology
Volume: 79
Issue number: 3
DOIs
State: Published - Jun 1 2017

Funding

Part of this research work was conducted when Will Wei Sun was a doctoral candidate at Purdue University. Han Liu was partially supported by National Science Foundation Career award DMS-1454377, National Science Foundation grants IIS 1546482-BIGDATA, IIS1408910 and IIS1332109, and National Institutes of Health grants R01MH102339 and R01GM083084. Guang Cheng was partially supported by National Science Foundation Career award DMS-1151692 and DMS-1418042, a Simons Fellowship in mathematics and a grant from the Office of Naval Research. Guang Cheng was on sabbatical leave at Princeton University while part of this work was carried out; he thanks the Princeton Department of Operations Research and Financial Engineering for its hospitality. All the authors thank the Joint Editor, the Associate Editor and three reviewers for their helpful comments and suggestions which led to a much improved presentation.

Keywords

  • Global convergence
  • Latent variable models
  • Non-convex optimization
  • Sparsity
  • Tensor decomposition

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

