Penalized estimation of high-dimensional models under a generalized sparsity condition

Joel L. Horowitz, Jian Huang

Research output: Contribution to journal › Article

4 Scopus citations

Abstract

We consider estimation of a linear or nonparametric additive model in which a few coefficients or additive components are "large" and may be objects of substantive interest, whereas others are "small" but not necessarily zero. The number of small coefficients or additive components may exceed the sample size. It is not known which coefficients or components are large and which are small. The large coefficients or additive components can be estimated with a smaller mean-square error or integrated mean-square error if the small ones can be identified and the covariates associated with them dropped from the model. We give conditions under which several penalized least squares procedures distinguish correctly between large and small coefficients or additive components with probability approaching 1 as the sample size increases. The results of Monte Carlo experiments and an empirical example illustrate the benefits of our methods.
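The kind of selection step the abstract describes can be illustrated with a Lasso fit, one member of the family of penalized least squares procedures. The sketch below is illustrative only: the data-generating design (two "large" coefficients among many nonzero "small" ones), the noise level, and the tuning constant are assumptions chosen for the example, not the paper's exact conditions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 50

# Generalized sparsity: two large coefficients, the rest small but nonzero.
beta = np.concatenate([[3.0, -2.0], rng.normal(0.0, 0.01, p - 2)])
X = rng.standard_normal((n, p))
y = X @ beta + rng.standard_normal(n)

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding the j-th covariate.
            r_j = y - X @ b + X[:, j] * b[j]
            z = X[:, j] @ r_j
            # Soft-threshold: small signals are set exactly to zero.
            b[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
    return b

# A standard universal-threshold-style penalty level (sigma = 1 assumed).
lam = np.sqrt(2.0 * n * np.log(p))
b_hat = lasso_cd(X, y, lam)
selected = np.flatnonzero(np.abs(b_hat) > 1e-8)
print("selected indices:", selected)
```

In this simulated design, the two large coefficients are retained while nearly all small ones are thresholded to zero, which is the "distinguish large from small" behavior the abstract refers to; the paper's actual results cover several penalties and the nonparametric additive case as well.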

Original language: English (US)
Pages (from-to): 725-748
Number of pages: 24
Journal: Statistica Sinica
Volume: 23
Issue number: 2
DOIs
State: Published - Apr 1 2013

Keywords

  • High-dimensional data
  • Penalized regression
  • Variable selection

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

