Feature selection in high-dimensional classification

Mladen Kolar, Han Liu

Research output: Contribution to conference › Paper

13 Scopus citations

Abstract

High-dimensional discriminant analysis is of fundamental importance in multivariate statistics. Existing theoretical results sharply characterize different procedures, providing convergence rates for the classification risk as well as ℓ2 convergence of the estimated discriminant rule. However, sharp theoretical results for the problem of variable selection have not been established, even though model interpretation is of importance in many scientific domains. In this paper, we bridge this gap by providing sharp sufficient conditions for consistent variable selection using the ROAD estimator (Fan et al., 2010). Our results provide novel theoretical insights for the ROAD estimator. The sufficient conditions are complemented by necessary information-theoretic limits on variable selection in high-dimensional discriminant analysis. This complementary result also establishes optimality of the ROAD estimator for a certain family of problems.
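As a rough illustration of the kind of procedure the abstract studies, the sketch below implements a simplified ℓ1-penalised sparse discriminant surrogate (not the exact constrained ROAD formulation of Fan et al.): minimise 0.5·wᵀΣ̂w − wᵀμ̂_d + λ‖w‖₁ by coordinate descent, where μ̂_d is the difference of class means and Σ̂ the pooled covariance. Variable selection then reads off the support of w. Function and parameter names are illustrative assumptions, not from the paper.

```python
import numpy as np

def sparse_discriminant_select(X1, X2, lam=0.4, n_iter=200):
    """Simplified ROAD-style sparse discriminant (illustrative sketch):
    minimise 0.5 * w' Sigma w - w' mu_d + lam * ||w||_1 by coordinate
    descent; selected variables are the support of w."""
    mu_d = X1.mean(axis=0) - X2.mean(axis=0)           # mean difference
    Z = np.vstack([X1 - X1.mean(axis=0), X2 - X2.mean(axis=0)])
    Sigma = Z.T @ Z / (len(Z) - 2)                     # pooled covariance
    p = Sigma.shape[0]
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding coordinate j, then soft-threshold
            r = mu_d[j] - Sigma[j] @ w + Sigma[j, j] * w[j]
            w[j] = np.sign(r) * max(abs(r) - lam, 0.0) / Sigma[j, j]
    return w, np.flatnonzero(w)                        # weights, support

# Synthetic example: only the first 3 of 10 features separate the classes.
rng = np.random.default_rng(0)
p, n = 10, 500
mu1 = np.zeros(p)
mu1[:3] = 1.0
X1 = rng.normal(mu1, 1.0, size=(n, p))
X2 = rng.normal(0.0, 1.0, size=(n, p))
w, support = sparse_discriminant_select(X1, X2, lam=0.4)
```

With well-separated means and a moderate penalty, the recovered support coincides with the truly discriminative coordinates, which is exactly the variable-selection consistency the paper analyses.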

Original language: English (US)
Pages: 329-337
Number of pages: 9
State: Published - Jan 1 2013
Event: 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States
Duration: Jun 16 2013 - Jun 21 2013

Conference

Conference: 30th International Conference on Machine Learning, ICML 2013
Country: United States
City: Atlanta, GA
Period: 6/16/13 - 6/21/13

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Sociology and Political Science


Cite this

    Kolar, M., & Liu, H. (2013). Feature selection in high-dimensional classification. 329-337. Paper presented at 30th International Conference on Machine Learning, ICML 2013, Atlanta, GA, United States.