Sparse principal component analysis for high dimensional multivariate time series

Research output: Contribution to journal › Conference article

3 Citations (Scopus)

Abstract

We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyzing multivariate time series as if the data were i.i.d. generated. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d ≥ T), we provide explicit rates of convergence of the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper have impact on various applications, including financial time series, biomedical imaging, and social media.

Original language: English (US)
Pages (from-to): 48-56
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 31
State: Published - Jan 1 2013
Event: 16th International Conference on Artificial Intelligence and Statistics, AISTATS 2013 - Scottsdale, United States
Duration: Apr 29 2013 – May 1 2013


ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Statistics and Probability
  • Artificial Intelligence

Cite this

@article{be085d50bb9b412e98bad40c6bc81144,
title = "Sparse principal component analysis for high dimensional multivariate time series",
abstract = "We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyzing multivariate time series as if the data were i.i.d. generated. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d ≥ T), we provide explicit rates of convergence of the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper have impact on various applications, including financial time series, biomedical imaging, and social media.",
author = "Zhaoran Wang and Fang Han and Han Liu",
year = "2013",
month = "1",
day = "1",
language = "English (US)",
volume = "31",
pages = "48--56",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "Microtome Publishing",

}

Sparse principal component analysis for high dimensional multivariate time series. / Wang, Zhaoran; Han, Fang; Liu, Han.

In: Journal of Machine Learning Research, Vol. 31, 01.01.2013, p. 48-56.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Sparse principal component analysis for high dimensional multivariate time series

AU - Wang, Zhaoran

AU - Han, Fang

AU - Liu, Han

PY - 2013/1/1

Y1 - 2013/1/1

N2 - We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyzing multivariate time series as if the data were i.i.d. generated. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d ≥ T), we provide explicit rates of convergence of the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper have impact on various applications, including financial time series, biomedical imaging, and social media.

AB - We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyzing multivariate time series as if the data were i.i.d. generated. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d ≥ T), we provide explicit rates of convergence of the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper have impact on various applications, including financial time series, biomedical imaging, and social media.

UR - http://www.scopus.com/inward/record.url?scp=84954233512&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84954233512&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:84954233512

VL - 31

SP - 48

EP - 56

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -