Online ICA: Understanding global dynamics of nonconvex optimization via diffusion processes

Chris Junchi Li, Zhaoran Wang, Han Liu

Research output: Contribution to journal › Conference article

3 Citations (Scopus)

Abstract

Solving statistical learning problems often involves nonconvex optimization. Despite the empirical success of nonconvex statistical optimization methods, their global dynamics, especially convergence to the desirable local minima, remain less well understood in theory. In this paper, we propose a new analytic paradigm based on diffusion processes to characterize the global dynamics of nonconvex statistical optimization. As a concrete example, we study stochastic gradient descent (SGD) for the tensor decomposition formulation of independent component analysis (ICA). In particular, we cast the different phases of SGD as diffusion processes, i.e., solutions to stochastic differential equations. Initialized from an unstable equilibrium, the global dynamics of SGD transition through three consecutive phases: (i) an unstable Ornstein-Uhlenbeck process that slowly departs from the initialization, (ii) the solution to an ordinary differential equation that quickly evolves towards the desirable local minimum, and (iii) a stable Ornstein-Uhlenbeck process that oscillates around the desirable local minimum. Our proof techniques are based upon Stroock and Varadhan's weak convergence of Markov chains to diffusion processes; these techniques may be of independent interest.
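
For intuition, the three limiting dynamics named in the abstract can be written as generic stochastic and ordinary differential equations. The forms below are a sketch only: the symbols lambda, sigma, f, and B_t are illustrative placeholders, not the paper's actual drift and diffusion coefficients.

% Phase (i): unstable Ornstein-Uhlenbeck process (drift repels from the saddle)
\[ \mathrm{d}U_t = \lambda U_t \,\mathrm{d}t + \sigma \,\mathrm{d}B_t, \qquad \lambda > 0 \]
% Phase (ii): deterministic gradient-flow ODE (noise averages out at this scale)
\[ \frac{\mathrm{d}u_t}{\mathrm{d}t} = -\nabla f(u_t) \]
% Phase (iii): stable Ornstein-Uhlenbeck process (mean-reverting around the minimum)
\[ \mathrm{d}U_t = -\lambda U_t \,\mathrm{d}t + \sigma \,\mathrm{d}B_t, \qquad \lambda > 0 \]

Here B_t denotes standard Brownian motion; the positive lambda in phase (i) pushes trajectories away from the unstable equilibrium, while the sign flip in phase (iii) makes the process mean-reverting around the local minimum.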
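The following is a minimal simulation sketch of this three-phase picture. It assumes unit-variance Laplace sources with identity mixing (so the observations are already whitened) and projected stochastic gradient ascent on the fourth moment E[(u^T y)^4] over the unit sphere; the update rule, step size, and horizon are illustrative assumptions rather than the paper's exact algorithm or constants.

# Minimal simulation sketch of the three-phase SGD dynamics (assumptions:
# Laplace sources, identity mixing, projected SGD on the fourth moment).
# Illustrative only; not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)
d = 10            # ambient dimension / number of sources
eta = 1e-3        # SGD step size
n_steps = 200_000

# Initialize at an unstable equilibrium (a saddle of the population
# objective): equal weight on the first two coordinates.
u = np.zeros(d)
u[0] = u[1] = 1.0 / np.sqrt(2.0)

for k in range(n_steps):
    # One whitened observation; Laplace(0, 1/sqrt(2)) has unit variance
    # and positive excess kurtosis, so coordinate axes are the optima.
    y = rng.laplace(scale=1.0 / np.sqrt(2.0), size=d)
    u = u + eta * (u @ y) ** 3 * y       # stochastic gradient step
    u = u / np.linalg.norm(u)            # project back onto the sphere

# After phase (iii), u oscillates around a signed coordinate axis, so one
# entry dominates; expect a value close to 1.
print("largest |u_i| after SGD:", np.max(np.abs(u)))

Run from the saddle-point initialization above, the iterate first lingers near the saddle (phase (i)), then moves quickly toward a coordinate axis (phase (ii)), and finally fluctuates around it (phase (iii)).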

Original language: English (US)
Pages (from-to): 4967-4975
Number of pages: 9
Journal: Advances in Neural Information Processing Systems
State: Published - Jan 1 2016
Event: 30th Annual Conference on Neural Information Processing Systems, NIPS 2016 - Barcelona, Spain
Duration: Dec 5 2016 - Dec 10 2016

Fingerprint

  • Independent component analysis
  • Ordinary differential equations
  • Markov processes
  • Tensors
  • Differential equations
  • Decomposition

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
