An improved convergence analysis of cyclic block coordinate descent-type methods for strongly convex minimization

Xingguo Li, Tuo Zhao, Raman Arora, Han Liu, Mingyi Hong

Research output: Contribution to conference › Paper

6 Scopus citations

Abstract

The cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD-type methods attain an iteration complexity of O(p · log(1/ϵ)), where ϵ is a pre-specified accuracy of the objective value, and p is the number of blocks. However, such iteration complexity explicitly depends on p, and therefore is at least p times worse than that of gradient descent (GD) methods. To bridge this theoretical gap, we propose an improved convergence analysis for the CBCD-type methods. In particular, we first show that for a family of quadratic minimization problems, the iteration complexity of the CBCD-type methods matches that of the GD methods in terms of dependency on p (up to a log^2 p factor). Thus our complexity bounds are sharper than the existing bounds by at least a factor of p/log^2 p. We also provide a lower bound to confirm that our improved complexity bounds are tight (up to a log^2 p factor) if the largest and smallest eigenvalues of the Hessian matrix do not scale with p. Finally, we generalize our analysis to other strongly convex minimization problems beyond quadratic ones.
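To illustrate the setting the abstract describes, here is a minimal sketch of cyclic block coordinate descent applied to a strongly convex quadratic f(x) = ½ xᵀAx − bᵀx with A symmetric positive definite. The block size, the exact per-block minimization update, and the fixed number of sweeps are illustrative choices for this sketch, not the paper's exact algorithm or analysis.

```python
import numpy as np

def cbcd_quadratic(A, b, block_size=2, n_sweeps=500):
    """Cyclic block coordinate descent for f(x) = 0.5 x^T A x - b^T x.

    A must be symmetric positive definite so f is strongly convex.
    Each inner step exactly minimizes f over one coordinate block
    while holding the remaining coordinates fixed.
    """
    p_dim = A.shape[0]
    x = np.zeros(p_dim)
    # Partition coordinates {0, ..., p_dim - 1} into consecutive blocks.
    blocks = [np.arange(i, min(i + block_size, p_dim))
              for i in range(0, p_dim, block_size)]
    for _ in range(n_sweeps):
        for idx in blocks:  # one cyclic pass over all blocks
            # Block optimality condition:
            #   A[idx, idx] x_idx = b[idx] - A[idx, rest] x_rest
            rhs = b[idx] - A[idx].dot(x) + A[np.ix_(idx, idx)].dot(x[idx])
            x[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M.T @ M + np.eye(6)            # symmetric positive definite Hessian
b = rng.standard_normal(6)
x_cbcd = cbcd_quadratic(A, b)
x_star = np.linalg.solve(A, b)     # closed-form minimizer of f
print(np.allclose(x_cbcd, x_star, atol=1e-6))
```

With exact block minimization on a quadratic, this procedure is block Gauss-Seidel on the linear system Ax = b, which converges for any symmetric positive definite A; the paper's contribution concerns how the number of such sweeps scales with the number of blocks p.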

Original language: English (US)
Pages: 491-499
Number of pages: 9
State: Published - Jan 1 2016
Event: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 - Cadiz, Spain
Duration: May 9 2016 - May 11 2016

Conference

Conference: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016
Country: Spain
City: Cadiz
Period: 5/9/16 - 5/11/16

ASJC Scopus subject areas

  • Artificial Intelligence
  • Statistics and Probability


Cite this

    Li, X., Zhao, T., Arora, R., Liu, H., & Hong, M. (2016). An improved convergence analysis of cyclic block coordinate descent-type methods for strongly convex minimization. 491-499. Paper presented at 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016, Cadiz, Spain.