Second-order methods for L1 regularized problems in machine learning

Samantha Hansen*, Jorge Nocedal

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

This paper proposes a Hessian-free Newton method for minimizing large-scale convex objective functions with an L1 regularization term. Such problems arise in supervised machine learning models in which a sparse parameter vector is sought. The proposed method operates in a batch setting, which is well suited for parallel computing environments, and employs sub-sampled Hessian information to accelerate the progress of the iteration. The method consists of two phases: an active-set prediction phase that employs first-order and second-order information, and a subspace phase that performs a Newton-like step. Numerical results on a speech recognition problem illustrate the practical behavior of the method.
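The two-phase structure described in the abstract can be illustrated in code. The following is a minimal sketch, not the authors' implementation: it assumes L1-regularized logistic regression, uses an iterative-shrinkage (soft-thresholding) step to predict the active set, and then takes a Hessian-free Newton-like step on the free variables via conjugate gradients, with Hessian-vector products computed on a random subsample. All function names (`two_phase_l1_newton`, `cg_solve`, etc.) and parameter choices are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (the iterative-shrinkage step)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def logistic_grad(w, X, y):
    # Gradient of the average logistic loss; labels y are in {-1, +1}
    m = y * (X @ w)
    return X.T @ (-y / (1.0 + np.exp(m))) / len(y)

def subsampled_hessian_vec(w, X_sub, v):
    # Matrix-free Hessian-vector product of the logistic loss on a
    # subsample: H v = X^T diag(p(1-p)) X v / n_sub
    p = 1.0 / (1.0 + np.exp(-(X_sub @ w)))
    return X_sub.T @ ((p * (1.0 - p)) * (X_sub @ v)) / X_sub.shape[0]

def cg_solve(hv, g, iters=10, tol=1e-6):
    # Conjugate gradients for H p = -g; hv supplies Hessian-vector products,
    # so the Hessian is never formed explicitly (the "Hessian-free" idea)
    p = np.zeros_like(g)
    r = -g.copy()
    d = r.copy()
    rs = r @ r
    for _ in range(iters):
        Hd = hv(d)
        alpha = rs / max(d @ Hd, 1e-12)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

def two_phase_l1_newton(X, y, lam=0.1, step=1.0, outer_iters=20,
                        subsample=0.3, seed=0):
    # Hypothetical two-phase scheme: a shrinkage step predicts the active
    # set, then a sub-sampled Newton step refines the free variables
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    w = np.zeros(dim)
    for _ in range(outer_iters):
        g = logistic_grad(w, X, y)
        # Phase 1: iterative-shrinkage step; zeros predict the active set
        w_trial = soft_threshold(w - step * g, step * lam)
        free = w_trial != 0.0
        if not free.any():
            w = w_trial
            continue
        # Phase 2: Newton-like step on the free subspace; the L1 term is
        # smooth there, contributing lam * sign(w) to the gradient
        idx = rng.choice(n, size=max(1, int(subsample * n)), replace=False)
        X_sub = X[idx]
        g_free = g[free] + lam * np.sign(w_trial[free])

        def hv(v_free):
            v = np.zeros(dim)
            v[free] = v_free
            return subsampled_hessian_vec(w, X_sub, v)[free]

        p_free = cg_solve(hv, g_free)
        w = w_trial.copy()
        w[free] += p_free  # a line search would normally guard this step
    return w
```

A small synthetic usage example, with a planted 5-sparse ground truth:

```python
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 50))
w_true = np.concatenate([rng.standard_normal(5), np.zeros(45)])
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))
w = two_phase_l1_newton(X, y, lam=0.05)
print("nonzero coefficients:", np.count_nonzero(w))
```

The key design point mirrored here is that conjugate gradients needs only Hessian-vector products, so second-order information can be exploited without ever forming the Hessian, and sub-sampling further cuts the cost of each product in the batch setting.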

Original language: English (US)
Title of host publication: 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 - Proceedings
Pages: 5237-5240
Number of pages: 4
DOIs
State: Published - Oct 23, 2012
Event: 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 - Kyoto, Japan
Duration: Mar 25, 2012 - Mar 30, 2012

Other

Other: 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012
Country: Japan
City: Kyoto
Period: 3/25/12 - 3/30/12

Keywords

  • Hessian-Free Newton
  • Iterative Shrinkage
  • L1 Regularization
  • Logistic Regression
  • Newton Method

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
