An investigation of Newton-Sketch and subsampled Newton methods

Albert S. Berahas, Raghu Bollapragada, Jorge Nocedal*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

31 Scopus citations

Abstract

Sketching, a dimensionality reduction technique, has received much attention in the statistics community. In this paper, we study sketching in the context of Newton's method for solving finite-sum optimization problems in which the number of variables and data points are both large. We study two forms of sketching that perform dimensionality reduction in data space: Hessian subsampling and randomized Hadamard transformations. Each has its own advantages, and their relative tradeoffs have not been investigated in the optimization literature. Our study focuses on practical versions of the two methods in which the resulting linear systems of equations are solved approximately, at every iteration, using an iterative solver. The advantages of using the conjugate gradient method vs. a stochastic gradient iteration are revealed through a set of numerical experiments, and a complexity analysis of the Hessian subsampling method is presented.
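To make the idea concrete, the following is a minimal sketch of one of the two methods studied: a subsampled Newton iteration in which the Hessian is estimated on a random subsample of the data and the resulting linear system is solved approximately with the conjugate gradient method. The logistic-regression objective, the synthetic data, the subsample size, and the fixed unit step are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sigmoid(t):
    # Numerically stable logistic function.
    return 0.5 * (1.0 + np.tanh(0.5 * t))

def logistic_loss_grad(w, X, y):
    # Average logistic loss and its gradient over the FULL dataset.
    z = X @ w
    loss = np.mean(np.logaddexp(0.0, -y * z))
    g = X.T @ (-y * sigmoid(-y * z)) / len(y)
    return loss, g

def subsampled_hessian_vec(w, X, idx):
    # Hessian-vector products estimated only on the subsample `idx`
    # (this is the Hessian subsampling step).
    Xs = X[idx]
    s = sigmoid(Xs @ w)
    d = s * (1.0 - s)  # second derivative of the logistic loss
    return lambda v: Xs.T @ (d * (Xs @ v)) / len(idx)

def cg(hv, b, tol=1e-8, max_iter=100):
    # Plain conjugate gradient for the SPD system H p = b,
    # using only Hessian-vector products.
    x = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = hv(p)
        alpha = rs / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Synthetic binary classification data (illustrative values).
rng = np.random.default_rng(0)
n, dim, m = 2000, 50, 400  # data points, variables, Hessian subsample size
X = rng.standard_normal((n, dim))
w_true = rng.standard_normal(dim)
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(n))

w = np.zeros(dim)
loss0, _ = logistic_loss_grad(w, X, y)
for _ in range(5):
    _, g = logistic_loss_grad(w, X, y)          # full gradient
    idx = rng.choice(n, size=m, replace=False)   # fresh subsample each iteration
    hv = subsampled_hessian_vec(w, X, idx)
    p = cg(hv, g)                                # inexact Newton direction
    w -= p  # unit step for simplicity; in practice a line search is used
loss_final, _ = logistic_loss_grad(w, X, y)
```

Because CG needs only Hessian-vector products, the subsampled Hessian is never formed explicitly; each product costs O(m·dim) instead of the O(n·dim²) required to assemble the full Hessian.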

Original language: English (US)
Pages (from-to): 661-680
Number of pages: 20
Journal: Optimization Methods and Software
Volume: 35
Issue number: 4
DOIs
State: Published - Jul 3 2020

Keywords

  • Newton's method
  • Sketching
  • machine learning
  • stochastic optimization
  • subsampling

ASJC Scopus subject areas

  • Software
  • Control and Optimization
  • Applied Mathematics

