Article
A Damped Newton Method Achieves Global O(1/K2) and Local Quadratic Convergence Rate
Advances in Neural Information Processing Systems
  • Slavomír Hanzely, Mohamed Bin Zayed University of Artificial Intelligence
  • Dmitry Kamzolov, Mohamed Bin Zayed University of Artificial Intelligence
  • Dmitry Pasechnyuk, Mohamed Bin Zayed University of Artificial Intelligence
  • Alexander Gasnikov, Moscow Institute of Physics and Technology
  • Peter Richtárik, King Abdullah University of Science and Technology
  • Martin Takáč, Mohamed Bin Zayed University of Artificial Intelligence
Document Type
Conference Proceeding
Abstract

In this paper, we present the first stepsize schedule for the Newton method that yields fast global and local convergence guarantees. In particular, a) we prove a global O(1/k²) rate, which matches the state-of-the-art global rate of the cubically regularized Newton method of Polyak and Nesterov (2006) and of the regularized Newton methods of Mishchenko (2021) and Doikov and Nesterov (2021); b) we prove a local quadratic rate, which matches the best-known local rate of second-order methods; and c) our stepsize formula is simple, explicit, and does not require solving any subproblem. Our convergence proofs hold under affine-invariance assumptions closely related to the notion of self-concordance. Finally, our method has competitive performance compared to existing baselines that share the same fast global convergence guarantees.
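To illustrate the general template of a damped Newton method with an explicit, subproblem-free stepsize, the sketch below uses the classical self-concordant damping rule α = 1/(1 + λ(x)), where λ(x) is the Newton decrement. This is *not* the paper's own schedule (which differs in the exact formula) but shares its key property: the stepsize is computed in closed form from the gradient and Hessian, damping steps far from the optimum and approaching full Newton steps near it.

```python
import numpy as np

def damped_newton(grad, hess, x0, tol=1e-10, max_iter=100):
    """Classical damped Newton iteration for self-concordant objectives.

    Stepsize alpha = 1/(1 + lambda), with lambda the Newton decrement
    sqrt(g^T H^{-1} g). Illustrative only; the paper proposes a different
    explicit schedule with stronger global guarantees.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        H = hess(x)
        direction = np.linalg.solve(H, g)   # Newton direction H^{-1} g
        lam = np.sqrt(g @ direction)        # Newton decrement lambda(x)
        if lam < tol:
            break
        alpha = 1.0 / (1.0 + lam)           # explicit damping, no subproblem
        x = x - alpha * direction           # damped Newton step
    return x

# Minimize f(x) = sum(x_i - log x_i), a self-concordant function
# whose unique minimizer is x = (1, ..., 1).
grad = lambda x: 1.0 - 1.0 / x
hess = lambda x: np.diag(1.0 / x**2)
x_star = damped_newton(grad, hess, x0=[3.0, 0.5])
```

The damping keeps the iterate inside the Dikin ellipsoid, which for self-concordant functions guarantees the step stays in the domain (here, the positive orthant) without any line search.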

Publication Date
12-1-2022
Keywords
  • Damped Newton's methods,
  • Global convergence,
  • Local Convergence,
  • Local quadratic convergence,
  • Newton's methods,
  • Quadratic convergence rates,
  • Quadratic rates,
  • Regularized Newton method,
  • State of the art,
  • Step size
Comments

Open Access version available on NeurIPS Proceedings

Citation Information
Slavomír Hanzely, Dmitry Kamzolov, Dmitry Pasechnyuk, Alexander Gasnikov, et al. "A Damped Newton Method Achieves Global O(1/K2) and Local Quadratic Convergence Rate." Advances in Neural Information Processing Systems Vol. 35 (2022). ISSN: 1049-5258
Available at: http://works.bepress.com/martin-takac/27/