In this paper, we present the first stepsize schedule for Newton's method that results in fast global and local convergence guarantees. In particular, a) we prove an O(1/k²) global rate, which matches the state-of-the-art global rate of the cubically regularized Newton method of Polyak and Nesterov (2006) and of the regularized Newton methods of Mishchenko (2021) and Doikov and Nesterov (2021); b) we prove a local quadratic rate, which matches the best-known local rate of second-order methods; and c) our stepsize formula is simple, explicit, and does not require solving any subproblem. Our convergence proofs hold under affine-invariance assumptions closely related to the notion of self-concordance. Finally, our method has competitive performance when compared to existing baselines that share the same fast global convergence guarantees.
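To make the structure of such a method concrete, the sketch below shows a generic damped Newton iteration with an explicit, subproblem-free stepsize. The damping formula, the constant L, and the helper names are illustrative assumptions for this sketch only; they are not the specific schedule proposed in the paper.

```python
import numpy as np

def damped_newton(grad, hess, x0, L=1.0, tol=1e-10, max_iter=100):
    """Damped Newton's method with an explicit (closed-form) stepsize.

    The damping rule below, based on the local dual norm of the gradient
    lam = sqrt(g^T H^{-1} g), is an illustrative placeholder, not the
    schedule from the paper.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        H = hess(x)
        d = np.linalg.solve(H, g)      # Newton direction H^{-1} g
        lam = np.sqrt(g @ d)           # gradient norm in the local Hessian metric
        if lam < tol:
            break
        # Explicit stepsize: short steps far from the solution (global phase),
        # (near-)full Newton steps once lam is small (local quadratic phase).
        alpha = min(1.0, 1.0 / (1.0 + L * lam))
        x = x - alpha * d
    return x

# Minimal usage example on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -2.0])
x_star = damped_newton(lambda x: A @ x - b, lambda x: A, np.zeros(2))
print(x_star)  # should approach the solution of A x = b
```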
- Damped Newton's methods,
- Global convergence,
- Local convergence,
- Local quadratic convergence,
- Newton's methods,
- Quadratic convergence rates,
- Quadratic rates,
- Regularized Newton method,
- State of the art,
- Step size
Available at: http://works.bepress.com/martin-takac/27/
Open Access version available on NeurIPS Proceedings