An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
  • Xingguo Li, University of Minnesota - Twin Cities
  • Tuo Zhao, Johns Hopkins University
  • Raman Arora, Johns Hopkins University
  • Han Liu, Princeton University
  • Mingyi Hong, Iowa State University
Document Type
Conference Proceeding
Publication Version
Published Version
Publication Date
1-1-2016
Conference Title
The 19th International Conference on Artificial Intelligence and Statistics
Conference Date
May 9-11, 2016
Geolocation
(36.5270612, -6.288596200000029)
Abstract
The cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD-type methods attain an iteration complexity of O(p · log(1/ε)), where ε is a pre-specified accuracy of the objective value and p is the number of blocks. However, such iteration complexity explicitly depends on p, and is therefore at least p times worse than that of gradient descent (GD) methods. To bridge this theoretical gap, we propose an improved convergence analysis for the CBCD-type methods. In particular, we first show that for a family of quadratic minimization problems, the iteration complexity of the CBCD-type methods matches that of the GD methods in terms of dependency on p (up to a log^2 p factor). Thus our complexity bounds are sharper than the existing bounds by at least a factor of p/log^2 p. We also provide a lower bound to confirm that our improved complexity bounds are tight (up to a log^2 p factor) if the largest and smallest eigenvalues of the Hessian matrix do not scale with p. Finally, we generalize our analysis to other strongly convex minimization problems beyond quadratic ones.
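To illustrate the setting the abstract analyzes, the following is a minimal sketch of cyclic block coordinate descent with exact block minimization on a strongly convex quadratic f(x) = 0.5 xᵀAx - bᵀx. The function name, block partition, and problem instance are illustrative assumptions, not the paper's actual experimental setup; the method cycles through the p coordinate blocks in a fixed order each pass, which is the CBCD scheme the complexity bounds concern.

```python
import numpy as np

def cbcd_quadratic(A, b, num_blocks, num_passes):
    """Cyclic block coordinate descent for f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite (so f is strongly convex).
    Each pass visits the blocks in a fixed cyclic order and exactly
    minimizes f over the current block, holding the others fixed.
    (Illustrative sketch, not the paper's experimental code.)"""
    n = A.shape[0]
    blocks = np.array_split(np.arange(n), num_blocks)
    x = np.zeros(n)
    for _ in range(num_passes):
        for idx in blocks:
            # Block optimality condition: A[idx, idx] x_idx = b[idx] - A[idx, rest] x_rest.
            # Compute the right-hand side by removing this block's own contribution.
            rhs = b[idx] - A[idx, :] @ x + A[np.ix_(idx, idx)] @ x[idx]
            x[idx] = np.linalg.solve(A[np.ix_(idx, idx)], rhs)
    return x

# Illustrative instance: a random well-conditioned SPD quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20.0 * np.eye(20)   # symmetric positive definite
b = rng.standard_normal(20)
x_star = np.linalg.solve(A, b)    # exact minimizer for comparison
x = cbcd_quadratic(A, b, num_blocks=5, num_passes=50)
print(np.linalg.norm(x - x_star))
```

Because f is strongly convex and each block subproblem is solved exactly, the iterates converge linearly to the unique minimizer; the abstract's contribution is showing that the number of such passes needed depends on p only up to a log^2 p factor, matching gradient descent.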
Comments

This proceeding appeared in the Proceedings of the 19th International Conference on Artificial Intelligence and Statistics (2016), p. 491. Posted with permission.

Copyright Owner
The Authors
Language
en
File Format
application/pdf
Citation Information
Xingguo Li, Tuo Zhao, Raman Arora, Han Liu, and Mingyi Hong. "An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization." Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, Cadiz, Spain (2016), pp. 491-499.
Available at: http://works.bepress.com/mingyi_hong/34/