An Accelerated Variance-Reduced Conditional Gradient Sliding Algorithm For First-Order And Zeroth-Order Optimization
  • Xiyuan Wei, Nanjing University of Information Science & Technology
  • Bin Gu, Nanjing University of Information Science & Technology & Mohamed bin Zayed University of Artificial Intelligence & JD Finance America Corporation, USA
  • Heng Huang, JD Finance America Corporation & University of Pittsburgh
Document Type

The conditional gradient algorithm (also known as the Frank-Wolfe algorithm) has recently regained popularity in the machine learning community due to its projection-free property for solving constrained problems. Although many variants of the conditional gradient algorithm have been proposed to improve performance, they depend on first-order information (gradients) to optimize. Consequently, these algorithms cannot be applied in the increasingly popular setting of zeroth-order optimization, where only zeroth-order information (function values) is available. To fill this gap, we propose a novel Accelerated variance-Reduced Conditional gradient Sliding (ARCS) algorithm for finite-sum problems, which can use either first-order or zeroth-order information to optimize. To the best of our knowledge, ARCS is the first zeroth-order conditional gradient sliding-type algorithm for solving convex problems in zeroth-order optimization. In first-order optimization, the convergence results of ARCS substantially improve on previous algorithms in terms of the number of gradient oracle queries. Finally, we validate the superiority of ARCS through experiments on real-world datasets. © 2021, CC BY.
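The two ingredients the abstract combines — a projection-free conditional gradient (Frank-Wolfe) step and a zeroth-order gradient estimate built from function values alone — can be illustrated with a minimal sketch. This is not the ARCS algorithm itself (it has no variance reduction, acceleration, or gradient sliding); it is a plain zeroth-order Frank-Wolfe loop on a toy constraint set (the ℓ1 ball), with all names and parameters chosen here for illustration:

```python
import numpy as np

def zo_gradient(f, x, rng, mu=1e-4, n_samples=20):
    """Two-point zeroth-order gradient estimate: uses only function values f(.)."""
    d = x.size
    g = np.zeros(d)
    for _ in range(n_samples):
        u = rng.standard_normal(d)
        g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
    return g / n_samples

def lmo_l1_ball(g, radius=1.0):
    """Linear minimization oracle over the l1 ball: argmin_{||s||_1 <= r} <g, s>.
    Solved in closed form by a single vertex, so no projection is ever needed."""
    i = int(np.argmax(np.abs(g)))
    s = np.zeros_like(g)
    s[i] = -radius * np.sign(g[i])
    return s

def zo_frank_wolfe(f, x0, radius=1.0, iters=200, seed=0):
    """Plain zeroth-order Frank-Wolfe; a didactic sketch, not ARCS."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for t in range(iters):
        g = zo_gradient(f, x, rng)          # gradient from function values only
        s = lmo_l1_ball(g, radius)          # cheap linear subproblem
        gamma = 2.0 / (t + 2)               # classic Frank-Wolfe step size
        x = (1.0 - gamma) * x + gamma * s   # convex combination stays feasible
    return x
```

Because each iterate is a convex combination of feasible points, the constraint is maintained exactly without ever projecting; the expensive per-iteration work reduces to one linear minimization, which is the property that makes conditional gradient methods attractive for constrained problems.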

Publication Date

Preprint: arXiv

  • Archived with thanks to arXiv
  • Preprint License: CC BY 4.0
  • Uploaded 24 March 2022
Citation Information
X. Wei, B. Gu, and H. Huang, "An accelerated variance-reduced conditional gradient sliding algorithm for first-order and zeroth-order optimization," 2021, arXiv:2109.08858.