Scaling Up Generalized Kernel Methods
IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Bin Gu, Nanjing University of Information Science & Technology and Mohamed bin Zayed University of Artificial Intelligence
  • Zhiyuan Dang, Xidian University, Xi'an, China
  • Zhouyuan Huo, University of Pittsburgh
  • Cheng Deng, Xidian University, Xi'an, China
  • Heng Huang, University of Pittsburgh
Document Type
Article
Abstract

Kernel methods have achieved tremendous success over the past two decades. In the current big data era, however, data collection has grown dramatically, and existing kernel methods do not scale well at either the training or the prediction step. To address this challenge, we first introduce a general sparse kernel learning formulation based on the random feature approximation, where the loss functions are possibly non-convex. To reduce the number of random features required in practice, we also instantiate this formulation with the orthogonal random feature approximation. We then propose a new asynchronous parallel doubly stochastic algorithm for large-scale sparse kernel learning (AsyDSSKL). To the best of our knowledge, AsyDSSKL is the first algorithm to combine asynchronous parallel computation with doubly stochastic optimization. We also provide a comprehensive convergence guarantee for AsyDSSKL. Importantly, experimental results on various large-scale real-world datasets show that AsyDSSKL is significantly more computationally efficient at both the training and prediction steps than existing kernel methods.
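The abstract's key ingredients, orthogonal random feature maps plus a doubly stochastic update that samples both examples and feature coordinates, can be sketched compactly. The following is a minimal single-threaded Python sketch, not the authors' implementation: the function names (orthogonal_random_features, asydsskl_sketch), the squared loss, and all hyperparameter values are illustrative assumptions, and the asynchronous parallelism that gives AsyDSSKL its name is omitted.

import numpy as np

def orthogonal_random_features(d, n_features, gamma, rng):
    """Orthogonal random features (Yu et al., 2016) for the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2): orthogonalizing the Gaussian
    projection lowers the variance of the kernel estimate, so fewer
    random features are needed for the same approximation quality."""
    blocks = []
    for _ in range(-(-n_features // d)):          # ceil(n_features / d)
        G = rng.normal(size=(d, d))
        Q, _ = np.linalg.qr(G)                    # orthonormal columns
        S = np.sqrt(rng.chisquare(d, size=d))     # chi-distributed lengths
        blocks.append(Q * S)                      # restore Gaussian norms
    W = np.hstack(blocks)[:, :n_features]
    return np.sqrt(2.0 * gamma) * W

def asydsskl_sketch(X, y, n_features=2048, gamma=0.1, lam=1e-4,
                    step=0.1, batch=32, block=64, n_iter=2000, seed=0):
    """Serial sketch of the doubly stochastic update: each iteration
    samples BOTH a mini-batch of examples (stochastic gradient) AND a
    block of feature coordinates (coordinate descent), updating only
    that block. A squared loss stands in for the paper's general,
    possibly non-convex losses; the asynchronous threads are omitted."""
    rng = np.random.default_rng(seed)
    W = orthogonal_random_features(X.shape[1], n_features, gamma, rng)
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    Z = np.sqrt(2.0 / n_features) * np.cos(X @ W + b)   # feature map
    w = np.zeros(n_features)
    for _ in range(n_iter):
        i = rng.choice(len(X), size=batch, replace=False)       # examples
        j = rng.choice(n_features, size=block, replace=False)   # coordinates
        resid = Z[i] @ w - y[i]                                 # batch error
        grad_j = Z[i][:, j].T @ resid / batch + lam * w[j]
        w[j] -= step * grad_j                                   # block update
    return w, (W, b)

Prediction then costs only the feature map plus a dot product (np.sqrt(2.0 / n_features) * np.cos(X_test @ W + b) @ w), which is where the claimed scalability at the prediction step comes from.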

DOI
10.1109/TPAMI.2021.3059702
Publication Date
2-16-2021
Keywords
  • asynchronous parallel computation
  • coordinate descent
  • kernel method
  • random feature
  • stochastic gradient descent
Comments

IR deposit conditions:

OA (accepted version) - pathway a

  • No embargo
  • When accepted for publication, set statement to accompany deposit (see policy)
  • Must link to publisher version with DOI
  • Publisher copyright and source must be acknowledged
Citation Information
B. Gu, Z. Dang, Z. Huo, C. Deng and H. Huang, "Scaling Up Generalized Kernel Methods," in IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, doi: 10.1109/TPAMI.2021.3059702.