Large-Scale Nonlinear AUC Maximization via Triply Stochastic Gradients
IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Zhiyuan Dang, Xidian University
  • Xiang Li, The University of Western Ontario
  • Bin Gu, Mohamed Bin Zayed University of Artificial Intelligence
  • Cheng Deng, Xidian University
Document Type

Abstract
Learning to improve AUC performance on imbalanced data is an important machine learning research problem. Most AUC maximization methods assume that the model function is linear in the original feature space. However, this assumption is unsuitable for nonlinearly separable problems. Although some nonlinear AUC maximization methods exist, scaling up nonlinear AUC maximization remains an open question. To address this challenging problem, in this paper we propose a novel large-scale nonlinear AUC maximization method (named TSAM) based on triply stochastic gradient descent. Specifically, we first use random Fourier features to approximate the kernel function. We then use the triply stochastic gradients w.r.t. the pairwise loss and the random features to iteratively update the solution. Finally, we prove that TSAM converges to the optimal solution at a rate of O(1/t) after t iterations. Experimental results on a variety of benchmark datasets not only confirm the scalability of TSAM, but also show a significant reduction in computational time compared with existing batch learning algorithms, while retaining similar generalization performance.
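As a rough illustration of the ingredients named in the abstract, the sketch below combines a random Fourier feature map with stochastic pairwise updates on (positive, negative) sample pairs. It is not the paper's algorithm: it fixes one set of random features up front (the triply stochastic scheme also resamples random features at each iteration) and uses a squared pairwise hinge loss with a 1/sqrt(t) step size; all function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def rff_features(X, W, b):
    """Random Fourier feature map approximating an RBF kernel:
    z(x) = sqrt(2/D) * cos(W @ x + b)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

def tsam_sketch(X, y, D=200, sigma=2.0, lr=0.05, iters=2000, seed=0):
    """Simplified sketch (not the paper's TSAM): pairwise SGD for AUC
    maximization in a fixed random Fourier feature space, with a
    squared pairwise hinge surrogate loss."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral samples for an RBF kernel exp(-||x - x'||^2 / (2 sigma^2))
    W = rng.normal(0.0, 1.0 / sigma, size=(D, d))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    w = np.zeros(D)
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == -1)
    for t in range(1, iters + 1):
        i = rng.choice(pos)          # stochastic positive example
        j = rng.choice(neg)          # stochastic negative example
        zi = rff_features(X[i : i + 1], W, b)[0]
        zj = rff_features(X[j : j + 1], W, b)[0]
        margin = w @ (zi - zj)
        if margin < 1.0:             # squared-hinge pairwise loss gradient
            w += (lr / np.sqrt(t)) * 2.0 * (1.0 - margin) * (zi - zj)
    return w, W, b

def pairwise_auc(scores, y):
    """AUC as the fraction of correctly ordered (positive, negative) pairs."""
    p, n = scores[y == 1], scores[y == -1]
    return (p[:, None] > n[None, :]).mean()
```

On well-separated toy data this ranks nearly all positive samples above negative ones; the pairwise formulation is why AUC methods need stochastic *pairs* rather than single examples, which is what motivates the paper's stochastic treatment of the pairwise loss.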

Publication Date

1 March 2022

Keywords
  • AUC maximization,
  • kernel methods,
  • random Fourier features

IR deposit conditions:

  • OA version (accepted version) - pathway a
  • No embargo
  • When accepted for publication, set statement to accompany deposit (see policy)
  • Must link to publisher version with DOI
  • Publisher copyright and source must be acknowledged
Citation Information
Z. Dang, X. Li, B. Gu, C. Deng, and H. Huang, "Large-scale nonlinear AUC maximization via triply stochastic gradients," in IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 44, no. 3, pp. 1385-1398, 1 March 2022, doi: 10.1109/TPAMI.2020.3024987.