Privacy-Preserving Asynchronous Vertical Federated Learning Algorithms for Multiparty Collaborative Learning
IEEE Transactions on Neural Networks and Learning Systems
  • Bin Gu, Mohamed bin Zayed University of Artificial Intelligence
  • An Xu, University of Pittsburgh
  • Zhouyuan Huo, University of Pittsburgh
  • Cheng Deng, Xidian University
  • Heng Huang, University of Pittsburgh
Document Type
Article
Abstract

Privacy-preserving federated learning on vertically partitioned (VP) data has shown promising results as a solution to emerging multiparty joint-modeling applications, in which the data holders (such as government branches, private finance companies, and e-business companies) collaborate throughout the learning process rather than relying on a trusted third party to hold the data. However, most existing federated learning algorithms for VP data are limited to synchronous computation. To improve efficiency when unbalanced computation/communication resources are common among the parties in a federated learning system, it is essential to develop asynchronous training algorithms for VP data that still preserve data privacy. In this article, we propose an asynchronous federated stochastic gradient descent (AFSGD-VP) algorithm and its two variance-reduction variants, based on stochastic variance reduced gradient (SVRG) and SAGA, for VP data. Moreover, we provide convergence analyses of AFSGD-VP and its SVRG and SAGA variants under the condition of strong convexity and without any restriction on staleness. We also discuss their model privacy, data privacy, computational complexities, and communication costs. To the best of our knowledge, AFSGD-VP and its SVRG and SAGA variants are the first asynchronous federated learning algorithms for VP data with theoretical guarantees. Extensive experimental results on a variety of VP datasets not only verify the theoretical results of AFSGD-VP and its SVRG and SAGA variants but also show that our algorithms have much higher efficiency than the corresponding synchronous algorithms.
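
To illustrate the general setting the abstract describes, the sketch below shows stochastic gradient descent on vertically partitioned data: each party holds a disjoint block of every sample's features and its own parameter block, exchanges only scalar partial predictions, and updates its block locally. This is a minimal, synchronous illustration of the VP-SGD pattern, not the paper's AFSGD-VP protocol; the class and function names, the squared loss, and the plain summation of partial scores (in place of the paper's asynchronous, privacy-preserving aggregation) are all assumptions made for readability.

```python
import numpy as np

class Party:
    """One data holder: a feature block and the matching parameter block."""

    def __init__(self, features):           # features: (n_samples, d_q) block
        self.X = features
        self.w = np.zeros(features.shape[1])

    def local_score(self, i):
        # Partial prediction on sample i from this party's feature block only.
        return float(self.X[i] @ self.w)

    def local_update(self, i, grad_scalar, lr=0.1, lam=1e-3):
        # Gradient of the (regularized squared) loss w.r.t. this party's block.
        g = grad_scalar * self.X[i] + lam * self.w
        self.w -= lr * g


def train(parties, y, epochs=10, rng=np.random.default_rng(0)):
    n = len(y)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # AFSGD-VP aggregates partial scores asynchronously and in a
            # privacy-preserving way; here we simply sum them for clarity.
            pred = sum(p.local_score(i) for p in parties)
            grad_scalar = pred - y[i]        # derivative of squared loss
            for p in parties:                # each party updates only its block
                p.local_update(i, grad_scalar)


# Usage: split a toy dataset column-wise between two hypothetical parties.
X = np.random.randn(100, 6)
y = X @ np.ones(6)
parties = [Party(X[:, :3]), Party(X[:, 3:])]
train(parties, y)
```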

DOI
10.1109/TNNLS.2021.3072238
Publication Date
1-1-2021
Keywords
  • Asynchronous distributed computation
  • Collaborative work
  • Convergence
  • Data models
  • Data privacy
  • Distributed databases
  • Partitioning algorithms
  • Privacy-preserving
  • Stochastic gradient descent (SGD)
  • Stochastic processes
  • Vertical federated learning
Comments

IR deposit conditions:

  • OA version (accepted version) - pathway a
  • No embargo
  • When accepted for publication, set statement to accompany deposit (see policy)
  • Must link to publisher version with DOI
  • Publisher copyright and source must be acknowledged
Citation Information
B. Gu, A. Xu, Z. Huo, C. Deng, and H. Huang, "Privacy-preserving asynchronous vertical federated learning algorithms for multiparty collaborative learning," in IEEE Transactions on Neural Networks and Learning Systems, pp. 1-13, 2021, doi: 10.1109/TNNLS.2021.3072238.