FedShuffle: Recipes for Better Use of Local Work in Federated Learning
arXiv
  • Samuel Horváth, Mohamed bin Zayed University of Artificial Intelligence
  • Maziar Sanjabi, Meta AI, United States
  • Lin Xiao, Meta AI, United States
  • Peter Richtárik, KAUST, Saudi Arabia
  • Michael Rabbat, Meta AI, United States
Document Type
Article
Abstract

The practice of applying several local updates before aggregation across clients has been empirically shown to be a successful approach to overcoming the communication bottleneck in Federated Learning (FL). Such methods are usually implemented by having clients perform one or more epochs of local training per round, while randomly reshuffling their finite dataset in each epoch. Data imbalance, where clients have different numbers of local training samples, is ubiquitous in FL applications, resulting in different clients performing different numbers of local updates in each round. In this work, we propose a general recipe, FedShuffle, that better utilizes the local updates in FL, especially in this regime encompassing random reshuffling and heterogeneity. FedShuffle is the first local update method with theoretical convergence guarantees that incorporates random reshuffling, data imbalance, and client sampling, features that are essential in large-scale cross-device FL. We present a comprehensive theoretical analysis of FedShuffle and show, both theoretically and empirically, that it does not suffer from the objective function mismatch that is present in FL methods that assume homogeneous updates in heterogeneous FL setups, such as FedAvg (McMahan et al., 2017). In addition, by combining the ingredients above, FedShuffle improves upon FedNova (Wang et al., 2020), which was previously proposed to solve this mismatch. Similar to Mime (Karimireddy et al., 2020), we show that FedShuffle with momentum variance reduction (Cutkosky & Orabona, 2019) improves upon non-local methods under a Hessian similarity assumption.
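To make the setting concrete, below is a minimal, illustrative sketch (not the paper's FedShuffle algorithm) of the local-update pattern the abstract describes: each client runs a few epochs of SGD with random reshuffling over its own finite dataset, clients hold different numbers of samples and therefore perform different numbers of local updates per round, and the server aggregates the resulting models with data-size weights in a generic FedAvg-style way. All function and variable names are hypothetical; the exact FedShuffle update normalization and client-sampling scheme are given in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_client(n, d=5):
    """Synthetic least-squares client with n samples (n varies across clients: data imbalance)."""
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + 0.1 * rng.normal(size=n)
    return X, y

def local_epochs(w, X, y, lr=0.01, epochs=2, batch=8):
    """Run `epochs` passes of mini-batch SGD, reshuffling the client's data at each epoch."""
    n = len(y)
    w = w.copy()
    for _ in range(epochs):
        perm = rng.permutation(n)                 # random reshuffling each epoch
        for start in range(0, n, batch):
            idx = perm[start:start + batch]
            grad = X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
            w -= lr * grad
    return w

# Clients with unequal dataset sizes -> unequal numbers of local updates per round.
clients = [make_client(n) for n in (20, 50, 200)]
sizes = np.array([len(y) for _, y in clients], dtype=float)
weights = sizes / sizes.sum()                     # data-size aggregation weights

w_global = np.zeros(5)
for _ in range(30):
    # Each client starts from the current global model and performs its local epochs.
    local_models = [local_epochs(w_global, X, y) for X, y in clients]
    # Plain weighted aggregation; FedShuffle's recipe additionally corrects for the
    # differing amounts of local work so that the fixed point matches the true objective.
    w_global = sum(wk * wm for wk, wm in zip(weights, local_models))
```

Under heterogeneous data and unequal local work, the naive aggregation above can converge to a mismatched objective; this is the objective function mismatch that the paper's recipe is designed to avoid.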

DOI
10.48550/arXiv.2204.13169
Publication Date
4-27-2022
Keywords
  • Data clients,
  • Data imbalance,
  • Large-scale,
  • Learning methods,
  • Local training,
  • Non-local methods,
  • Objective functions,
  • Training samples,
  • Variance reduction
Comments

IR deposit conditions: not described

Citation Information
S. Horváth, M. Sanjabi, L. Xiao, P. Richtárik, and M. Rabbat, "FedShuffle: Recipes for Better Use of Local Work in Federated Learning," arXiv preprint, 2022. doi:10.48550/arXiv.2204.13169