Article
Resource Optimized Hierarchical Split Federated Learning for Wireless Networks
ACM International Conference Proceeding Series
  • Latif U. Khan, Mohamed Bin Zayed University of Artificial Intelligence
  • Mohsen Guizani, Mohamed Bin Zayed University of Artificial Intelligence
  • Choong Seon Hong, Kyung Hee University
Document Type
Conference Proceeding
Abstract

Federated learning (FL) trains models in a distributed fashion: devices compute local models (e.g., convolutional neural networks), which are then aggregated centrally at the edge or cloud. Such distributed training demands significant computational resources (i.e., CPU cycles per second) that Internet of Things (IoT) sensors can rarely provide. To address this challenge, split FL (SFL) was recently proposed, in which part of the model is computed at the device and the remainder at edge/cloud servers. Although SFL resolves the devices' computing-resource constraints, it still suffers from fairness issues and slow convergence. To address these limitations, we propose a novel hierarchical SFL (HSFL) architecture that combines SFL with hierarchical learning. To avoid a single point of failure and fairness issues, HSFL is truly distributed in nature (i.e., it uses distributed aggregations). We also define a cost function that is minimized with respect to relative local accuracy, transmit power, resource allocation, and device association. Because the resulting problem is non-convex, we propose a block successive upper-bound minimization (BSUM) based solution. Finally, numerical results are presented.
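The BSUM approach mentioned in the abstract minimizes a non-convex objective block by block, replacing the objective with a tractable upper-bound surrogate for one block of variables at a time. The paper's actual cost function (over local accuracy, transmit power, resource allocation, and association) is not reproduced here; the sketch below is a minimal illustration of the BSUM idea on a hypothetical two-block quadratic objective, where each block's surrogate is the standard Lipschitz-gradient quadratic upper bound.

```python
# Illustrative BSUM sketch (NOT the paper's cost function): alternately
# minimize a quadratic upper bound of f over each variable block.
# Example objective: f(x, y) = (x - 2)^2 + (y + 1)^2 + 0.5*x*y

def grad_x(x, y):
    # Partial derivative of f with respect to block x
    return 2.0 * (x - 2.0) + 0.5 * y

def grad_y(x, y):
    # Partial derivative of f with respect to block y
    return 2.0 * (y + 1.0) + 0.5 * x

def bsum(x=0.0, y=0.0, L=2.0, iters=100):
    # Each block gradient is L-Lipschitz with L = 2, so the surrogate
    # u(z) = f(z_k) + grad * (z - z_k) + (L/2) * (z - z_k)^2 upper-bounds
    # f in that block; minimizing it is a gradient step of size 1/L.
    for _ in range(iters):
        x = x - grad_x(x, y) / L  # minimize surrogate over block x
        y = y - grad_y(x, y) / L  # minimize surrogate over block y
    return x, y
```

For this strongly convex example the block updates converge to the stationary point (x, y) = (2.4, -1.6); for the non-convex problems targeted by the paper, BSUM guarantees convergence only to a stationary point.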

DOI
10.1145/3576914.3590148
Publication Date
5-9-2023
Keywords
  • Federated learning
  • hierarchical federated learning
  • Internet of Things
  • split learning
Citation Information
L. U. Khan, M. Guizani, and C. S. Hong, "Resource Optimized Hierarchical Split Federated Learning for Wireless Networks", in Proceedings of Cyber-Physical Systems and Internet of Things Week 2023 (CPS-IoT Week '23), ACM, pp. 254–259, May 2023. doi:10.1145/3576914.3590148