AMP: Automatically Finding Model Parallel Strategies with Heterogeneity Awareness
Advances in Neural Information Processing Systems
  • Dacheng Li, Carnegie Mellon University
  • Hongyi Wang, Carnegie Mellon University
  • Eric Xing, Carnegie Mellon University & Mohamed bin Zayed University of Artificial Intelligence
  • Hao Zhang, University of California, Berkeley
Document Type
Conference Proceeding
Abstract

Scaling up model sizes can lead to fundamentally new capabilities in many machine learning (ML) tasks. However, training big models requires strong distributed system expertise to carefully design model-parallel execution strategies that suit the model architectures and cluster setups. In this paper, we develop AMP, a framework that automatically derives such strategies. AMP identifies a valid space of model parallelism strategies and efficiently searches the space for high-performing strategies by leveraging a cost model designed to capture the heterogeneity of the model and cluster specifications. Unlike existing methods, AMP is specifically tailored to support complex models composed of uneven layers and cluster setups with more heterogeneous accelerators and bandwidth. We evaluate AMP on popular models and cluster setups from public clouds and show that AMP returns parallel strategies that match the expert-tuned strategies on typical cluster setups. On heterogeneous clusters or models with heterogeneous architectures, AMP finds strategies with 1.54× and 1.77× higher throughput than state-of-the-art model-parallel systems, respectively.
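
To make the idea concrete for readers browsing this record, the Python sketch below illustrates how a cost model can be used to rank candidate (data, tensor, pipeline) parallel configurations. It is a minimal, hypothetical illustration, not AMP's actual algorithm or cost model: for brevity it assumes a homogeneous cluster and a flat per-iteration cost, whereas AMP's cost model additionally captures heterogeneous accelerators, bandwidth, and uneven layers. All class names, cost terms, and numbers here are made up for illustration.

    from dataclasses import dataclass
    from itertools import product

    @dataclass
    class Cluster:
        """Hypothetical cluster description (homogeneous for simplicity)."""
        num_gpus: int
        gpu_tflops: float        # per-GPU compute speed, TFLOPS
        bandwidth_gbps: float    # interconnect bandwidth, GB/s

    @dataclass
    class Model:
        """Hypothetical model description for one training iteration."""
        flops_per_iter: float    # total compute, TFLOPs
        param_size_gb: float     # parameter/gradient volume, GB
        activation_size_gb: float  # activation volume crossing partitions, GB

    def estimate_iter_time(dp, tp, pp, model, cluster):
        """Toy cost model: compute time plus simple communication terms.
        Illustration only; not the cost model from the paper."""
        compute = model.flops_per_iter / (dp * tp * pp * cluster.gpu_tflops)
        # Data parallelism: ring all-reduce of gradients across dp replicas.
        allreduce = 2 * (dp - 1) / dp * model.param_size_gb / cluster.bandwidth_gbps
        # Tensor parallelism: activation all-reduces within each layer group.
        tensor_comm = 2 * (tp - 1) / tp * model.activation_size_gb / cluster.bandwidth_gbps
        # Pipeline parallelism: activation transfers between consecutive stages.
        pipeline_comm = (pp - 1) * model.activation_size_gb / (pp * cluster.bandwidth_gbps)
        return compute + allreduce + tensor_comm + pipeline_comm

    def search_strategies(model, cluster):
        """Enumerate (data, tensor, pipeline) degrees that use all GPUs; return the cheapest."""
        best = None
        for dp, tp, pp in product(range(1, cluster.num_gpus + 1), repeat=3):
            if dp * tp * pp != cluster.num_gpus:
                continue
            cost = estimate_iter_time(dp, tp, pp, model, cluster)
            if best is None or cost < best[0]:
                best = (cost, (dp, tp, pp))
        return best

    if __name__ == "__main__":
        cluster = Cluster(num_gpus=8, gpu_tflops=100.0, bandwidth_gbps=25.0)
        model = Model(flops_per_iter=5000.0, param_size_gb=10.0, activation_size_gb=2.0)
        cost, (dp, tp, pp) = search_strategies(model, cluster)
        print(f"best: data={dp}, tensor={tp}, pipeline={pp}, est. time {cost:.2f}s per iteration")

In this toy setting the search simply enumerates factorizations of the GPU count and picks the configuration with the lowest estimated iteration time; the paper's contribution is a cost model and search procedure that remain accurate when the layers, accelerators, and links are not uniform.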

Publication Date
12-1-2022
Keywords
  • Design models
  • Distributed systems
  • Execution strategies
  • Learning tasks
  • Machine learning
  • Model size
  • Model architecture
  • Parallel execution
  • Parallel strategies
  • Scaling up
Comments

Open Access version available on NeurIPS Proceedings

Citation Information
D. Li, H. Wang, E. Xing, and H. Zhang, "AMP: Automatically Finding Model Parallel Strategies with Heterogeneity Awareness", in 36th Conference on Neural Information Processing Systems (NeurIPS 2022), New Orleans, 2022.