UPTPU: Improving Energy Efficiency of a Tensor Processing Unit through Underutilization Based Power-Gating
Proceedings - Design Automation Conference
  • Pramesh Pandey, Utah State University
  • Noel Daniel Gundi, Utah State University
  • Koushik Chakraborty, Utah State University
  • Sanghamitra Roy, Utah State University
Document Type
Conference Paper
Publisher
Institute of Electrical and Electronics Engineers
Publication Date
December 5, 2021
Funder

National Science Foundation

Abstract

The AI boom is bringing a plethora of domain-specific architectures for neural network computation. Google's Tensor Processing Unit (TPU), a Deep Neural Network (DNN) accelerator, has replaced CPUs/GPUs in its data centers, claiming a more than 15× higher inference rate. However, the unprecedented growth in DNN workloads, driven by the widespread use of AI services, projects increasing energy consumption in TPU-based data centers. In this work, we parametrize the extreme hardware underutilization in the TPU systolic array and propose UPTPU: an intelligent, dataflow-adaptive power-gating paradigm that delivers a staggering 3.5×-6.5× improvement in TPU energy efficiency for different input batch sizes.
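For intuition, the sketch below is an illustrative back-of-the-envelope calculation, not the authors' UPTPU model: it estimates how much of a 256×256 weight-stationary systolic array (the configuration of the original TPU) performs useful work when a layer's dimensions fall short of the array size. The idle remainder is the kind of underutilization a power-gating scheme can exploit. The array dimensions and the row/column mapping are assumptions based on the publicly described TPU v1 design.

```python
# Back-of-the-envelope sketch of systolic-array underutilization (assumption:
# a weight-stationary 256x256 MAC array as in the original TPU, with a
# fully-connected layer's input dimension mapped to rows and its output
# dimension mapped to columns). When the layer does not fill the array,
# entire rows/columns sit idle -- candidates for power-gating.

ARRAY_ROWS = 256   # MAC rows in the assumed systolic array
ARRAY_COLS = 256   # MAC columns

def active_fraction(layer_inputs: int, layer_outputs: int) -> float:
    """Fraction of MACs doing useful work for one tile of a FC layer."""
    used_rows = min(layer_inputs, ARRAY_ROWS)
    used_cols = min(layer_outputs, ARRAY_COLS)
    return (used_rows * used_cols) / (ARRAY_ROWS * ARRAY_COLS)

# Example: a small 64-input, 32-output layer occupies only a 64x32 corner
# of the array, leaving roughly 97% of the MACs idle.
if __name__ == "__main__":
    print(f"active MAC fraction: {active_fraction(64, 32):.2%}")
```

This only captures spatial underutilization from layer dimensions; small input batch sizes additionally reduce how long the occupied portion of the array stays busy, which is why the reported energy-efficiency gains vary with batch size.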

Comments

© 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Citation Information
Pramesh Pandey, Noel Gundi, Sanghamitra Roy, and Koushik Chakraborty, "UPTPU: Improving Energy Efficiency of a Tensor Processing Unit through Underutilization Based Power-Gating," Proceedings of the IEEE/ACM Design Automation Conference (DAC), December 2021, San Francisco, California (Accepted).