Nonuniform-to-uniform quantization: Towards accurate quantization via generalized straight-through estimation
arXiv
  • Zechun Liu, Hong Kong University of Science and Technology, Hong Kong & Carnegie Mellon University, United States
  • Kwang-Ting Cheng, Hong Kong University of Science and Technology, Hong Kong
  • Dong Huang, Carnegie Mellon University, United States
  • Eric P. Xing, Carnegie Mellon University, United States & Mohamed bin Zayed University of Artificial Intelligence
  • Zhiqiang Shen, Carnegie Mellon University, United States & Mohamed bin Zayed University of Artificial Intelligence
Document Type
Article
Abstract

The nonuniform quantization strategy for compressing neural networks usually achieves better performance than its uniform counterpart, owing to its superior representational capacity. However, many nonuniform quantization methods overlook the complicated projection process required to implement nonuniformly quantized weights/activations, which incurs non-negligible time and space overhead in hardware deployment. In this study, we propose Nonuniform-to-Uniform Quantization (N2UQ), a method that maintains the strong representational ability of nonuniform methods while being as hardware-friendly and efficient as uniform quantization for model inference. We achieve this by learning flexible, non-equidistant input thresholds to better fit the underlying distribution while quantizing these real-valued inputs into equidistant output levels. To train the quantized network with learnable input thresholds, we introduce a generalized straight-through estimator (G-STE) for the otherwise intractable backward derivative calculation w.r.t. the threshold parameters. In addition, we apply entropy-preserving regularization to further reduce information loss in weight quantization. Even under the adverse constraint of imposing uniformly quantized weights and activations, N2UQ outperforms state-of-the-art nonuniform quantization methods by 0.7∼1.8% on ImageNet, demonstrating the contribution of the N2UQ design. Code will be made publicly available. Copyright © 2021, The Authors. All rights reserved.
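Since the abstract compresses the core mechanism into two sentences, a rough illustration may help. Below is a minimal PyTorch sketch of a quantizer with learnable, non-equidistant input thresholds that emits equidistant integer output levels, which is what keeps inference uniform and hardware-friendly. The backward pass uses a plain clipped-identity straight-through surrogate plus a windowed approximation for the threshold gradients; it stands in for the paper's actual G-STE derivation, and the class name, window width, and shapes are illustrative assumptions, not the authors' implementation.

```python
import torch


class N2UQuantizerSketch(torch.autograd.Function):
    """Illustrative only: non-equidistant thresholds in, uniform levels out."""

    @staticmethod
    def forward(ctx, x, thresholds):
        # thresholds: sorted 1-D tensor with 2^b - 1 learnable cut points.
        # The output level is the number of thresholds the input exceeds,
        # i.e., an integer on an equidistant grid in [0, 2^b - 1].
        ctx.save_for_backward(x, thresholds)
        return (x.unsqueeze(-1) > thresholds).sum(dim=-1).to(x.dtype)

    @staticmethod
    def backward(ctx, grad_out):
        x, thresholds = ctx.saved_tensors
        # Surrogate w.r.t. x: clipped-identity STE -- pass gradients only
        # where the input falls inside the quantization range.
        in_range = (x > thresholds[0]) & (x < thresholds[-1])
        grad_x = grad_out * in_range.to(x.dtype)
        # Surrogate w.r.t. each threshold t_i: raising t_i lowers the output
        # by 1 for inputs near t_i, so d(out)/d(t_i) ~ -delta(x - t_i);
        # approximate the delta with a small window (width is illustrative).
        width = 0.1
        near = (x.unsqueeze(-1) - thresholds).abs() < width / 2
        grad_t = -(grad_out.unsqueeze(-1) * near.to(x.dtype) / width)
        return grad_x, grad_t.reshape(-1, thresholds.numel()).sum(dim=0)


# Hypothetical usage: 2-bit activations -> 3 learnable thresholds.
x = torch.randn(4, 8, requires_grad=True)
t = torch.tensor([-0.5, 0.0, 0.6], requires_grad=True)
y = N2UQuantizerSketch.apply(x, t)
y.sum().backward()  # populates x.grad and t.grad via the surrogates above
```

Because the forward pass only counts threshold crossings, the deployed network sees ordinary uniformly quantized integers; the non-equidistant flexibility lives entirely in training-time parameters, matching the abstract's claim that N2UQ retains uniform-quantization inference cost.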

DOI
10.48550/arXiv.2111.14826
Publication Date
11-29-2021
Keywords
  • Model inference
  • Neural networks
  • Non-uniform quantization
  • Output levels
  • Performance
  • Quantisation
  • Space overhead
  • Threshold parameters
  • Underlying distribution
  • Uniform quantization
  • Machine learning
  • Artificial Intelligence (cs.AI)
  • Computer Vision and Pattern Recognition (cs.CV)
  • Machine Learning (cs.LG)
Comments

IR deposit conditions: not described

Preprint: arXiv

Citation Information
Z. Liu, K.-T. Cheng, D. Huang, E. P. Xing, and Z. Shen, "Nonuniform-to-uniform quantization: Towards accurate quantization via generalized straight-through estimation," arXiv preprint, Nov. 2021, doi: 10.48550/arXiv.2111.14826.