Article
Data-Free Neural Architecture Search via Recursive Label Calibration
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
  • Zechun Liu, Hong Kong University of Science and Technology, Hong Kong & Carnegie Mellon University, Pittsburgh, United States
  • Zhiqiang Shen, Hong Kong University of Science and Technology, Hong Kong & Carnegie Mellon University & Mohamed bin Zayed University of Artificial Intelligence
  • Yun Long, Google Research, Mountain View, United States
  • Eric Xing, Carnegie Mellon University, Pittsburgh, United States & Mohamed bin Zayed University of Artificial Intelligence
  • Kwang-Ting Cheng, Hong Kong University of Science and Technology, Hong Kong
  • Chas Leichner, Google Research, Mountain View, United States
Document Type
Conference Proceeding
Abstract

This paper explores the feasibility of neural architecture search (NAS) given only a pre-trained model, without using any of the original training data. This setting matters in real-world scenarios for privacy protection, bias avoidance, etc. To achieve this, we start by synthesizing usable data by recovering knowledge from a pre-trained deep neural network. We then use the synthesized data and their predicted soft labels to guide NAS. We find that the quality of the synthesized data substantially affects the NAS results. In particular, NAS requires the synthesized images to possess sufficient semantics, diversity, and a minimal domain gap from natural images. To meet these requirements, we propose recursive label calibration to encode more relative semantics into the images, as well as a regional update strategy to enhance diversity. Further, we use input- and feature-level regularization to mimic the original data distribution in latent space and reduce the domain gap. We instantiate our proposed framework with three popular NAS algorithms: DARTS, ProxylessNAS, and SPOS. Surprisingly, architectures discovered by searching with our synthetic data achieve accuracy comparable to, or even higher than, that of architectures discovered by searching with the original data. To our knowledge, this is the first demonstration that NAS can be performed effectively without access to the original (natural) data, provided the synthesis method is well designed. Code and models are available at: https://github.com/liuzechun/Data-Free-NAS. © 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
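To illustrate the idea of recursive label calibration described above, the following is a minimal, self-contained sketch using a toy linear "pre-trained model" in place of a deep network. The specific update rule (mixing the one-hot target with the model's own prediction on the synthesized input, weighted by a hypothetical `alpha`) and all function names are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy "pre-trained model": a fixed linear classifier, logits = W @ x.
# (Stand-in for a real pre-trained deep network.)
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 8))  # 3 classes, 8-dim input

def synthesize(target, steps=200, lr=0.5):
    """Gradient-descend an input so the model's prediction matches `target`
    (the data-synthesis step, here without any image regularizers)."""
    x = rng.normal(size=8) * 0.01
    for _ in range(steps):
        p = softmax(W @ x)
        # Gradient of cross-entropy H(target, p) w.r.t. x is W^T (p - target).
        x -= lr * W.T @ (p - target)
    return x

def recursive_label_calibration(class_idx, rounds=3, alpha=0.5):
    """Start from a one-hot label; after each synthesis round, mix the
    model's soft prediction on the synthesized input back into the target,
    so the label gradually encodes relative class semantics.
    `alpha` is an assumed mixing weight for this sketch."""
    target = np.eye(3)[class_idx]
    for _ in range(rounds):
        x = synthesize(target)
        pred = softmax(W @ x)
        target = alpha * np.eye(3)[class_idx] + (1 - alpha) * pred
    return x, target

x, soft_label = recursive_label_calibration(0)
assert soft_label.argmax() == 0  # still dominated by the intended class
```

The synthesized `x` and its calibrated soft label would then serve as one training pair for the NAS supernet; the real method additionally applies the regional update strategy and input/feature-level regularization mentioned in the abstract.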

DOI
10.1007/978-3-031-20053-3_23
Publication Date
11-6-2022
Keywords
  • Calibration
  • Deep neural networks
  • Image enhancement
  • Network architecture

Citation Information
Z. Liu, Z. Shen, Y. Long, E. Xing, K.-T. Cheng, and C. Leichner, "Data-Free Neural Architecture Search via Recursive Label Calibration," in Computer Vision – ECCV 2022, Lecture Notes in Computer Science, vol. 13684, Springer, Cham, Nov. 2022, doi: 10.1007/978-3-031-20053-3_23.