Learning to Learn with Variational Information Bottleneck for Domain Generalization
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
  • Yingjun Du, Universiteit van Amsterdam
  • Jun Xu, Nankai University
  • Huan Xiong, Mohamed bin Zayed University of Artificial Intelligence
  • Qiang Qiu, Duke University
  • Xiantong Zhen, Universiteit van Amsterdam
  • Cees G.M. Snoek, Universiteit van Amsterdam
  • Ling Shao, Inception Institute of Artificial Intelligence
Document Type
Conference Proceeding
Abstract

Domain generalization models learn to generalize to previously unseen domains, but suffer from prediction uncertainty and domain shift. In this paper, we address both problems. We introduce a probabilistic meta-learning model for domain generalization, in which classifier parameters shared across domains are modeled as distributions. This enables better handling of prediction uncertainty on unseen domains. To deal with domain shift, we learn domain-invariant representations by the proposed principle of meta variational information bottleneck, which we call MetaVIB. MetaVIB is derived from novel variational bounds of mutual information by leveraging the meta-learning setting of domain generalization. Through episodic training, MetaVIB learns to gradually narrow domain gaps to establish domain-invariant representations, while simultaneously maximizing prediction accuracy. We conduct experiments on three benchmarks for cross-domain visual recognition. Comprehensive ablation studies validate the benefits of MetaVIB for domain generalization. The comparison results demonstrate that our method consistently outperforms previous approaches.
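
As background for the abstract above, the sketch below illustrates the standard variational information bottleneck objective that MetaVIB builds on: a cross-entropy prediction term plus a KL compression term on a stochastic code, trained with the reparameterization trick. This is not the paper's MetaVIB; the class name VIBClassifier, the layer sizes, and the beta value are illustrative assumptions, not choices taken from the paper.

    # A minimal sketch of a variational information bottleneck (VIB) classifier,
    # assuming PyTorch. All names and hyperparameters here are hypothetical.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class VIBClassifier(nn.Module):
        """Encode x into a stochastic code z, then classify from z."""

        def __init__(self, in_dim=2048, z_dim=256, num_classes=7, beta=1e-3):
            super().__init__()
            self.beta = beta
            # The encoder outputs the mean and log-variance of q(z|x).
            self.encoder = nn.Sequential(nn.Linear(in_dim, 512), nn.ReLU(),
                                         nn.Linear(512, 2 * z_dim))
            self.classifier = nn.Linear(z_dim, num_classes)

        def forward(self, x, y):
            mu, logvar = self.encoder(x).chunk(2, dim=-1)
            # Reparameterization trick: sample z ~ q(z|x) differentiably.
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            logits = self.classifier(z)
            # Variational bound: a prediction term (cross-entropy) plus a
            # KL(q(z|x) || N(0, I)) penalty that compresses the representation.
            ce = F.cross_entropy(logits, y)
            kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
            return ce + self.beta * kl

The paper's contribution is to derive such bounds in the meta-learning (episodic) setting of domain generalization, so that the compression term also narrows gaps across training domains.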

DOI
10.1007/978-3-030-58607-2_12
Publication Date
11-7-2020
Keywords
  • Domain generalization
  • Information bottleneck
  • Meta learning
  • Variational inference
Comments

IR deposit conditions:

  • OA version (accepted version) - pathway A
  • 12 month embargo
  • Must link to published article
  • Set statement to accompany deposit
Citation Information
Y. Du et al., “Learning to learn with variational information bottleneck for domain generalization,” Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 12355 LNCS, pp. 200–216, 2020, doi: 10.1007/978-3-030-58607-2_12.