Article
Variational Continual Bayesian Meta-Learning
Advances in Neural Information Processing Systems
  • Qiang Zhang, Hangzhou Innovation Center, Zhejiang University, China & College of Computer Science and Technology, Zhejiang University, China & AZFT Knowledge Engine Lab, China
  • Jinyuan Fang, Sun Yat-sen University, China
  • Zaiqiao Meng, University of Glasgow, United Kingdom & Mohamed bin Zayed University of Artificial Intelligence
  • Shangsong Liang, Sun Yat-sen University, China & Mohamed bin Zayed University of Artificial Intelligence
  • Emine Yilmaz, University College London, United Kingdom
Document Type
Conference Proceeding
Abstract

Conventional meta-learning considers a set of tasks drawn from a stationary distribution. In contrast, this paper focuses on a more complex online setting, where tasks arrive sequentially and follow a non-stationary distribution. Accordingly, we propose a Variational Continual Bayesian Meta-Learning (VC-BML) algorithm. VC-BML maintains a Dynamic Gaussian Mixture Model over meta-parameters, with the number of component distributions determined by a Chinese Restaurant Process. Dynamic mixtures at the meta-parameter level enlarge the parameter space and thereby improve adaptation to diverse, dissimilar tasks, alleviating the negative knowledge transfer problem. To infer the posteriors of model parameters, we develop structured variational inference, a posterior approximation method more robust than the previously used point estimation, which helps avoid forgetting acquired knowledge. Experiments on tasks from non-stationary distributions show that VC-BML is superior in transferring knowledge among diverse tasks and in alleviating catastrophic forgetting in an online setting. © 2021 Neural information processing systems foundation. All rights reserved.
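The abstract mentions that the number of mixture components is determined by a Chinese Restaurant Process (CRP). As a rough intuition for how the component count can grow with the task stream, the sketch below samples CRP cluster assignments: each new task joins an existing component with probability proportional to that component's size, or opens a new component with probability proportional to a concentration parameter `alpha`. This is a hypothetical illustration of the generic CRP prior, not the authors' implementation; the function name and parameters are assumptions.

```python
import random

def crp_assignments(num_tasks, alpha, seed=0):
    """Sample component assignments for a stream of tasks from a
    Chinese Restaurant Process with concentration parameter alpha.

    Illustrative sketch only: VC-BML uses a CRP to set the number of
    mixture components over meta-parameters; this shows the generic
    sequential sampling scheme of such a prior.
    """
    rng = random.Random(seed)
    counts = []        # number of tasks assigned to each component
    assignments = []   # component index chosen for each task
    for _ in range(num_tasks):
        total = sum(counts) + alpha
        r = rng.uniform(0, total)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                counts[k] += 1       # join existing component k
                assignments.append(k)
                break
        else:
            counts.append(1)         # open a new component
            assignments.append(len(counts) - 1)
    return assignments

print(crp_assignments(10, alpha=1.0))
```

Larger `alpha` yields more components on average; the first task always opens component 0, matching the "rich get richer" dynamics of the CRP.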

Publication Date
12-6-2021
Keywords
  • Gaussian distribution
  • Bayesian
  • Component distributions
  • Dynamic Gaussian mixture models
  • Meta-learning
  • Meta-parameters
  • Non-stationary
  • Number of components
  • Online setting
  • Parameter levels
  • Stationary distribution
  • Knowledge management
Comments

IR Deposit conditions: non-described

Citation Information
Q. Zhang, J. Fang, Z. Meng, S. Liang, and E. Yilmaz, "Variational Continual Bayesian Meta-Learning," in 35th Conference on Neural Information Processing Systems (NeurIPS 2021), [Online], Dec. 6-14, 2021, in Advances in Neural Information Processing Systems, vol. 34, 2021, pp. 24556-24568. Available: https://proceedings.neurips.cc/paper/2021/file/cdd0500dc0ef6682fa6ec6d2e6b577c4-Paper.pdf