Knowledge-Aware Meta-Learning for Low-Resource Text Classification
arXiv
  • Huaxiu Yao, Stanford University
  • Yingxin Wu, University of Science and Technology of China
  • Maruan Al-Shedivat, Carnegie Mellon University
  • Eric P. Xing, Carnegie Mellon University & Mohamed bin Zayed University of Artificial Intelligence
Document Type
Article
Abstract

Meta-learning has achieved great success in leveraging historically learned knowledge to facilitate the learning of new tasks. However, the current meta-learning paradigm of learning only from historical tasks may not generalize well to testing tasks that are poorly supported by the training tasks. This paper studies a low-resource text classification problem and bridges the gap between meta-training and meta-testing tasks by leveraging external knowledge bases. Specifically, we propose KGML, which introduces for each sentence an additional representation learned from an extracted sentence-specific knowledge graph. Extensive experiments on three datasets demonstrate the effectiveness of KGML under both supervised and unsupervised adaptation settings. Copyright © 2021, The Authors. All rights reserved.
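The core idea in the abstract — pairing each sentence's text representation with a second representation derived from a sentence-specific knowledge graph — can be sketched as follows. This is a hypothetical, heavily simplified illustration: the function names, the toy encoders, and the concatenation fusion are all assumptions for exposition, not the authors' actual KGML architecture.

```python
# Illustrative sketch only: KGML's real encoders and fusion are learned
# neural components; here, stand-ins show the data flow.

def encode_text(sentence):
    # Stand-in text encoder: a 1-d average of character codes (a real
    # system would use a pretrained language model).
    codes = [ord(c) for c in sentence]
    return [sum(codes) / len(codes) / 128.0]

def encode_kg(entities, kg_embeddings):
    # Average pretrained embeddings of the entities linked to this
    # sentence, i.e. its sentence-specific knowledge-graph view.
    vecs = [kg_embeddings[e] for e in entities if e in kg_embeddings]
    if not vecs:
        return [0.0]
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def fuse(text_vec, kg_vec):
    # Simplest possible fusion: concatenation. KGML learns this step.
    return text_vec + kg_vec

# Toy knowledge-graph entity embeddings (hypothetical values).
kg_embeddings = {"Paris": [0.9], "France": [0.8]}
rep = fuse(encode_text("Paris is the capital of France."),
           encode_kg(["Paris", "France"], kg_embeddings))
print(len(rep))  # dimension of the fused sentence representation
```

The fused vector `rep` is what a downstream low-resource classifier would consume; the point is only that each sentence contributes two complementary views, one textual and one knowledge-graph based.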

Publication Date
1-1-2021
Keywords
  • Computation and Language (cs.CL); Machine Learning (cs.LG)
Comments

Preprint: arXiv

Citation Information
H. Yao, Y. Wu, M. Al-Shedivat, and E. P. Xing, "Knowledge-aware meta-learning for low-resource text classification," 2021, arXiv:2109.04707