Article
Locality Sensitive Teaching
Advances in Neural Information Processing Systems
  • Zhaozhuo Xu, Rice University, United States
  • Beidi Chen, Stanford University, United States
  • Chaojian Li, Rice University, United States
  • Weiyang Liu, University of Cambridge, United Kingdom & MPI-IS Tübingen, Germany
  • Le Song, Mohamed bin Zayed University of Artificial Intelligence, United Arab Emirates
  • Yingyan Lin, Rice University, United States
  • Anshumali Shrivastava, Rice University, United States & ThirdAI Corp., United States
Document Type
Conference Proceeding
Abstract

The emergence of the Internet-of-Things (IoT) sheds light on applying machine teaching (MT) algorithms for online personalized education on home devices. This direction became even more promising during the COVID-19 pandemic, when in-person education was infeasible. However, iterative machine teaching (IMT), one of the most influential and practical MT paradigms, is impractical on IoT devices because its algorithms are inefficient and unscalable. IMT is a paradigm in which a teacher feeds examples to the learner iteratively and intelligently based on the learner's status. In each iteration, current IMT algorithms greedily traverse the whole training set to find an example for the learner, which is computationally expensive in practice. We propose a novel teaching framework, Locality Sensitive Teaching (LST), based on locality sensitive sampling, to overcome these challenges. LST has provable near-constant time complexity, which is exponentially better than the existing baseline. With up to 425.12× speedups and 99.76% energy savings over IMT, LST is the first algorithm that enables energy- and time-efficient machine teaching on IoT devices. Owing to its substantial efficiency and scalability, LST is readily applicable in real-world education scenarios.
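
The abstract contrasts IMT's per-iteration scan of the whole training set with LST's locality sensitive sampling. The Python sketch below is a minimal illustration of that contrast, not the paper's algorithm: it assumes a linear learner taught toward a known target w_star (the classic omniscient-teacher setting), uses a simple one-step-improvement score, and replaces the full scan with a random-hyperplane (SimHash) index probed with the direction the learner still needs to move. Names such as score, imt_pick, and lst_pick are illustrative only.

    import numpy as np
    from collections import defaultdict

    rng = np.random.default_rng(0)
    n, d, lr = 5000, 16, 0.1
    X = rng.standard_normal((n, d))    # candidate teaching pool
    w_star = rng.standard_normal(d)    # target model the teacher wants to convey
    y = X @ w_star                     # labels produced by the target model

    def score(w, x, t):
        # Teaching "usefulness" of (x, t): learner's squared distance to w_star
        # after one gradient step on the squared loss (smaller is better).
        w_next = w - lr * (w @ x - t) * x
        return np.sum((w_next - w_star) ** 2)

    def imt_pick(w):
        # IMT-style greedy teacher: scans the entire pool every iteration, O(n).
        return min(range(n), key=lambda i: score(w, X[i], y[i]))

    # Build a SimHash index once: K random hyperplanes give each example a K-bit bucket.
    K = 12
    planes = rng.standard_normal((K, d))
    def bucket(v):
        return tuple(planes @ v > 0)

    table = defaultdict(list)
    for i in range(n):
        table[bucket(X[i])].append(i)

    def lst_pick(w):
        # LST-style teacher: probe only the bucket colliding with the current
        # query direction and score those few candidates (near-constant time).
        query = w_star - w             # direction the learner still needs to move
        cand = table.get(bucket(query), [])
        if len(cand) == 0:             # empty bucket: fall back to a small random sample
            cand = rng.choice(n, size=32, replace=False)
        return min(cand, key=lambda i: score(w, X[i], y[i]))

    # Teach a fresh learner with each strategy and compare convergence.
    for pick in (imt_pick, lst_pick):
        w = np.zeros(d)
        for _ in range(50):
            i = pick(w)
            w = w - lr * (w @ X[i] - y[i]) * X[i]
        print(pick.__name__, "distance to w_star:", float(np.sum((w - w_star) ** 2)))

Because the hash index is built once and amortized over all teaching iterations, each lst_pick call touches only the handful of examples in one bucket rather than the whole pool, which is the source of the near-constant per-iteration cost the abstract claims.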

Publication Date
12-6-2021
Keywords
  • Energy conservation,
  • Iterative methods,
  • Learning systems,
  • Constant time complexity,
  • Energy savings,
  • Energy-savings,
  • Home devices,
  • Locality sensitives,
  • Teachers,
  • Teaching algorithms,
  • Teaching paradigm,
  • Training sets
Comments

IR deposit conditions: not described

Proceedings available in NeurIPS 2021

Citation Information
Z. Xu, B. Chen, C. Li, W. Liu, L. Song, Y. Lin, and A. Shrivastava, "Locality Sensitive Teaching," in Advances in Neural Information Processing Systems (NeurIPS 2021), Dec. 6-14, 2021, vol. 34, pp. 18049-18062. [Online]. Available: https://proceedings.neurips.cc/paper/2021/file/95c3f1a8b262ec7a929a8739e21142d7-Paper.pdf