Tree-based Unidirectional Neural Networks for Low-Power Computer Vision
IEEE Design & Test
  • Abhinav Goel, Purdue University
  • Caleb Tung, Purdue University
  • Nick Eliopoulos, Purdue University
  • Amy Wang, West Lafayette Junior-Senior High School
  • James C. Davis, Purdue University
  • George K. Thiruvathukal, Loyola University Chicago
  • Yung-Hsiang Lu, Purdue University
Document Type
Article
Publication Date
6-1-2023
Pages
53-61
Publisher Name
IEEE
Abstract

This article describes the novel Tree-based Unidirectional Neural Network (TRUNK) architecture. This architecture improves computer vision efficiency by using a hierarchy of multiple shallow Convolutional Neural Networks (CNNs), instead of a single very deep CNN. We demonstrate this architecture’s versatility in performing different computer vision tasks efficiently on embedded devices. Across various computer vision tasks, the TRUNK architecture consumes 65% less energy and requires 50% less memory than representative low-power CNN architectures, e.g., MobileNet v2, when deployed on the NVIDIA Jetson Nano.
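The hierarchy of shallow CNNs described above can be sketched in pure Python. This is an illustrative toy, not the authors' implementation: the node structure, routing logic, and stub "classifiers" are all assumptions standing in for the small CNNs in the real TRUNK system. The key idea it shows is that inference follows exactly one root-to-leaf path, so only a few shallow networks run per image, which is where the energy and memory savings come from.

```python
# Hypothetical sketch of TRUNK-style hierarchical inference.
# Each internal node runs a small classifier that routes the input to
# exactly one child ("unidirectional": no backtracking), so only one
# root-to-leaf chain of shallow models executes per image.

class TrunkNode:
    def __init__(self, classifier=None, children=None, label=None):
        self.classifier = classifier   # a shallow CNN in the real system; a stub here
        self.children = children or []
        self.label = label             # set only at leaf nodes

    def predict(self, image):
        node = self
        while node.children:                 # descend until a leaf is reached
            branch = node.classifier(image)  # stub classifier picks one child
            node = node.children[branch]
        return node.label

# Toy stand-ins for shallow CNNs: route on a scalar "feature" value.
root = TrunkNode(
    classifier=lambda x: 0 if x < 0.5 else 1,
    children=[
        TrunkNode(label="animal"),
        TrunkNode(
            classifier=lambda x: 0 if x < 0.75 else 1,
            children=[
                TrunkNode(label="car"),
                TrunkNode(label="truck"),
            ],
        ),
    ],
)

print(root.predict(0.2))  # left branch only: one shallow classifier evaluated
print(root.predict(0.9))  # right, then right: two shallow classifiers evaluated
```

In the published architecture the routing decisions are learned by the shallow CNNs themselves; here the lambdas merely illustrate the control flow of a single forward pass through the tree.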

Identifier
Electronic ISSN: 2168-2364
Comments

Author Posting © IEEE 2023. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The definitive version of this work was published in IEEE Design & Test, Vol. 40, Iss. 3 (June 2023), http://dx.doi.org/10.1109/MDAT.2022.3217016

Creative Commons License
Creative Commons Attribution-Noncommercial-No Derivative Works 3.0
Citation Information
A. Goel, C. Tung, N. Eliopoulos, A. Wang, J. C. Davis, G. K. Thiruvathukal, and Y.-H. Lu, "Tree-based Unidirectional Neural Networks for Low-Power Computer Vision," in IEEE Design & Test, vol. 40, no. 3, pp. 53-61, June 2023, doi: 10.1109/MDAT.2022.3217016.