Fast and Accurate Sparse Coding of Visual Stimuli with a Simple, Ultra-Low-Energy Spiking Architecture

Electrical and Computer Engineering Faculty Publications and Presentations
Sponsor: National Science Foundation under award #1028378 and DARPA under award #HR0011-13-2-0015.
Subjects:
- Emerging technologies
- Memristors -- Technological innovations
- Neural networks (Computer science)
- Sparse coding -- Algorithms
Abstract

Memristive crossbars have become a popular means for realizing unsupervised and supervised learning techniques. Often, to preserve mathematical rigor, the crossbar itself is separated from the neuron capacitors. In this work, we sought to simplify the design, removing extraneous components so as to consume significantly less power at a minimal cost in accuracy. This work provides derivations for the design of such a network, named the Simple Spiking Locally Competitive Algorithm (SSLCA), as well as CMOS designs and results on the CIFAR and MNIST datasets. Compared to a non-spiking model which scored 33% on CIFAR-10 with a single-layer classifier, this hardware scored 32% accuracy. When used with a state-of-the-art deep learning classifier, the non-spiking model achieved 82% and our simplified, spiking model achieved 80%, while compressing the input data by 79%. Compared to a previously proposed spiking model, our proposed hardware consumed 99% less energy to do the same work at 21 times the throughput. Accuracy held up under online learning up to a write variance of 3% and a read variance of 40%. The proposed architecture's excellent accuracy and significantly lower energy usage demonstrate the utility of our innovations. This work provides a means for extremely low-energy sparse coding in mobile devices such as cellular phones, or for very sparse coding as is needed by self-driving cars or robotics that must integrate data from multiple high-resolution sensors.
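For readers unfamiliar with the sparse-coding scheme the SSLCA simplifies, the following is a minimal software sketch of the standard (non-spiking) Locally Competitive Algorithm that the paper's spiking hardware approximates. The dictionary `Phi`, threshold `lam`, and time constants below are illustrative placeholders, not the paper's hardware parameters:

```python
import numpy as np

def lca_sparse_code(x, Phi, lam=0.1, tau=10.0, dt=1.0, n_steps=200):
    """Sparse-code input x over dictionary Phi (columns = unit-norm
    features) with standard non-spiking LCA dynamics:
    du/dt = (1/tau) * (Phi^T x - u - (Phi^T Phi - I) a)."""
    n_features = Phi.shape[1]
    u = np.zeros(n_features)               # membrane potentials
    b = Phi.T @ x                          # feed-forward drive
    G = Phi.T @ Phi - np.eye(n_features)   # lateral inhibition
    for _ in range(n_steps):
        a = np.where(np.abs(u) > lam, u, 0.0)  # hard-threshold activations
        u += (dt / tau) * (b - u - G @ a)
    return np.where(np.abs(u) > lam, u, 0.0)

# Toy example: 8-pixel input built from one dictionary element plus noise.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((8, 16))
Phi /= np.linalg.norm(Phi, axis=0)         # unit-norm dictionary columns
x = Phi[:, 3] + 0.05 * rng.standard_normal(8)
a = lca_sparse_code(x, Phi)
print(np.count_nonzero(a), "of", a.size, "coefficients active")
```

The lateral-inhibition term `G @ a` is what makes neurons "compete": features that overlap with an already-active one are suppressed, yielding a sparse code rather than a dense projection.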
Citation Information

Woods, W., & Teuscher, C. (2017). Fast and Accurate Sparse Coding of Visual Stimuli with a Simple, Ultra-Low-Energy Spiking Architecture. arXiv:1704.05877