Task 019: Robustness of Neural Systems
| Event Date: | November 3, 2022 |
|---|---|
| Time: | 11:00 am (ET) / 8:00 am (PT) |
Ruokai Yin, Yale University
SATA: Sparsity-Aware Training Accelerator for Spiking Neural Networks
ABSTRACT: Spiking Neural Networks (SNNs) have attracted considerable attention as a potential energy-efficient alternative to conventional Artificial Neural Networks (ANNs) due to their inherently sparse activations. Recently, SNNs trained with backpropagation through time (BPTT) have achieved state-of-the-art performance on image recognition tasks, surpassing other SNN training algorithms. Despite this algorithmic success, prior work has not evaluated the hardware energy overhead of BPTT, owing to the lack of a hardware evaluation platform for SNN training. Moreover, although SNNs have long been viewed as an energy-efficient counterpart to ANNs, a quantitative comparison of the training costs of SNNs and ANNs is missing. To address these issues, in this talk I will introduce SATA (Sparsity-Aware Training Accelerator), a BPTT-based training accelerator for SNNs. SATA provides a simple, reconfigurable systolic-array-based architecture, which makes it easy to analyze the training energy of BPTT-based SNN training algorithms. SATA also exploits the various forms of sparsity inherent in BPTT, significantly improving its computational energy efficiency. Based on SATA, we present quantitative analyses of the energy efficiency of SNN training and compare the training costs of SNNs and ANNs. Finally, to guide future SNN training algorithm design, we offer several observations on how SNN-specific training parameters affect energy efficiency and propose an energy estimation framework for SNN training. Code for our framework is publicly available at https://github.com/RuokaiYin/SATA_Sim.
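For readers unfamiliar with spiking dynamics, the minimal sketch below illustrates the activation sparsity the abstract refers to: a leaky integrate-and-fire (LIF) layer unrolled over a few timesteps, where most inputs and outputs are zero, so a sparsity-aware accelerator can skip the corresponding multiply-accumulates. This is an illustrative toy example only; the layer sizes, leak, and threshold are assumptions, and it does not reflect SATA's actual architecture or the SATA_Sim code.

```python
import numpy as np

# Illustrative LIF layer unrolled over T timesteps (assumed parameters,
# not SATA's implementation).
rng = np.random.default_rng(0)

T, n_in, n_out = 8, 64, 32   # timesteps and layer sizes (assumed)
leak, v_th = 0.9, 1.0        # membrane leak factor and firing threshold (assumed)

W = rng.normal(0.0, 0.3, (n_out, n_in)).astype(np.float32)
x = (rng.random((T, n_in)) < 0.1).astype(np.float32)  # sparse binary input spikes

v = np.zeros(n_out, dtype=np.float32)        # membrane potentials
spikes = np.zeros((T, n_out), dtype=np.float32)

for t in range(T):
    v = leak * v + W @ x[t]                          # leaky integration of inputs
    spikes[t] = (v >= v_th).astype(np.float32)       # fire when threshold crossed
    v = np.where(spikes[t] > 0, 0.0, v)              # hard reset after a spike

# Most entries of x and spikes are zero, which is the sparsity a
# sparsity-aware training accelerator can exploit to skip work.
print(f"input sparsity:  {1 - x.mean():.2f}")
print(f"output sparsity: {1 - spikes.mean():.2f}")
```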
BIO: Ruokai Yin is a Ph.D. student in the Department of Electrical Engineering at Yale University, advised by Prof. Priyadarshini Panda. His research interests lie in designing high-performance computer architectures for neural networks. Prior to joining Yale, he received his B.S. in Electrical Engineering from the University of Wisconsin-Madison in 2021, where he worked with Prof. Joshua San Miguel on computer architectures for stochastic computing.
