Task 016/017: End-to-End Performance Benchmark Frameworks towards Connectivity-Limited Neural Computing Systems / Neuromorphic Design Flow
| Event Date: | June 30, 2022 |
| --- | --- |
| Time: | 11:00 am (ET) / 8:00 am (PT) |
Akul Malhotra, Purdue University
Design of Robust Synaptic Arrays for Binary Neural Networks
Abstract: Ultra-low-precision networks such as binary neural networks (BNNs) have gained momentum in recent years, since the reduced precision lowers the costs of storage, computation, and communication, enabling inference at the edge. Resistive Random Access Memory (RRAM) crossbar-based BNN accelerators have shown tremendous potential for boosting the speed and energy efficiency of compute-intensive deep learning applications at the edge. Although RRAM-based in-memory BNN accelerators alleviate many of the drawbacks of SRAM-based accelerators, they are susceptible to stuck-at faults (SAFs), which commonly occur in the memory array. SAFs in the accelerator can drastically degrade the classification accuracy of the BNN, resulting in unintended system behavior.
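To make the failure mode concrete, here is a minimal, hypothetical NumPy sketch (not from the talk) of how stuck-at faults are often modeled on a binarized weight matrix: each RRAM cell stores a ±1 weight, and a fault pins the stored value to a fixed state regardless of what was programmed.

```python
import numpy as np

rng = np.random.default_rng(0)

def inject_stuck_at_faults(weights, fault_rate, rng):
    """Pin a random fraction of binary (+1/-1) weights to a fixed state.

    Each faulty cell is stuck at +1 or -1 with equal probability
    (the mapping of stuck-at-LRS/HRS to weight values is a modeling
    assumption), regardless of the value that was programmed.
    """
    faulty = rng.random(weights.shape) < fault_rate
    stuck_values = rng.choice([-1, 1], size=weights.shape)
    return np.where(faulty, stuck_values, weights)

# Toy 128x128 binary weight matrix, standing in for one crossbar tile.
w = rng.choice([-1, 1], size=(128, 128))
w_faulty = inject_stuck_at_faults(w, fault_rate=0.05, rng=rng)

# Fraction of weights silently flipped by the faults; each flip
# perturbs the corresponding dot product by 2 per activation.
print("flipped weights:", np.mean(w != w_faulty))
```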
In this work, we investigate the impact of SAFs on a state-of-the-art RRAM-based BNN accelerator. To mitigate their impact on BNN inference accuracy, we propose a robust RRAM-based 2T2R differential bitcell capable of in-situ fault tolerance. We build synaptic arrays for BNNs using the proposed bitcell and demonstrate a significant improvement in classification accuracy. BNNs implemented with the proposed bitcell recover up to 98.16% of the original BNN accuracy on image-based datasets at a fault rate of 5%, exhibiting substantial fault tolerance over the state-of-the-art RRAM-based BNN accelerator. The proposed synaptic array incurs a negligible energy overhead of 2.62% over the state-of-the-art RRAM array while achieving 24.4% lower latency and providing strong fault tolerance at the edge.
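The intuition behind differential fault tolerance can be illustrated in the same toy setting. The sketch below is a simplification under assumed normalized conductance values, not the talk's actual 2T2R circuit or programming scheme: each binary weight is encoded across a complementary pair of devices, so a single stuck device at worst zeroes the cell's contribution to the dot product rather than flipping its sign.

```python
import numpy as np

rng = np.random.default_rng(1)
LRS, HRS = 1.0, 0.0  # assumed normalized conductances

def encode_differential(w):
    """Map +1 -> (LRS, HRS) and -1 -> (HRS, LRS) across two devices."""
    g_pos = np.where(w > 0, LRS, HRS)
    g_neg = np.where(w > 0, HRS, LRS)
    return g_pos, g_neg

def stick(g, fault_rate, rng):
    """Pin a random fraction of devices to LRS or HRS."""
    faulty = rng.random(g.shape) < fault_rate
    stuck = rng.choice([LRS, HRS], size=g.shape)
    return np.where(faulty, stuck, g)

w = rng.choice([-1, 1], size=(128, 128))
g_pos, g_neg = encode_differential(w)

# Independent 5% stuck-at faults on each device of the pair.
g_pos_f = stick(g_pos, 0.05, rng)
g_neg_f = stick(g_neg, 0.05, rng)

w_read = np.sign(g_pos_f - g_neg_f)  # 0 when the pair is ambiguous

# A sign flip now requires both devices of a pair to be stuck in the
# inverted configuration, which is far rarer than a single fault.
print("sign flips  :", np.mean(w_read == -w))
print("zeroed cells:", np.mean(w_read == 0))
```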
Bio: Akul Malhotra is a PhD student at Purdue University, advised by Dr. Sumeet Gupta. He received his B.E. degree in Electrical and Electronics Engineering from Birla Institute of Technology and Science (BITS) Pilani, India, in 2020. He completed his bachelor's thesis at Pennsylvania State University in 2019, where he worked on exploiting the randomness in post-CMOS devices for the hardware acceleration of Bayesian neural networks. Prior to that, he worked as a MITACS Globalink summer research intern at the University of Calgary, Canada. His research interests include in-memory computing and hardware accelerator design for ultra-low-precision neural networks.