C-BRIC Researchers Presented at DATE Conference
C-BRIC researchers presented their work at the 2022 Design, Automation and Test in Europe Conference (DATE). The conference was held virtually March 14-23, 2022.
Congratulations to researchers Amogh Agrawal, Aayush Ankit, Abhiroop Bhattacharjee, Yu (Kevin) Cao, Indranil Chakraborty, Youngeun Kim, Adarsh Kosta, Jian Meng, Vijaykrishnan Narayanan, Priyadarshini Panda, Kaushik Roy, Utkarsh Saxena, Jae-sun Seo, and Efstathia Soufleri.
Vijaykrishnan Narayanan, along with collaborators Cyan Subhra Mishra, John Sampson, and Mahmut Taylan Kandemir, all of Pennsylvania State University, presented their paper entitled “Origin: Enabling On-Device Intelligence for Human Activity Recognition Using Energy Harvesting Wireless Sensor Networks” at DATE. Their work presents Origin, a novel scheduling policy and adaptive ensemble learner that efficiently performs human activity recognition (HAR) on a distributed energy-harvesting body area network, personalizing the optimizations for each user. Origin strategically ensures efficient and accurate individual inference execution at each sensor node by using a novel activity-aware scheduling approach. It also leverages the continuous nature of human activity when coordinating and aggregating results from all the sensor nodes to improve final classification accuracy. This work was a DATE Best Paper nominee.
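The aggregation idea can be illustrated with a minimal sketch: fuse per-node predictions, then smooth with the previous time step's belief to exploit the continuity of human activity. The function name, fusion rule, and numbers below are illustrative assumptions, not the paper's actual aggregator.

```python
import numpy as np

def aggregate(node_probs, prev_belief, alpha=0.7):
    """Fuse per-node class probabilities, then blend with the previous
    belief to exploit the continuity of human activity.
    (Illustrative only -- not Origin's actual aggregation rule.)"""
    fused = np.mean(node_probs, axis=0)            # average over sensor nodes
    belief = alpha * fused + (1 - alpha) * prev_belief
    return belief / belief.sum()                   # renormalize

# Three sensor nodes, three activity classes (e.g. walk / run / sit).
node_probs = np.array([[0.6, 0.3, 0.1],
                       [0.5, 0.4, 0.1],
                       [0.7, 0.2, 0.1]])
prev = np.array([0.8, 0.1, 0.1])                   # last step favored class 0
belief = aggregate(node_probs, prev)
print(belief)
```

Because consecutive activities are correlated, the temporal term damps transient per-node misclassifications.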
Abhiroop Bhattacharjee and Priyadarshini Panda of Yale University, and collaborator Lakshya Bhatnagar of IIT Delhi presented “Examining and Mitigating the Impact of Crossbar Non-idealities for Accurate Implementation of Sparse Deep Neural Network” at DATE. This paper proposed two approaches to mitigate accuracy loss: crossbar-column rearrangement and Weight-Constrained-Training (WCT). These approaches help mitigate non-idealities by increasing the proportion of low-conductance synapses on crossbars, thereby improving their computational accuracy. Bhattacharjee is a PhD student advised by Priyadarshini Panda.
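The weight-constraint idea can be sketched as a training penalty that charges a cost whenever a synapse's mapped conductance exceeds a threshold, nudging weights toward low-conductance states. The function name, threshold, and coefficient are hypothetical stand-ins, not the paper's formulation.

```python
import numpy as np

def wct_penalty(weights, g_thresh=0.5, lam=0.1):
    """Hypothetical weight-constraint penalty: cost grows with how far
    |weight| (proxy for conductance) exceeds a threshold, so training
    is pushed toward a higher fraction of low-conductance synapses."""
    excess = np.maximum(np.abs(weights) - g_thresh, 0.0)
    return lam * np.sum(excess ** 2)

w = np.array([[0.9, 0.2],
              [-0.7, 0.1]])
print(wct_penalty(w))
```

Added to the task loss, such a term trades a small amount of representational freedom for crossbars that suffer less from non-ideal high-conductance behavior.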
Youngeun Kim and Priyadarshini Panda of Yale University, and Samsung collaborators Hyunsoo Kim, Seijoon Kim, and Sang Joon Kim presented “Gradient-based Bit Encoding Optimization for Noise-Robust Binary Memristive Crossbar” at DATE. This work explored a new perspective on mitigating crossbar noise in a more generalized way: manipulating the input binary bit encoding rather than training the network weights with respect to noisy data. They proposed Gradient-based Bit Encoding Optimization (GBO), which optimizes a different number of pulses at each layer, based on the group’s in-depth analysis showing that each layer has a different level of noise sensitivity. The proposed heterogeneous layer-wise bit encoding scheme achieves high noise robustness with low computational cost. Kim is a PhD student advised by Priyadarshini Panda.
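The layer-wise intuition can be sketched as follows: more noise-sensitive layers get more input pulses (a finer bit encoding), less sensitive layers get fewer. This toy allocation rule is a hypothetical stand-in for the paper's gradient-based optimization; the sensitivity values and pulse range are made up.

```python
import numpy as np

def allocate_pulses(sensitivity, min_p=1, max_p=4):
    """Toy heterogeneous allocation: scale each layer's noise
    sensitivity into [min_p, max_p] pulses. (Illustrative stand-in
    for GBO's gradient-based optimization.)"""
    s = (sensitivity - sensitivity.min()) / (np.ptp(sensitivity) + 1e-12)
    return np.round(min_p + s * (max_p - min_p)).astype(int)

sens = np.array([0.9, 0.2, 0.5, 0.1])   # e.g. first layer most sensitive
pulses = allocate_pulses(sens)
print(pulses)
```

The least sensitive layers then run with the cheapest encoding, which is where the computational savings come from.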
Adarsh K. Kosta, Efstathia Soufleri, Indranil Chakraborty, Amogh Agrawal, Aayush Ankit, and Kaushik Roy’s work on “HyperX: A Hybrid RRAM-SRAM Partitioned System for Error Recovery in Memristive Xbars” was presented at DATE. This work presented HyperX, a hybrid RRAM-SRAM system that leverages the complementary benefits of Non-Volatile Memory (NVM) and CMOS technologies. The HyperX system consists of a fixed RRAM block offering area- and energy-efficient Matrix-Vector-Multiplications (MVMs) and an SRAM block enabling on-chip training to recover the accuracy drop due to RRAM non-idealities. Kosta and Soufleri are current PhD students at Purdue University, advised by Kaushik Roy. Chakraborty (Google), Agrawal (Apple), and Ankit (Microsoft) are PhD graduates who Roy advised during their studies at Purdue.
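The partitioning can be illustrated with a minimal sketch: a frozen, noisy RRAM copy of the weights handles the bulk of the MVM, while a small trainable SRAM block absorbs the residual error. For brevity the sketch computes the SRAM correction in closed form rather than by on-chip training, and all names and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target weights and a fixed, non-ideal RRAM copy of them.
W = rng.standard_normal((4, 3))
W_rram = W + 0.1 * rng.standard_normal(W.shape)   # frozen, noisy block

# The trainable SRAM block would learn to recover the error;
# here we take the ideal correction directly for illustration.
W_sram = W - W_rram

x = rng.standard_normal(4)
y_hybrid = x @ W_rram + x @ W_sram                # partitioned MVM
print(y_hybrid)
```

The SRAM block only needs to represent the (small) error term, so it can stay far smaller than a full-precision CMOS implementation of the whole layer.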
Utkarsh Saxena, Indranil Chakraborty, and Kaushik Roy presented their paper entitled “Towards ADC-Less Compute-In-Memory Accelerators for Energy Efficient Deep Learning” at DATE. This work proposed a hardware-software co-design approach to reduce analog-to-digital converter (ADC) costs through partial-sum quantization. Specifically, they replace ADCs with 1-bit sense amplifiers and develop a quantization-aware training methodology to compensate for the loss in representation ability. They showed that the ADC-less DNN model reduces energy consumption while maintaining accuracy comparable to a model without partial-sum quantization. Saxena is a current PhD student, and Chakraborty, now with Google, is a PhD graduate, both under the direction of Kaushik Roy.
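The forward path of a 1-bit sense amplifier can be sketched as keeping only the sign of each crossbar column's analog partial sum; in quantization-aware training, the backward pass would typically treat the sign as identity (a straight-through estimator). The function name and weight values below are illustrative, not from the paper.

```python
import numpy as np

def one_bit_partial_sum(x_bits, w_cols):
    """Model a 1-bit sense amplifier replacing the ADC: each crossbar
    column's analog partial sum is reduced to its sign."""
    return np.where(x_bits @ w_cols >= 0, 1.0, -1.0)

x = np.array([1.0, 0.0, 1.0, 1.0])      # one binary input bit-slice
W = np.array([[ 0.2, -0.5],
              [ 0.4,  0.3],
              [-0.1,  0.6],
              [ 0.3, -0.4]])
print(one_bit_partial_sum(x, W))
```

Dropping the multi-bit ADC is what yields the energy savings; the quantization-aware training then teaches the network to be accurate despite seeing only signs.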
C-BRIC researchers Jian Meng, Jae-sun Seo, and Yu (Kevin) Cao, along with Fan Zhang, Li Yang, and Deliang Fan of Arizona State University presented “XST: A Crossbar Column-wise Sparse Training for Efficient Continual Learning” at DATE, which earned them the DATE Best Interactive Presentation (IP) Award. This work proposed XST, a novel crossbar column-wise sparse training framework for continual learning. XST significantly reduces the training cost and saves inference energy. More importantly, it is friendly to existing crossbar-based convolution engines, with almost no hardware overhead. Meng is a PhD student advised by Jae-sun Seo of Arizona State University. Yu (Kevin) Cao is also a faculty member at Arizona State.
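Column-wise sparsity can be sketched as structured pruning: rank crossbar columns by their L2 norm and zero out the weakest ones, so entire columns (rather than scattered weights) can be skipped by the crossbar engine. The function name, ranking criterion, and keep ratio are illustrative assumptions, not XST's actual training procedure.

```python
import numpy as np

def column_sparse_mask(W, keep_ratio=0.5):
    """Toy column-wise structured sparsity: keep the strongest
    crossbar columns by L2 norm and zero the rest, so whole columns
    can be skipped with no change to the crossbar engine."""
    norms = np.linalg.norm(W, axis=0)
    k = max(1, int(keep_ratio * W.shape[1]))
    mask = np.zeros(W.shape[1], dtype=bool)
    mask[np.argsort(norms)[-k:]] = True            # strongest k columns
    return W * mask                                # broadcast over rows

W = np.array([[0.9, 0.01, 0.5, 0.02],
              [0.8, 0.02, 0.4, 0.01]])
Ws = column_sparse_mask(W)
kept = np.abs(Ws).sum(axis=0) > 0
print(kept)
```

Because the sparsity pattern aligns with whole columns, no fine-grained indexing hardware is needed, which is why the overhead on existing crossbar engines is negligible.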