Task 019: Robustness of Neural Systems

Event Date: September 30, 2021
Time: 11:00 am (ET) / 8:00 am (PT)
Priority: No
College Calendar: Show
Yeshwanth Venkatesha, Yale University
Federated Learning with Spiking Neural Networks
Abstract: As neural networks gain widespread adoption in resource-constrained embedded devices, there is a growing need for low-power neural systems. Spiking Neural Networks (SNNs) are emerging as an energy-efficient alternative to traditional Artificial Neural Networks (ANNs), which are computationally intensive. From an application perspective, because federated learning involves multiple energy-constrained devices, there is significant scope to leverage the energy efficiency of SNNs. Specifically, we design a federated learning method for training decentralized, privacy-preserving SNNs. To validate the proposed method, we experimentally evaluate the advantages of SNNs on various aspects of federated learning with the CIFAR10 and CIFAR100 benchmarks. We observe that SNNs outperform ANNs in overall accuracy by over 15% when the data is distributed across a large number of clients in the federation, while providing up to 4.3 times greater energy efficiency. In addition to efficiency, we analyze the sensitivity of the proposed federated SNN framework to data distribution among the clients, stragglers, and gradient noise, and perform a comprehensive comparison with ANNs.
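
For context on the setup the abstract describes, below is a minimal sketch of FedAvg-style federated learning over spiking clients. It assumes PyTorch, a leaky integrate-and-fire neuron trained with a rectangular surrogate gradient, and synthetic stand-in data; the model, hyperparameters, and helper names (TinySNN, local_update, fed_avg) are illustrative assumptions, not the speaker's implementation.

```python
# Illustrative sketch only: FedAvg over spiking clients, not the authors' code.
import copy

import torch
import torch.nn as nn


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""
    THRESHOLD = 1.0

    @staticmethod
    def forward(ctx, mem):
        ctx.save_for_backward(mem)
        return (mem >= SpikeFn.THRESHOLD).float()

    @staticmethod
    def backward(ctx, grad_out):
        (mem,) = ctx.saved_tensors
        # Pass gradients only for membrane potentials near the threshold.
        window = (torch.abs(mem - SpikeFn.THRESHOLD) < 0.5).float()
        return grad_out * window


class TinySNN(nn.Module):
    """Two-layer SNN; inputs are rate-coded over `timesteps` steps."""

    def __init__(self, in_dim=32 * 32 * 3, hidden=256, classes=10,
                 timesteps=8, decay=0.9):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, classes)
        self.timesteps, self.decay = timesteps, decay

    def forward(self, x):
        mem = torch.zeros(x.size(0), self.fc1.out_features, device=x.device)
        out = torch.zeros(x.size(0), self.fc2.out_features, device=x.device)
        for _ in range(self.timesteps):
            spikes_in = (torch.rand_like(x) < x).float()  # rate coding
            mem = self.decay * mem + self.fc1(spikes_in)  # leaky integration
            spk = SpikeFn.apply(mem)
            mem = mem * (1.0 - spk)                       # reset fired neurons
            out = out + self.fc2(spk)
        return out / self.timesteps                       # time-averaged logits


def local_update(global_model, data, target, lr=0.1):
    """One client's local training step; returns its updated weights."""
    local = copy.deepcopy(global_model)
    opt = torch.optim.SGD(local.parameters(), lr=lr)
    loss = nn.functional.cross_entropy(local(data), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return local.state_dict()


def fed_avg(global_model, client_states):
    """Server step: load the element-wise mean of the client weights."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        for state in client_states[1:]:
            avg[key] = avg[key] + state[key]
        avg[key] = avg[key] / len(client_states)
    global_model.load_state_dict(avg)


if __name__ == "__main__":
    torch.manual_seed(0)
    model = TinySNN()
    # Synthetic stand-ins for per-client CIFAR10 shards (values in [0, 1]).
    clients = [(torch.rand(16, 32 * 32 * 3), torch.randint(0, 10, (16,)))
               for _ in range(4)]
    for rnd in range(3):  # communication rounds
        states = [local_update(model, x, y) for x, y in clients]
        fed_avg(model, states)
        print(f"round {rnd}: aggregated {len(states)} clients")
```

A production aggregator would typically weight each client's update by its local dataset size and cope with stragglers and gradient noise; those aspects, analyzed in the talk, are omitted here for brevity.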
 
Bio: Yeshwanth Venkatesha received the B.Tech. degree in Computer Science and Engineering from the Indian Institute of Technology Kharagpur, India, in 2017. He joined Yale University, New Haven, CT, USA, in 2020 as a Ph.D. student in the Electrical Engineering department. Prior to joining Yale University, he worked as a Data Scientist at WalmartLabs India and as a Software Engineer at Samsung Advanced Institute of Technology India. His research interests lie in the efficient processing of neural networks and distributed machine learning.