Task 004: Accelerating Memory Augmented Neural Networks
|Event Date:||November 14, 2019|
|Time:||2:00pm ET / 11:00am PT|
|School or Program:||Electrical and Computer Engineering|
Jacob Stevens, Purdue University Accelerating Memory Augmented Neural Networks
Memory-augmented neural networks (MANNs), which augment a traditional Deep Neural Network (DNN) with an external, differentiable memory, are emerging as a promising direction in machine learning. MANNs have been shown to achieve one-shot learning and complex cognitive capabilities that are well beyond those of classical DNNs. In addition to introducing new capabilities, MANNs also introduce new computational challenges not present in traditional DNNs, such as a much lower compute-to-memory-access ratio and non-reductive element-wise operations. To address these new challenges, we propose two inference accelerators: X-MANN and Manna. X-MANN is a post-CMOS accelerator that leverages the intrinsic ability of resistive crossbars to efficiently execute differentiable memory operations in-memory. Manna, on the other hand, is a CMOS-based accelerator that takes a full-system approach, specializing the architecture for all the kernels present in MANNs and thereby avoiding the bottlenecks that emerge when only part of the workload is greatly accelerated. We evaluate both accelerators using a detailed architectural simulator with timing and power models calibrated through synthesis and circuit simulation. Across a suite of benchmarks, X-MANN achieves an average 35x reduction in latency with average energy improvements of 120x. Similarly, Manna demonstrates average speedups of 39x with average energy improvements of 122x over an NVIDIA 1080-Ti Pascal GPU.
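As background for the abstract above, the differentiable memory access at the heart of a MANN is typically a soft, content-based read: the controller emits a key, the key is compared against every memory slot, and the read result is an attention-weighted sum over all slots. A minimal NumPy sketch of this operation (function and parameter names are illustrative, not taken from the talk) also makes the low compute-to-memory-access ratio concrete, since every read must touch the entire memory while performing only a few arithmetic operations per element:

```python
import numpy as np

def differentiable_read(memory, key, beta=1.0):
    """Soft content-based read from an external memory matrix.

    memory: (N, D) array of N memory slots.
    key:    (D,) query vector emitted by the controller network.
    beta:   key strength; larger values sharpen the attention weights.
    """
    eps = 1e-8
    # Cosine similarity between the key and every memory slot --
    # this scan of the full memory is what makes the kernel memory-bound.
    sims = memory @ key / (np.linalg.norm(memory, axis=1)
                           * np.linalg.norm(key) + eps)
    # Softmax over similarities yields differentiable read weights.
    w = np.exp(beta * sims - np.max(beta * sims))
    w /= w.sum()
    # The read vector is a weighted sum over all slots.
    return w @ memory, w
```

Because the similarity and weighted-sum steps stream over the whole memory, a crossbar that computes them in place (as X-MANN does with resistive crossbars) avoids moving the memory contents to the compute units at all.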
Jacob Stevens is a PhD student studying Computer Engineering at Purdue University, advised by Dr. Anand Raghunathan. Jacob also received his bachelor's degree in Computer Engineering from Purdue University. His research focuses on the new computational challenges posed by emerging neural algorithms such as memory-augmented neural networks.