Task 007: Block sketching for decentralized learning

Event Date: June 27, 2019
Time: 2:00pm ET / 11:00am PT
Speaker: Rakshith Sharma, PhD student at Georgia Institute of Technology
Abstract:
Improving the efficiency of linear algebraic computations is necessary to arm modern signal processing and machine learning algorithms against the ever-growing size of data, both in the number of data points and the number of features. In this work, we consider two fundamental learning problems under distributed data constraints: ridge regression and low-rank matrix recovery. For both problems, we assume that the data is distributed among several nodes and consider sketching at the individual nodes followed by inference at a central location using only the compressed data. For each problem, we provide theoretical guarantees on the amount of compression possible and present simulation results.
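
To make the setup concrete, the following is a minimal Python sketch of the distributed-sketching idea described in the abstract, not the speaker's actual method: each node compresses its local data with a Gaussian sketching matrix (an assumed choice here), and a central node solves ridge regression on the stacked compressed data. The node count, sketch size, and regularization value are illustrative.

    # Minimal illustration of distributed sketched ridge regression (assumed
    # Gaussian sketches; not the speaker's algorithm).
    import numpy as np

    rng = np.random.default_rng(0)

    def sketch_node(X_i, y_i, m, rng):
        """Compress one node's local data (n_i x d matrix, n_i targets) to m rows."""
        n_i = X_i.shape[0]
        S = rng.normal(scale=1.0 / np.sqrt(m), size=(m, n_i))  # Gaussian sketch
        return S @ X_i, S @ y_i

    def central_ridge(sketches, lam):
        """Solve ridge regression from the stacked per-node sketches."""
        Xs = np.vstack([sx for sx, _ in sketches])
        ys = np.concatenate([sy for _, sy in sketches])
        d = Xs.shape[1]
        return np.linalg.solve(Xs.T @ Xs + lam * np.eye(d), Xs.T @ ys)

    # Toy setup: 4 nodes, 500 local samples each, 20 features, sketch size 100.
    d, m, lam = 20, 100, 1.0
    w_true = rng.normal(size=d)
    nodes = []
    for _ in range(4):
        X_i = rng.normal(size=(500, d))
        y_i = X_i @ w_true + 0.1 * rng.normal(size=500)
        nodes.append(sketch_node(X_i, y_i, m, rng))

    w_hat = central_ridge(nodes, lam)
    print("relative error:", np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true))

Each node transmits only an m x d sketch instead of its full n_i x d data block, which is where the compression comes from; the theoretical question addressed in the talk is how small m can be while still guaranteeing accurate recovery.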
 
Bio:
Rakshith Sharma is a fourth-year PhD student at Georgia Institute of Technology, working with Prof. Justin Romberg. His interests lie in signal processing and machine learning for high-dimensional data, and in developing efficient algorithms and theoretical analysis for fundamental problems under various constraints inspired by real-world applications.