Task 001/002 - Neuro-inspired Algorithms and Theory
|Event Date:||November 12, 2020|
|Time:||11:00am (ET) / 8:00am (PT)|
|School or Program:||Electrical and Computer Engineering|
Speaker: Akshay Rangamani, Massachusetts Institute of Technology
Title: For interpolating kernel machines, minimizing the norm minimizes stability
Abstract: We study the average leave-one-out cross-validation (CVloo) stability of kernel ridgeless regression and derive corresponding risk bounds. We show that the minimum-norm interpolating solution minimizes a bound on CVloo stability, which in turn is controlled by the condition number of the empirical kernel matrix. The latter can be characterized in the asymptotic regime where both the dimension and the cardinality of the data go to infinity. Under the assumption of random kernel matrices, the corresponding test error should be expected to follow a double descent curve.
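The two objects at the heart of the abstract, the minimum-norm interpolating solution of kernel ridgeless regression and the condition number of the empirical kernel matrix, can both be computed directly. Below is a minimal sketch in NumPy (the kernel choice, bandwidth, and synthetic data are illustrative assumptions, not taken from the talk): the pseudoinverse of the Gram matrix gives the minimum-norm coefficients that interpolate the training labels, and `np.linalg.cond` gives the condition number that controls the stability bound.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||X[i] - Z[j]||^2)
    # (kernel choice and gamma are illustrative assumptions)
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

rng = np.random.default_rng(0)
n, d = 20, 5                      # cardinality and dimension of the data
X = rng.standard_normal((n, d))   # synthetic inputs
y = rng.standard_normal(n)        # synthetic labels

K = rbf_kernel(X, X)              # empirical kernel (Gram) matrix

# Minimum-norm interpolating solution: the ridgeless limit of kernel
# ridge regression is alpha = K^+ y (pseudoinverse of the Gram matrix).
alpha = np.linalg.pinv(K) @ y

# For distinct points and a strictly positive-definite kernel, K is
# invertible and the solution interpolates the training data exactly.
train_preds = K @ alpha
print(np.allclose(train_preds, y))

# Condition number of the empirical kernel matrix, which (per the
# abstract) controls the bound on CVloo stability.
print(np.linalg.cond(K))
```

For points in general position the RBF Gram matrix is positive definite, so the pseudoinverse coincides with the inverse and the fit passes through every training label; the condition number then quantifies how sensitive that interpolant is to leaving one sample out.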
Bio: Akshay Rangamani is a Postdoctoral Associate at the Center for Brains, Minds and Machines at MIT. He received his PhD in Electrical and Computer Engineering from Johns Hopkins University, where he was a member of the Digital Signal Processing Laboratory, advised by Dr. Trac D. Tran. His research interests lie at the intersection of machine learning theory, signal processing, and optimization.