Tensor Networks for Machine Learning Applications
The main aim of the project is to study the tradeoffs between computational complexity, description complexity, and statistical inference through the lens of a tensor-algebraic framework, exploiting low-rank tensor representations. Tensors are multidimensional arrays of numbers indexed along several axes; tensor factorizations yield polynomial-complexity representations of data that is exponentially large in the number of axes. Recently, there has been a surge in
tensor-algebraic methods (utilizing the structure of tensor networks), which have proven useful in a variety of applications where the data is inherently multidimensional, as well as in applications such as deep learning and clustering where the problem can be embedded in a tensor-algebraic setting. The project aims to consider both the fundamental limits and the applications of tensor networks in a variety of settings, such as scattering theory.
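As a minimal sketch of how a low-rank tensor network trades exponential storage for polynomial storage, the snippet below builds a random tensor-train (matrix product state) factorization with NumPy and compares its parameter count to the full tensor it represents. The format follows the standard TT/MPS construction surveyed in the references; the function names and the choice of mode sizes and rank are illustrative, not taken from the project itself.

```python
import numpy as np

def random_tt(dims, rank, seed=0):
    """Random tensor-train (MPS) cores for a tensor with the given mode sizes.

    Core k has shape (r_k, dims[k], r_{k+1}), with boundary ranks r_0 = r_d = 1.
    """
    rng = np.random.default_rng(seed)
    d = len(dims)
    ranks = [1] + [rank] * (d - 1) + [1]
    return [rng.standard_normal((ranks[k], dims[k], ranks[k + 1]))
            for k in range(d)]

def tt_to_full(cores):
    """Contract the TT cores into the full (exponentially large) tensor."""
    full = cores[0]                     # shape (1, n_0, r_1)
    for core in cores[1:]:
        # contract the trailing bond index with the next core's leading bond index
        full = np.tensordot(full, core, axes=([-1], [0]))
    return full.squeeze(axis=(0, -1))   # drop the two size-1 boundary bonds

dims = [2] * 10                 # order-10 tensor: 2**10 = 1024 entries
cores = random_tt(dims, rank=3)
tt_params = sum(c.size for c in cores)  # O(d * n * r**2) parameters
full = tt_to_full(cores)
print(full.shape, full.size, tt_params)  # (2, ..., 2) 1024 156
```

Here a rank-3 train describes a 1024-entry tensor with 156 numbers; for larger order d the full tensor grows exponentially while the TT parameter count grows only linearly in d, which is the polynomial-versus-exponential tradeoff the project description refers to.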
PhD in Electrical Engineering, Mathematics, Statistics, Physics, or a related area.
Vaneet Aggarwal, email@example.com, https://engineering.purdue.edu/CLANLabs
Zubin Jacob, firstname.lastname@example.org, www.zjresearchgroup.org/
1. Or Sharir, Ronen Tamari, Nadav Cohen, and Amnon Shashua, "Tensorial Mixture Models," https://arxiv.org/abs/1610.04167
2. Jure Sokolic, Raja Giryes, Guillermo Sapiro, and Miguel R. D. Rodrigues, "Robust Large Margin Deep Neural Networks," IEEE Transactions on Signal Processing, 2017.
3. Wenqi Wang, Yifan Sun, Brian Eriksson, Wenlin Wang, and Vaneet Aggarwal, "Wide Compression: Tensor Ring Nets," in Proc. CVPR, June 2018.
4. Roman Orus, "A Practical Introduction to Tensor Networks: Matrix Product States and Projected Entangled Pair States," Annals of Physics, 349:117-158, 2014.
5. Jacob Biamonte and Ville Bergholm, "Tensor Networks in a Nutshell," https://arxiv.org/abs/1708.00006