[Ececourtesy-list] CCAM seminar 11:30-12:30 Friday on Large-Scale Low-rank Optimization
Scutari, Gesualdo
gscutari at purdue.edu
Mon Oct 28 11:44:16 EDT 2024
FYI
Please contact Xiangxiong Zhang (zhan1966 at purdue.edu) by Wednesday if you are interested in meeting with Richard.
Time: Friday, 11:30 AM - 12:20 PM
Location: LWSN 1106
Speaker: Richard Zhang <https://ryz.ece.illinois.edu/>, Electrical and Computer Engineering, UIUC
Title: Rank Overparameterization and Chordal Conversion for Large-Scale Low-rank Optimization
Abstract: Optimization problems over low-rank matrices are ubiquitous in the real world. In principle, they can be reliably solved to global optimality via convex relaxation, but the computational cost can become prohibitive at large scale. In practice, it is much more common to optimize over the low-rank factors directly, as in the Burer-Monteiro approach, but the nonconvexity of the factorization can cause these methods to fail by getting stuck at a spurious local minimum. For safety-critical societal applications, such as the operation and planning of an electricity grid, our inability to reliably achieve global optimality can have significant real-world consequences.
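As background on the Burer-Monteiro approach mentioned above, here is a minimal sketch on a toy noiseless matrix-sensing problem (the problem data, names, step size, and iteration count are illustrative assumptions, not taken from the speaker's papers): instead of lifting to a full matrix variable X and solving a convex relaxation, one optimizes directly over a factor U with X = U U^T using plain gradient descent.

import numpy as np

# Toy setup: recover a PSD matrix X* = U* U*^T of true rank 2 from m random
# linear measurements b_k = <A_k, X*>. Purely illustrative.
rng = np.random.default_rng(0)
n, r_true, m = 20, 2, 200
U_star = rng.standard_normal((n, r_true)) / np.sqrt(n)
X_star = U_star @ U_star.T
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2            # symmetric sensing matrices
b = np.einsum('kij,ij->k', A, X_star)         # measurements <A_k, X*>

def loss_and_grad(U):
    # Nonconvex Burer-Monteiro objective f(U) = (1/2m) sum_k (<A_k, U U^T> - b_k)^2
    resid = np.einsum('kij,ij->k', A, U @ U.T) - b
    grad = (2.0 / m) * np.einsum('k,kij->ij', resid, A) @ U
    return 0.5 * np.mean(resid**2), grad

# Optimize directly over an n-by-r factor; r = 3 > 2 is a mild overparameterization,
# in the spirit of the first part of the abstract.
r = 3
U = 0.1 * rng.standard_normal((n, r))
step = 0.02
for _ in range(1500):
    f, g = loss_and_grad(U)
    U -= step * g

print(f"final loss {f:.2e}, recovery error {np.linalg.norm(U @ U.T - X_star):.2e}")

The convex-relaxation alternative would instead keep the matrix variable X, impose X >= 0, and solve a semidefinite program, which is reliable but far more expensive on large instances.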
In the first part of this talk, we investigate global optimality guarantees obtained by overparameterizing the low-rank factorization. In the smooth and strongly convex setting, we rigorously show that, as the rank is increased, spurious local minima become increasingly rare in a step-wise fashion: a rank-2 factorization has fewer spurious local minima than a rank-1 factorization, rank-3 has fewer than rank-2, and so on. Once the rank exceeds an O(1) threshold, every remaining local minimum is a global minimum, and every saddle point can be escaped.
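For readers unfamiliar with why adding rank helps escape saddle points, the following is a standard computation from the rank-overparameterization literature, offered as background rather than as a statement of the speaker's result. If U is a stationary point of the factored problem g(U) = f(UU^T) and one appends a zero column, then perturbing that column in a direction v gives

\[
g([U,\; t v]) = f\!\left(UU^\top + t^2\, v v^\top\right),
\qquad
\left.\frac{d^2}{dt^2}\, g([U,\; t v])\right|_{t=0} = 2\, v^\top \nabla f(UU^\top)\, v,
\]

so whenever \lambda_{\min}(\nabla f(UU^\top)) < 0 the padded point is a strict saddle, with an explicit escape direction along the corresponding eigenvector. Conversely, for convex f, if \nabla f(UU^\top) \succeq 0 at such a rank-deficient stationary point, then UU^\top is globally optimal for \min_{X \succeq 0} f(X).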
In the second part of this talk, we revisit convex relaxations. Here, chordal conversion is a heuristic widely used to reduce the per-iteration cost of interior-point methods to as low as linear time, but a theoretical explanation for this speedup was previously unknown. We give a sufficient condition for the speedup, which covers many successful applications of chordal conversion, including the MAX-k-CUT relaxation, the Lovász Theta problem, sensor network localization, polynomial optimization, and the AC optimal power flow relaxation, thus allowing theory to match practical experience.
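As background on chordal conversion (a classical matrix-completion fact, e.g. Grone et al. 1984, recalled here for context rather than taken from the talk): if the aggregate sparsity graph of the semidefinite program is chordal with maximal cliques C_1, ..., C_p, then a partial symmetric matrix X specified on that pattern admits a positive semidefinite completion if and only if each clique submatrix is positive semidefinite,

\[
X[C_k, C_k] \succeq 0 \quad \text{for all } k = 1, \dots, p.
\]

Chordal conversion exploits this to replace the single large constraint X \succeq 0 with p small clique constraints linked by equalities on their overlaps, which is the source of the per-iteration speedup analyzed in the talk.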
Main related papers: https://arxiv.org/abs/2207.01789 and https://arxiv.org/abs/2306.15288