## Seminar: Implicit Regularization for Optimal Sparse Recovery

**Speaker:** Varun Kanade (University of Oxford)
**Date:** Friday, 15 Nov 2019, 13:00–14:00
**Venue:** Roberts Building G06, Sir Ambrose Fleming LT
**Series:** DeepMind CSML Seminar Series

We present an implicit regularization scheme for gradient descent methods applied to unpenalized least squares regression to solve the problem of reconstructing a sparse signal from an underdetermined system of linear measurements under the restricted isometry assumption. For a given parameterization yielding a non-convex optimization problem, we show that prescribed choices of initialization, step size and stopping time yield a statistically and computationally optimal algorithm that achieves the minimax rate with the same cost required to read the data up to poly-logarithmic factors. Beyond minimax optimality, we show that our algorithm adapts to instance difficulty and yields a dimension-independent rate when the signal-to-noise ratio is high enough. We validate our findings with numerical experiments and compare our algorithm against explicit $\ell_{1}$ penalization. Going from hard instances to easy ones, our algorithm is seen to undergo a phase transition, eventually matching least squares with an oracle knowledge of the true support.

(Based on joint work with Patrick Rebeschini and Tomas Vaskevicius.)

Varun Kanade is an associate professor at the University of Oxford in the Department of Computer Science. He has been a Simons Postdoctoral Fellow at the University of California, Berkeley and an FSMP postdoctoral fellow at ENS, Paris. He obtained his Ph.D. from Harvard University in 2012. His research interests are in machine learning and theoretical computer science.
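The announcement does not spell out the algorithm's details. As an illustrative sketch only, the following assumes the Hadamard-type parameterization $w = u \odot u - v \odot v$ that is common in this line of work on implicit sparsity bias, with a small initialization $\alpha$ and a fixed stopping time $T$ playing the role of the regularizer; all constants and variable names here are hypothetical choices, not the talk's prescribed ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Underdetermined system: n measurements, d >> n coordinates, k-sparse signal.
n, d, k = 100, 400, 5
X = rng.standard_normal((n, d)) / np.sqrt(n)      # columns ~ unit norm (RIP-style design)
support = rng.choice(d, size=k, replace=False)
w_true = np.zeros(d)
w_true[support] = 2.0 * rng.choice([-1.0, 1.0], size=k)
y = X @ w_true + 0.01 * rng.standard_normal(n)    # noisy linear measurements

# Non-convex parameterization w = u*u - v*v (an assumed choice, not stated in the
# abstract). Plain gradient descent on the *unpenalized* least-squares loss
# 0.5 * ||X w - y||^2; small initialization alpha and early stopping at time T
# supply the implicit regularization.
alpha, eta, T = 1e-4, 0.05, 500
u = np.full(d, alpha)
v = np.full(d, alpha)

for _ in range(T):
    g = X.T @ (X @ (u * u - v * v) - y)           # gradient w.r.t. w
    # Chain rule through w = u*u - v*v: entries aligned with the residual grow
    # multiplicatively, the rest stay near the tiny initialization.
    u, v = u - eta * 2 * u * g, v + eta * 2 * v * g

w_hat = u * u - v * v
rel_err = np.linalg.norm(w_hat - w_true) / np.linalg.norm(w_true)
print(f"relative error: {rel_err:.3f}")
print("recovered support:", np.flatnonzero(np.abs(w_hat) > 1.0))
```

Because the dynamics are multiplicative, on-support coordinates grow exponentially while off-support ones linger near $\alpha^2$, which is one way to picture the phase-transition behaviour the abstract mentions; running far past the right stopping time would eventually fit the noise.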