Elizaveta Rebrova, Princeton University
Sketch-and-project is a unifying framework for many well-known projective iterative methods for solving linear systems, as well as their extensions to nonlinear problems. It generalizes popular algorithms such as Randomized Kaczmarz and Gauss-Seidel, along with their block variants. In this talk, I will present our recent work that develops better ways to quantify the convergence of sketch-and-project methods and concludes that, if properly designed, these methods quickly capture the large outlying singular values of the linear system, implicitly preconditioning it. This leads to new, particularly efficient solvers for approximately low-rank systems, such as those commonly arising in machine learning (e.g., kernel matrices, signal-plus-noise models). Our approach combines novel spectral analysis of randomly sketched projection matrices with classical numerical analysis techniques, such as momentum, adaptive regularization, and memoization.
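A minimal illustrative sketch (not taken from the talk): Randomized Kaczmarz viewed as a sketch-and-project iteration, where each step samples one row of the system and projects the current iterate onto that row's hyperplane. The data A, b and the iteration count below are placeholder assumptions for demonstration only.

```python
import numpy as np

def randomized_kaczmarz(A, b, num_iters=2000, seed=0):
    """Randomized Kaczmarz as a sketch-and-project method: each "sketch"
    selects a single row of A (with probability proportional to its squared
    norm), and the iterate is projected onto that row's solution hyperplane."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)
    probs = row_norms_sq / row_norms_sq.sum()
    x = np.zeros(n)
    for _ in range(num_iters):
        i = rng.choice(m, p=probs)          # sample a row index (the sketch)
        a_i = A[i]
        # project x onto the hyperplane {z : a_i @ z = b[i]}
        x += (b[i] - a_i @ x) / row_norms_sq[i] * a_i
    return x

# Toy usage on a small consistent system (synthetic placeholder data).
A = np.random.default_rng(1).standard_normal((50, 10))
x_true = np.ones(10)
b = A @ x_true
x_hat = randomized_kaczmarz(A, b)
print(np.linalg.norm(x_hat - x_true))   # residual error should be small
```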