Optimization for Statistical Learning with Low Dimensional Structure: Regularity and Conditioning

Lijun Ding, Wisconsin Institute for Discovery, Univ. of Wisconsin
January 24, 2023 1:00 pm ESB 4133

Many statistical machine learning problems, in which one aims to recover an underlying low-dimensional signal, are based on optimization. Existing work often either overlooks the computational complexity of solving the optimization problem or requires case-specific algorithms and analyses, especially for nonconvex problems. This talk addresses these two issues from a unified perspective of conditioning. In particular, we show that once the sample size exceeds the intrinsic dimension, (1) a broad class of convex and nonsmooth nonconvex problems are well-conditioned, and (2) well-conditioning, in turn, ensures the efficiency of off-the-shelf optimization methods and inspires new algorithms. Lastly, we show that a conditioning notion called flatness leads to accurate recovery in overparametrized models.