Restarts Subject to Approximate Sharpness: A Parameter-Free and Optimal Scheme for Accelerating First-Order Optimization Methods

Ben Adcock, SFU Mathematics
October 30, 2023, 3:00 pm, LSK 306

Sharpness is a generic assumption in continuous optimization that bounds the distance to the set of minimizers in terms of the suboptimality in the objective function. It leads to the acceleration of first-order optimization methods via so-called restart schemes. However, sharpness involves problem-specific constants that are typically unknown, and previous restart schemes often result in reduced convergence rates. Such schemes are also challenging to apply in the presence of noise or approximate model classes (e.g., in compressed sensing or machine learning problems). In this talk, I will introduce the notion of approximate sharpness, a generalization of sharpness that incorporates an unknown constant perturbation to the objective function error. By employing a new type of search over the unknown constants, I will then describe a restart scheme that applies to general first-order methods. This scheme maintains the same convergence rate as if the constants were known. Moreover, for broad classes of problems it gives convergence rates that either match known optimal rates or improve on previously established ones. Finally, I will demonstrate the practical efficacy of this scheme on applications including sparse recovery, compressive imaging, and feature selection in machine learning.
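In symbols, and with notation that is an assumption of this note rather than a quotation from the talk (objective f with minimum value \hat f attained on the set \hat X, constants \alpha > 0 and \beta \ge 1, and perturbation g \ge 0), the two conditions can be sketched as

\[
d(x, \hat X) \le \left( \frac{f(x) - \hat f}{\alpha} \right)^{1/\beta} \quad \text{(sharpness)},
\qquad
d(x, \hat X) \le \left( \frac{f(x) - \hat f + g}{\alpha} \right)^{1/\beta} \quad \text{(approximate sharpness)}.
\]

As a rough picture of how restarting accelerates a first-order method under such a condition, the Python sketch below runs a subgradient method in stages, restarting from the previous output with a shrinking estimate of the distance to the minimizers. It is only a minimal illustration of the classical restart idea on a toy problem; the halving factor, helper names, and toy objective are choices made here, and this is not the parameter-free scheme of the paper, which instead searches over the unknown constants.

import numpy as np

def subgradient_method(subgrad, x0, radius, num_iters):
    # Basic subgradient method; the step size uses the current bound
    # "radius" on the distance from x0 to the set of minimizers.
    x = x0.astype(float).copy()
    avg = np.zeros_like(x)
    for _ in range(num_iters):
        g = subgrad(x)
        step = radius / (np.linalg.norm(g) * np.sqrt(num_iters) + 1e-12)
        x = x - step * g
        avg += x / num_iters
    return avg  # averaged iterate, a standard output for nonsmooth problems

def restarted_method(subgrad, x0, radius0, num_restarts=8, inner_iters=400):
    # Generic restart loop: rerun the inner method from its own output while
    # shrinking the distance estimate. Under sharpness this contraction can be
    # justified; here the factor 1/2 is fixed purely for illustration.
    x, radius = x0, radius0
    for _ in range(num_restarts):
        x = subgradient_method(subgrad, x, radius, inner_iters)
        radius *= 0.5
    return x

# Toy sharp problem: f(x) = ||x||_1, minimized at the origin.
x_hat = restarted_method(subgrad=np.sign, x0=np.ones(5), radius0=np.sqrt(5.0))
print(np.linalg.norm(x_hat, 1))  # should be close to zero

In contrast to the hand-picked halving factor above, the scheme presented in the talk removes the need to know the sharpness constants by searching over them while restarting, without sacrificing the convergence rate.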

This is joint work with Matthew J. Colbrook (Cambridge) and Maksym Neyra-Nesterenko (SFU). The corresponding paper can be found here: https://arxiv.org/abs/2301.02268

Refreshments will be served before the talk, starting at 2:45 pm.

We gratefully acknowledge funding support from the Pacific Institute for the Mathematical Sciences (PIMS).