I will discuss new work on algorithms that are widely used in practice, including the kernel polynomial method (KPM) and the moment matching method, for approximating the spectral density (eigenvalue distribution) of an n x n diagonalizable matrix A with real-valued eigenvalues. We will see that natural variants of these algorithms achieve strong worst-case approximation guarantees: they can approximate any spectral density to epsilon accuracy in the Wasserstein-1 distance using roughly O(1/epsilon) matrix-vector multiplications with A. Moreover, we will show that the methods are robust to *inaccuracy* in these matrix-vector multiplications, which allows them to be combined with any approximate matrix-vector multiplication algorithm. As an application, we develop a randomized sublinear time algorithm for approximating the spectral density of normalized graph adjacency or Laplacian matrices. The talk will cover the main tools used in our work, which include Jackson's seminal work on polynomial approximation and stability results for computing orthogonal polynomials via three-term recurrence relations.
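For intuition, the sketch below shows one standard KPM-style estimator, not necessarily the exact variant analyzed in this work: Chebyshev moments of A are estimated with Hutchinson-style random probe vectors and the three-term recurrence, then smoothed with Jackson damping before the density is reconstructed. The function names, probe count, and example matrix are illustrative assumptions, and the eigenvalues of A are assumed to lie in [-1, 1].

import numpy as np

def jackson_coefficients(K):
    # Jackson damping coefficients g_0, ..., g_K; these suppress Gibbs
    # oscillations in the truncated Chebyshev expansion (g_0 = 1).
    k = np.arange(K + 1)
    return ((K - k + 1) * np.cos(np.pi * k / (K + 1))
            + np.sin(np.pi * k / (K + 1)) / np.tan(np.pi / (K + 1))) / (K + 1)

def estimate_chebyshev_moments(matvec, n, K, num_probes=30, seed=None):
    # Estimate tau_k = (1/n) tr(T_k(A)) using Hutchinson random probes and the
    # three-term recurrence; only matrix-vector products with A are required.
    rng = np.random.default_rng(seed)
    tau = np.zeros(K + 1)
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)        # Rademacher probe vector
        t_prev, t_curr = v, matvec(v)              # T_0(A) v and T_1(A) v
        tau[0] += v @ t_prev
        tau[1] += v @ t_curr
        for k in range(2, K + 1):
            t_next = 2.0 * matvec(t_curr) - t_prev  # T_{k+1} = 2 A T_k - T_{k-1}
            tau[k] += v @ t_next
            t_prev, t_curr = t_curr, t_next
    return tau / (num_probes * n)

def kpm_density(tau, grid):
    # Evaluate the Jackson-damped Chebyshev expansion of the spectral density
    # on a grid of points strictly inside (-1, 1).
    K = len(tau) - 1
    g = jackson_coefficients(K)
    theta = np.arccos(grid)                        # T_k(x) = cos(k * arccos x)
    series = g[0] * tau[0] + 2.0 * sum(
        g[k] * tau[k] * np.cos(k * theta) for k in range(1, K + 1))
    return series / (np.pi * np.sqrt(1.0 - grid ** 2))

# Example: approximate the spectral density of a small random symmetric matrix.
n = 200
B = np.random.default_rng(0).standard_normal((n, n))
A = (B + B.T) / (2.0 * np.sqrt(2.0 * n))           # eigenvalues roughly in [-1, 1]
tau = estimate_chebyshev_moments(lambda v: A @ v, n, K=50, seed=1)
grid = np.linspace(-0.99, 0.99, 400)
density = kpm_density(tau, grid)                   # approximate eigenvalue density on the grid

In this sketch the total number of matrix-vector products is num_probes * K, and the accuracy of the reconstruction is governed by the polynomial degree K, consistent with the roughly O(1/epsilon) matrix-vector multiplications mentioned above.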
This lecture will be delivered via Zoom; the link will be included in the emailed seminar announcement. If you do not regularly receive these, please RSVP here to request the announcement for this and other IAM seminars.