This talk will cover recent work on Bayesian coresets (“core of a dataset”), a methodology for statistical inference via data compression. Coresets achieve compression by forming a small weighted subset of data that replaces the full dataset during inference, leading to significant computational gains with provably minimal loss in inferential quality. In particular, the talk will present recently developed methods for Bayesian coreset construction—including subsampling, stochastic conditional gradient descent, and sparse optimization on statistical manifolds—together with corresponding theoretical convergence guarantees and empirical results on representative problems.
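To make the core idea concrete, here is a minimal sketch of the simplest construction mentioned above, a uniform-subsampling coreset: keep a small random subset of the data and weight each kept point by N/m, so the weighted log-likelihood is an unbiased estimate of the full-data log-likelihood. This is an illustrative toy (Gaussian model, invented function names), not the speaker's implementation.

```python
import math
import random

def log_likelihood(theta, x):
    # Toy model: Gaussian observation with unit variance.
    return -0.5 * (x - theta) ** 2 - 0.5 * math.log(2.0 * math.pi)

def uniform_coreset(data, m, seed=0):
    # Uniform-subsampling coreset: keep m points chosen uniformly at
    # random and give each the weight N/m, so the total weight equals
    # the dataset size and the weighted log-likelihood is an unbiased
    # estimate of the full-data log-likelihood.
    rng = random.Random(seed)
    subset = rng.sample(data, m)
    weight = len(data) / m
    return [(weight, x) for x in subset]

def coreset_log_likelihood(theta, coreset):
    # Evaluate the model on the coreset only: m terms instead of N.
    return sum(w * log_likelihood(theta, x) for w, x in coreset)

# Usage: 1000 data points compressed to a 50-point coreset.
rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(1000)]
coreset = uniform_coreset(data, m=50)

full = sum(log_likelihood(0.0, x) for x in data)
approx = coreset_log_likelihood(0.0, coreset)
```

During posterior inference (e.g. MCMC), every likelihood evaluation then touches only the 50 weighted points rather than all 1000; the more sophisticated constructions in the talk choose the subset and weights by optimization rather than uniformly at random.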
RSVP here. (No need to RSVP if you already receive IAM seminar announcements by email.)