Recent advances in technology have led to a monumental increase in large-scale data across many platforms. One mathematical model that has gained significant recent attention is sparsity. Sparsity captures the idea that high-dimensional signals often contain a very small amount of intrinsic information. Using this notion, one may design efficient low-dimensional representations of large-scale data as well as robust reconstruction methods for those representations. Binary, or one-bit, representations of data, for example, arise naturally in many applications and are appealing in both hardware implementations and algorithm design. In this talk, we provide a brief background on sparsity and one-bit measurements, and present new results on the problem of data classification with low computation and resource costs. We illustrate the utility of the proposed approach on recently acquired data about Lyme disease.
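To make the one-bit idea concrete, here is a minimal sketch (not from the talk itself) of the standard one-bit measurement model used in the compressed-sensing literature: each measurement records only the sign of an inner product of the signal with a random Gaussian vector, so a high-dimensional sparse signal is summarized by a short string of bits. The function name and parameters are illustrative choices, not the speaker's notation.

```python
import random

def one_bit_measurements(x, m, seed=0):
    """Take m one-bit measurements y_i = sign(<a_i, x>) of signal x,
    where each a_i is a random Gaussian measurement vector.
    Only the sign is kept, so each measurement costs a single bit."""
    rng = random.Random(seed)
    n = len(x)
    y = []
    for _ in range(m):
        a = [rng.gauss(0.0, 1.0) for _ in range(n)]
        dot = sum(ai * xi for ai, xi in zip(a, x))
        y.append(1 if dot >= 0 else -1)
    return y

# A sparse signal: 100-dimensional, but with only 2 nonzero entries.
x = [0.0] * 100
x[3], x[57] = 1.0, -0.5
y = one_bit_measurements(x, m=20)
```

Note that the magnitude of `x` is lost under this model (scaling `x` by a positive constant leaves every sign unchanged), which is one reason one-bit methods typically recover signals only up to direction; this ambiguity is immaterial for classification tasks like the one described above.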
Deanna Needell earned her PhD from UC Davis before working as a postdoctoral fellow at Stanford University. She is currently a full professor of mathematics at UCLA. She has earned many awards, including the IEEE Best Young Author award, the Hottest Paper in Applied and Computational Harmonic Analysis award, an Alfred P. Sloan fellowship, NSF CAREER and NSF BIGDATA awards, and the IMA Prize in Applied Mathematics. She was a research professor fellow at MSRI last fall and is now a (semi-)long-term visitor at Simons this fall. She also serves as an associate editor for IEEE Signal Processing Letters, Linear Algebra and its Applications, the SIAM Journal on Imaging Sciences, and Transactions in Mathematics and its Applications, as well as on the organizing committee for SIAM sessions and the Association for Women in Mathematics.