Algorithms have two costs: arithmetic and communication, i.e., moving data between levels of a memory hierarchy or between processors over a network. Communication costs (measured in time or energy per operation) greatly exceed arithmetic costs, so our goal is to design algorithms that minimize communication. We survey some known algorithms that communicate asymptotically less than their classical counterparts, for a variety of linear algebra and machine learning problems, often attaining lower bounds. We also discuss recent work on automating the design and implementation of these algorithms, starting from a simple specification as nested loops.
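As a small illustration of the idea (a sketch, not material from the talk), classic blocked matrix multiplication reduces communication between slow and fast memory: each b-by-b tile is loaded once and reused b times, cutting the words moved from O(n^3) for the naive loop ordering to O(n^3/b) when fast memory holds about 3b^2 entries. The block size b and the use of NumPy here are illustrative choices, not specifics from the abstract.

```python
import numpy as np

def blocked_matmul(A, B, b=32):
    """Blocked (tiled) matrix multiply C = A @ B.

    Each b-by-b tile of A and B is reused for b updates while it is
    resident in fast memory, so fewer words cross the memory hierarchy
    than in the naive triply nested loop over scalars.
    """
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(0, n, b):          # loop over tile rows of C
        for j in range(0, n, b):      # loop over tile columns of C
            for k in range(0, n, b):  # accumulate tile products
                C[i:i+b, j:j+b] += A[i:i+b, k:k+b] @ B[k:k+b, j:j+b]
    return C

# The result matches an unblocked multiply; only the data movement differs.
A = np.random.rand(96, 96)
B = np.random.rand(96, 96)
assert np.allclose(blocked_matmul(A, B, b=32), A @ B)
```

The same tiling idea, applied across processors rather than cache levels, underlies the communication-avoiding algorithms surveyed in the talk.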
The IAM’s annual welcome-back tea will follow the talk, in the IAM lounge (LSK 306).
We gratefully acknowledge funding support from the Pacific Institute of Mathematical Sciences (PIMS), and organizing assistance from SCAIM.