The University of Michigan Combinatorics Seminar


Abstract 

In the heavy hitters problem, a vector of frequencies describes a large and changing collection of items, and the goal is to track the most frequent items using only the information in a small number of nonadaptive linear measurements (dot products) of the frequency vector. The problem also arises in image and signal processing, where it is known as "sparse recovery" or "compressed sensing." Over the last decade, a number of algorithms have been proposed for this problem. We present the first algorithms that simultaneously satisfy three important criteria: (i) the number of measurements is optimal, up to log factors; (ii) the reconstruction time is polynomial in the number of heavy hitters and polylogarithmic in the universe size; (iii) a single (randomly constructed) set of measurements works for all frequency vectors.

At the heart of our algorithm is a technique called Combinatorial Group Testing; we will demonstrate its important role in state-of-the-art algorithms. Our construction is probabilistic; an important open question is to provide a deterministic construction of a suitable set of measurements, which amounts to constructing a certain type of combinatorial design.

No prior experience with algorithms (good or bad) will be assumed for this talk. Joint work with Anna Gilbert, Joel Tropp, and Roman Vershynin. The talk is based on http://www.eecs.umich.edu/~martinjs/papers/GSTV06OneSketch.pdf.
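To give a flavor of how nonadaptive linear measurements can identify a frequent item, here is a minimal sketch of the classical "bit-testing" idea that underlies group-testing-based sparse recovery. This is an illustrative toy, not the algorithm from the paper: it assumes a single item carries more than half of the total mass, and it recovers that item's index bit by bit from one measurement per bit position (all names and parameters here are illustrative).

```python
# Toy bit-testing sketch: recover the index of a single dominant item
# from nonadaptive linear measurements (one per bit of the index,
# plus the total mass). Assumes one item holds > half the total mass.

def bit_test_measurements(freq, num_bits):
    """Linear measurements of the frequency vector: the total mass,
    and, for each bit position j, the sum of frequencies of all items
    whose index has bit j set."""
    n = len(freq)
    total = sum(freq)
    bits = [sum(freq[i] for i in range(n) if (i >> j) & 1)
            for j in range(num_bits)]
    return total, bits

def decode_heavy_hitter(total, bits):
    """If one item carries more than half the total mass, each bit of
    its index is read off by a majority comparison against total/2."""
    index = 0
    for j, b in enumerate(bits):
        if b > total / 2:
            index |= 1 << j
    return index

# Example: item 11 dominates a universe of 16 items.
freq = [0.0] * 16
freq[11] = 100.0
for i in (2, 5, 7):       # a little background noise
    freq[i] = 1.0
total, bits = bit_test_measurements(freq, 4)
print(decode_heavy_hitter(total, bits))  # → 11
```

With a universe of size N this uses about log N measurements; recovering many heavy hitters robustly, with an optimal number of measurements and a single measurement matrix that works for all frequency vectors, is the harder problem the talk addresses.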