Date: Friday, October 03, 2014
Location: 1084 East Hall (3:00 PM to 4:00 PM)
Title: Optimized first-order convex minimization methods
Abstract: Many problems in signal and image processing, machine learning, and estimation require minimizing convex cost functions. For convex cost functions with Lipschitz continuous gradients, Nesterov's fast gradient method decreases the cost function at a rate inversely proportional to the square of the number of iterations, an order that is optimal for first-order methods. This talk presents a new first-order optimization method whose worst-case cost-function bound is twice as small as that of Nesterov's method, yet whose implementation is comparably simple. Examples from machine learning and X-ray computed tomography (CT) will be shown. This work is from Donghwan Kim's doctoral thesis.
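For context, the baseline the talk improves on can be sketched as follows. This is a minimal illustration of Nesterov's fast gradient method for a smooth convex cost, not the new method presented in the talk; the function names (`fgm`, `grad`) and the least-squares test problem are illustrative choices, not from the abstract.

```python
import numpy as np

def fgm(grad, L, x0, niter):
    """Nesterov's fast gradient method for a convex cost with
    L-Lipschitz gradient; the cost gap decreases as O(1/k^2)."""
    x = y = x0.copy()
    t = 1.0
    for _ in range(niter):
        y_new = x - grad(x) / L                       # gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        x = y_new + ((t - 1.0) / t_new) * (y_new - y)  # momentum step
        y, t = y_new, t_new
    return y

# Illustrative problem: least squares, f(x) = 0.5 * ||A x - b||^2,
# whose gradient A'(Ax - b) is Lipschitz with constant L = ||A||_2^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2
xhat = fgm(lambda x: A.T @ (A @ x - b), L, np.zeros(5), 2000)
```

The optimized method discussed in the talk keeps this same per-iteration structure (one gradient evaluation plus momentum-style updates), which is why its implementation cost is comparable.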
Speaker: Jeffrey Fessler
Institution: University of Michigan