Date: Wednesday, October 02, 2019
Location: 1360 East Hall (4:00 PM to 5:00 PM)
Title: Deep Learning: Applications and Asymptotics
Abstract: Deep learning has revolutionized image, text, and speech recognition. Motivated by this success, there is growing interest in developing deep learning methods for financial applications, and we will present some of our recent results in this area. In the second part of the seminar, we study single-layer neural networks with Xavier initialization in the asymptotic regime where both the number of hidden units and the number of stochastic gradient descent training steps grow large. Using mean field analysis, we prove that the neural network converges in distribution to a random ODE with a Gaussian distribution. Although the pre-limit problem of optimizing a neural network is non-convex (and therefore the neural network may converge to a local minimum), the limit equation minimizes a quadratic, and hence convex, objective function and therefore converges to a global minimum. Furthermore, under reasonable assumptions, the matrix in the limiting quadratic objective function is positive definite, so the neural network (in the limit) converges to a global minimum with zero loss on the training set.
Speaker: Justin Sirignano
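To make the setup in the abstract concrete, the following is a minimal sketch (not the speaker's code) of the pre-limit object: a single-hidden-layer network under Xavier-style scaling, where weights are O(1) and the output is normalized by 1/sqrt(N), trained by plain stochastic gradient descent on a toy regression problem. All names, the learning rate, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 200, 2   # N hidden units, d-dimensional inputs (illustrative sizes)
lr = 0.5        # assumed step size for this toy example

# Xavier-style initialization: i.i.d. Gaussian weights with O(1) variance;
# the 1/sqrt(N) factor appears in the network output below.
W = rng.normal(0.0, np.sqrt(1.0 / d), size=(N, d))
c = rng.normal(0.0, 1.0, size=N)

def forward(x):
    # Network output: (1/sqrt(N)) * sum_i c_i * tanh(w_i . x).
    # This is the scaling regime in which the talk's mean field
    # (Gaussian) limit is taken as N -> infinity.
    return c @ np.tanh(W @ x) / np.sqrt(N)

# Synthetic regression data (purely illustrative).
X = rng.normal(size=(50, d))
y = X @ np.array([1.0, -0.5])

def mse():
    preds = np.array([forward(x) for x in X])
    return np.mean((preds - y) ** 2)

loss0 = mse()
for step in range(2000):
    # One SGD step on a uniformly sampled training point,
    # minimizing 0.5 * (prediction - target)^2.
    i = rng.integers(len(X))
    x, t = X[i], y[i]
    h = np.tanh(W @ x)
    err = c @ h / np.sqrt(N) - t
    grad_c = err * h / np.sqrt(N)
    grad_W = err * np.outer(c * (1.0 - h ** 2), x) / np.sqrt(N)
    c -= lr * grad_c
    W -= lr * grad_W
loss1 = mse()
print(f"training MSE: {loss0:.4f} -> {loss1:.4f}")
```

Each individual gradient is O(1/sqrt(N)), but the cumulative effect of updating all N units changes the output at order one, which is why an O(1) learning rate suffices in this scaling; the talk's asymptotic analysis concerns this regime as N and the number of SGD steps jointly grow.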